Recursive version of fs.readdir. Exposes a stream api.
```js
var readdirp = require('readdirp')
  , path = require('path')
  , es = require('event-stream');

// print out all JavaScript files along with their size
var stream = readdirp({ root: path.join(__dirname), fileFilter: '*.js' });
stream
  .on('warn', function (err) {
    console.error('non-fatal error', err);
    // optionally call stream.destroy() here in order to abort and cause 'close' to be emitted
  })
  .on('error', function (err) { console.error('fatal error', err); })
  .pipe(es.mapSync(function (entry) {
    return { path: entry.path, size: entry.stat.size };
  }))
  .pipe(es.stringify())
  .pipe(process.stdout);
```
Meant to be one of the recursive versions of fs functions, e.g., like mkdirp.
```sh
npm install readdirp
```
```js
var entryStream = readdirp(options);
```

Reads the given root recursively and returns a stream of entry infos.
Behaves as follows:

- `emit('data')`: passes an entry info whenever one is found
- `emit('warn')`: passes a non-fatal `Error` that prevents a file/directory from being processed (i.e., if it is inaccessible to the user)
- `emit('error')`: passes a fatal `Error`, which also ends the stream (i.e., when illegal options were passed)
- `emit('end')`: called when all entries were found and no more will be emitted (i.e., we are done)
- `emit('close')`: called when the stream is destroyed via `stream.destroy()` (which could be useful if you want to manually abort even on a non-fatal error); at that point the stream is no longer readable and no more entries, warnings, or errors are emitted

Options:

- `root`: path in which to start reading and recursing into subdirectories
- `fileFilter`: filter to include/exclude files found (see Filters for more)
- `directoryFilter`: filter to include/exclude directories found and to recurse into (see Filters for more)
- `depth`: depth at which to stop recursing even if more subdirectories are found
- `entryType`: determines if data events on the stream should be emitted for `'files'`, `'directories'`, `'both'`, or `'all'`. Setting to `'all'` will also include entries for other types of file descriptors like character devices, unix sockets, and named pipes. Defaults to `'files'`.
- `lstat`: if `true`, readdirp uses `fs.lstat` instead of `fs.stat` in order to stat files and includes symlink entries in the stream along with files.
The entry info passed with each `'data'` event has the following properties. Example (assuming root was `/User/dev/readdirp`):

```js
{
  parentDir     : 'test/bed/root_dir1',
  fullParentDir : '/User/dev/readdirp/test/bed/root_dir1',
  name          : 'root_dir1_subdir1',
  path          : 'test/bed/root_dir1/root_dir1_subdir1',
  fullPath      : '/User/dev/readdirp/test/bed/root_dir1/root_dir1_subdir1',
  stat          : [ ... ]
}
```
There are three different ways to specify filters for files and directories respectively:

- function: a function that takes an entry info as a parameter and returns `true` to include or `false` to exclude the entry
- glob string: a string (e.g., `*.js`) which is matched using minimatch, so go there for more information. Globstars (`**`) are not supported, since specifying a recursive pattern for an already recursive function doesn't make sense. Negated globs (as explained in the minimatch documentation) are allowed, e.g., `!*.txt` matches everything but text files.
- array of glob strings: the patterns either need to be all inclusive or all exclusive (negated), otherwise an error is thrown. `[ '*.json', '*.js' ]` includes all JavaScript and JSON files. `[ '!.git', '!node_modules' ]` includes all directories except `.git` and `node_modules`.

Directories that do not pass a filter will not be recursed into.
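A function filter is just a predicate over entry infos. A self-contained sketch of how such a predicate behaves (the sample entries are hand-made for illustration, not real readdirp output):

```javascript
// A function filter receives an entry info and returns true to include it,
// false to exclude it. Here: only .js files larger than 1 KB pass.
function largeJsOnly(entry) {
  return /\.js$/.test(entry.name) && entry.stat.size > 1024;
}

// Hand-made sample entries standing in for real entry infos.
var sampleEntries = [
  { name: 'big.js',   stat: { size: 4096 } },
  { name: 'small.js', stat: { size: 100 } },
  { name: 'big.txt',  stat: { size: 8192 } }
];

// Only entries passing the predicate would be emitted as 'data' events.
var included = sampleEntries.filter(largeJsOnly);
console.log(included.map(function (e) { return e.name; })); // [ 'big.js' ]
```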
Although the stream api is recommended, readdirp also exposes a callback based api:

```js
readdirp(options, callback1 [, callback2]);
```

If `callback2` is given, `callback1` functions as the `fileProcessed` callback, and `callback2` as the `allProcessed` callback.
If only `callback1` is given, it functions as the `allProcessed` callback.

- `allProcessed`: `function (err, res) { ... }`
- `fileProcessed`: `function (entryInfo) { ... }`
```js
// on('error', ..), on('warn', ..) and on('end', ..) handling omitted for brevity
var readdirp = require('readdirp');

// Glob file filter
readdirp({ root: './test/bed', fileFilter: '*.js' })
  .on('data', function (entry) {
    // do something with each JavaScript file entry
  });

// Combined glob file filters
readdirp({ root: './test/bed', fileFilter: [ '*.js', '*.json' ] })
  .on('data', function (entry) {
    // do something with each JavaScript and JSON file entry
  });

// Combined negated directory filters
readdirp({ root: './test/bed', directoryFilter: [ '!.git', '!*modules' ] })
  .on('data', function (entry) {
    // do something with each file entry found outside '.git' or any modules directory
  });

// Function directory filter
readdirp({ root: './test/bed', directoryFilter: function (di) { return di.name.length === 9; } })
  .on('data', function (entry) {
    // do something with each file entry found inside directories whose name has length 9
  });

// Limiting depth
readdirp({ root: './test/bed', depth: 1 })
  .on('data', function (entry) {
    // do something with each file entry found up to 1 subdirectory deep
  });

// Callback api
readdirp(
    { root: '.' }
  , function (fileInfo) {
      // do something with file entry here
    }
  , function (err, res) {
      // all done, move on or do final step for all file entries here
    }
);
```
Try more examples by following the instructions on how to get going.

- Demonstrates error and data handling by listening to events emitted from the readdirp stream.
- Demonstrates error handling by listening to events emitted from the readdirp stream and how to pipe the data stream into another destination stream.
- Very naive implementation of grep, for demonstration purposes only.
- Shows how to pass callbacks in order to handle errors and/or data.

The readdirp tests will also give you a good idea of how things work.