main functions fixes

2025-09-29 22:06:11 +09:00
parent 40e016e128
commit c8c3274527
7995 changed files with 1517998 additions and 1057 deletions

desktop-operator/node_modules/builder-util/LICENSE

@@ -0,0 +1,22 @@
The MIT License (MIT)
Copyright (c) 2015 Loopline Systems
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


@@ -0,0 +1,15 @@
(The MIT License)
Copyright (c) 2011-2017 JP Richardson
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files
(the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify,
merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -0,0 +1,262 @@
Node.js: fs-extra
=================
`fs-extra` adds file system methods that aren't included in the native `fs` module and adds promise support to the `fs` methods. It also uses [`graceful-fs`](https://github.com/isaacs/node-graceful-fs) to prevent `EMFILE` errors. It should be a drop-in replacement for `fs`.
[![npm Package](https://img.shields.io/npm/v/fs-extra.svg)](https://www.npmjs.org/package/fs-extra)
[![License](https://img.shields.io/npm/l/fs-extra.svg)](https://github.com/jprichardson/node-fs-extra/blob/master/LICENSE)
[![build status](https://img.shields.io/github/workflow/status/jprichardson/node-fs-extra/Node.js%20CI/master)](https://github.com/jprichardson/node-fs-extra/actions/workflows/ci.yml?query=branch%3Amaster)
[![downloads per month](http://img.shields.io/npm/dm/fs-extra.svg)](https://www.npmjs.org/package/fs-extra)
[![JavaScript Style Guide](https://img.shields.io/badge/code_style-standard-brightgreen.svg)](https://standardjs.com)
Why?
----
I got tired of including `mkdirp`, `rimraf`, and `ncp` in most of my projects.
Installation
------------
npm install fs-extra
Usage
-----
`fs-extra` is a drop-in replacement for native `fs`. All methods in `fs` are attached to `fs-extra`. All `fs` methods return promises if the callback isn't passed.
You don't ever need to include the original `fs` module again:
```js
const fs = require('fs') // this is no longer necessary
```
you can now do this:
```js
const fs = require('fs-extra')
```
or if you prefer to make it clear that you're using `fs-extra` and not `fs`, you may want
to name your `fs` variable `fse` like so:
```js
const fse = require('fs-extra')
```
you can also keep both, but it's redundant:
```js
const fs = require('fs')
const fse = require('fs-extra')
```
Sync vs Async vs Async/Await
-------------
Most methods are async by default. All async methods will return a promise if the callback isn't passed.
Sync methods, on the other hand, will throw if an error occurs.
Async/await will also throw if an error occurs.
Example:
```js
const fs = require('fs-extra')
// Async with promises:
fs.copy('/tmp/myfile', '/tmp/mynewfile')
.then(() => console.log('success!'))
.catch(err => console.error(err))
// Async with callbacks:
fs.copy('/tmp/myfile', '/tmp/mynewfile', err => {
if (err) return console.error(err)
console.log('success!')
})
// Sync:
try {
fs.copySync('/tmp/myfile', '/tmp/mynewfile')
console.log('success!')
} catch (err) {
console.error(err)
}
// Async/Await:
async function copyFiles () {
try {
await fs.copy('/tmp/myfile', '/tmp/mynewfile')
console.log('success!')
} catch (err) {
console.error(err)
}
}
copyFiles()
```
Methods
-------
### Async
- [copy](docs/copy.md)
- [emptyDir](docs/emptyDir.md)
- [ensureFile](docs/ensureFile.md)
- [ensureDir](docs/ensureDir.md)
- [ensureLink](docs/ensureLink.md)
- [ensureSymlink](docs/ensureSymlink.md)
- [mkdirp](docs/ensureDir.md)
- [mkdirs](docs/ensureDir.md)
- [move](docs/move.md)
- [outputFile](docs/outputFile.md)
- [outputJson](docs/outputJson.md)
- [pathExists](docs/pathExists.md)
- [readJson](docs/readJson.md)
- [remove](docs/remove.md)
- [writeJson](docs/writeJson.md)
### Sync
- [copySync](docs/copy-sync.md)
- [emptyDirSync](docs/emptyDir-sync.md)
- [ensureFileSync](docs/ensureFile-sync.md)
- [ensureDirSync](docs/ensureDir-sync.md)
- [ensureLinkSync](docs/ensureLink-sync.md)
- [ensureSymlinkSync](docs/ensureSymlink-sync.md)
- [mkdirpSync](docs/ensureDir-sync.md)
- [mkdirsSync](docs/ensureDir-sync.md)
- [moveSync](docs/move-sync.md)
- [outputFileSync](docs/outputFile-sync.md)
- [outputJsonSync](docs/outputJson-sync.md)
- [pathExistsSync](docs/pathExists-sync.md)
- [readJsonSync](docs/readJson-sync.md)
- [removeSync](docs/remove-sync.md)
- [writeJsonSync](docs/writeJson-sync.md)
**NOTE:** You can still use the native Node.js methods. They are promisified and copied over to `fs-extra`. See [notes on `fs.read()`, `fs.write()`, & `fs.writev()`](docs/fs-read-write-writev.md)
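For example, a minimal sketch of calling one of the promisified native methods (the `package.json` path is just an illustrative example):
```js
const fs = require('fs-extra')

// Native methods such as readFile are copied onto fs-extra and return a
// promise when no callback is passed.
async function printName () {
  const raw = await fs.readFile('package.json', 'utf8') // illustrative path
  console.log(JSON.parse(raw).name)
}

printName().catch(console.error)
```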
### What happened to `walk()` and `walkSync()`?
They were removed from `fs-extra` in v2.0.0. If you need the functionality, `walk` and `walkSync` are available as separate packages, [`klaw`](https://github.com/jprichardson/node-klaw) and [`klaw-sync`](https://github.com/manidlou/node-klaw-sync).
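As a rough sketch of the replacement workflow with `klaw` (the directory path is illustrative; `klaw` streams items that expose `path` and `stats`):
```js
const klaw = require('klaw')

// Walk a directory tree; each emitted item exposes `path` and `stats`.
klaw('/tmp/some-dir') // illustrative directory
  .on('data', item => console.log(item.path))
  .on('error', err => console.error(err))
  .on('end', () => console.log('walk complete'))
```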
Third Party
-----------
### CLI
[fse-cli](https://www.npmjs.com/package/@atao60/fse-cli) allows you to run `fs-extra` from a console or from [npm](https://www.npmjs.com) scripts.
### TypeScript
If you like TypeScript, you can use `fs-extra` with it: https://github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/fs-extra
### File / Directory Watching
If you want to watch for changes to files or directories, then you should use [chokidar](https://github.com/paulmillr/chokidar).
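A minimal sketch of watching a directory with `chokidar` (path and options are illustrative):
```js
const chokidar = require('chokidar')

// Watch a directory and log every file system event under it.
chokidar.watch('/tmp/some-dir', { ignoreInitial: true }) // illustrative path
  .on('all', (event, path) => console.log(event, path))
  .on('error', err => console.error(err))
```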
### Obtain Filesystem (Devices, Partitions) Information
[fs-filesystem](https://github.com/arthurintelligence/node-fs-filesystem) allows you to read the state of the filesystem of the host on which it is run. It returns information about both the devices and the partitions (volumes) of the system.
### Misc.
- [fs-extra-debug](https://github.com/jdxcode/fs-extra-debug) - Send your fs-extra calls to [debug](https://npmjs.org/package/debug).
- [mfs](https://github.com/cadorn/mfs) - Monitor your fs-extra calls.
Hacking on fs-extra
-------------------
Wanna hack on `fs-extra`? Great! Your help is needed! [fs-extra is one of the most depended upon Node.js packages](http://nodei.co/npm/fs-extra.png?downloads=true&downloadRank=true&stars=true). This project
uses [JavaScript Standard Style](https://github.com/feross/standard) - if the name or style choices bother you,
you're gonna have to get over it :) If `standard` is good enough for `npm`, it's good enough for `fs-extra`.
[![js-standard-style](https://cdn.rawgit.com/feross/standard/master/badge.svg)](https://github.com/feross/standard)
What's needed?
- First, take a look at existing issues. Those are probably going to be where the priority lies.
- More tests for edge cases. Specifically on different platforms. There can never be enough tests.
- Improve test coverage.
Note: If you make any big changes, **you should definitely file an issue for discussion first.**
### Running the Test Suite
fs-extra contains hundreds of tests.
- `npm run lint`: runs the linter ([standard](http://standardjs.com/))
- `npm run unit`: runs the unit tests
- `npm test`: runs both the linter and the tests
### Windows
If you run the tests on Windows and receive a lot of symbolic link `EPERM` permission errors, it's
because on Windows you need elevated privileges to create symbolic links. You can add this permission to your Windows
account by following the instructions here: http://superuser.com/questions/104845/permission-to-make-symbolic-links-in-windows-7
However, I didn't have much luck doing this.
Since I develop on Mac OS X, I use VMWare Fusion for Windows testing. I create a shared folder that I map to a drive on Windows.
I open the `Node.js command prompt` and run as `Administrator`. I then map the network drive by running the following command:
net use z: "\\vmware-host\Shared Folders"
I can then navigate to my `fs-extra` directory and run the tests.
Naming
------
I put a lot of thought into the naming of these functions, inspired by @coolaj86's request, so he deserves much of the credit for raising the issue. See the discussions here:
* https://github.com/jprichardson/node-fs-extra/issues/2
* https://github.com/flatiron/utile/issues/11
* https://github.com/ryanmcgrath/wrench-js/issues/29
* https://github.com/substack/node-mkdirp/issues/17
First, I believe that in as many cases as possible, the [Node.js naming schemes](http://nodejs.org/api/fs.html) should be chosen. However, there are problems with Node.js's own naming schemes.
For example, `fs.readFile()` and `fs.readdir()`: the **F** is capitalized in *File* and the **d** is not capitalized in *dir*. Perhaps a bit pedantic, but they should still be consistent. Also, Node.js has chosen a lot of POSIX naming schemes, which I believe is great. See: `fs.mkdir()`, `fs.rmdir()`, `fs.chown()`, etc.
We have a dilemma though. How do you consistently name methods that perform the following POSIX commands: `cp`, `cp -r`, `mkdir -p`, and `rm -rf`?
My perspective: when in doubt, err on the side of simplicity. A directory is just a hierarchical grouping of directories and files. Consider that for a moment. So when you want to copy it or remove it, in most cases you'll want to copy or remove all of its contents. When you want to create a directory, if the directory that it's supposed to be contained in does not exist, then in most cases you'll want to create that too.
So, if you want to remove a file or a directory regardless of whether it has contents, just call `fs.remove(path)`. If you want to copy a file or a directory regardless of whether it has contents, just call `fs.copy(source, destination)`. If you want to create a directory regardless of whether its parent directories exist, just call `fs.mkdirs(path)` or `fs.mkdirp(path)`.
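As a rough sketch of those semantics (paths are illustrative):
```js
const fs = require('fs-extra')

async function demo () {
  await fs.mkdirs('/tmp/demo/nested/dir')      // like `mkdir -p`: creates missing parent dirs
  await fs.copy('/tmp/demo', '/tmp/demo-copy') // like `cp -r`: copies files or whole directories
  await fs.remove('/tmp/demo-copy')            // like `rm -rf`: removes files or non-empty dirs
}

demo().catch(console.error)
```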
Credit
------
`fs-extra` wouldn't be possible without using the modules from the following authors:
- [Isaac Schlueter](https://github.com/isaacs)
- [Charlie McConnell](https://github.com/avianflu)
- [James Halliday](https://github.com/substack)
- [Andrew Kelley](https://github.com/andrewrk)
License
-------
Licensed under MIT
Copyright (c) 2011-2017 [JP Richardson](https://github.com/jprichardson)


@@ -0,0 +1,169 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const mkdirsSync = require('../mkdirs').mkdirsSync
const utimesMillisSync = require('../util/utimes').utimesMillisSync
const stat = require('../util/stat')
function copySync (src, dest, opts) {
if (typeof opts === 'function') {
opts = { filter: opts }
}
opts = opts || {}
opts.clobber = 'clobber' in opts ? !!opts.clobber : true // default to true for now
opts.overwrite = 'overwrite' in opts ? !!opts.overwrite : opts.clobber // overwrite falls back to clobber
// Warn about using preserveTimestamps on 32-bit node
if (opts.preserveTimestamps && process.arch === 'ia32') {
process.emitWarning(
'Using the preserveTimestamps option in 32-bit node is not recommended;\n\n' +
'\tsee https://github.com/jprichardson/node-fs-extra/issues/269',
'Warning', 'fs-extra-WARN0002'
)
}
const { srcStat, destStat } = stat.checkPathsSync(src, dest, 'copy', opts)
stat.checkParentPathsSync(src, srcStat, dest, 'copy')
return handleFilterAndCopy(destStat, src, dest, opts)
}
function handleFilterAndCopy (destStat, src, dest, opts) {
if (opts.filter && !opts.filter(src, dest)) return
const destParent = path.dirname(dest)
if (!fs.existsSync(destParent)) mkdirsSync(destParent)
return getStats(destStat, src, dest, opts)
}
function startCopy (destStat, src, dest, opts) {
if (opts.filter && !opts.filter(src, dest)) return
return getStats(destStat, src, dest, opts)
}
function getStats (destStat, src, dest, opts) {
const statSync = opts.dereference ? fs.statSync : fs.lstatSync
const srcStat = statSync(src)
if (srcStat.isDirectory()) return onDir(srcStat, destStat, src, dest, opts)
else if (srcStat.isFile() ||
srcStat.isCharacterDevice() ||
srcStat.isBlockDevice()) return onFile(srcStat, destStat, src, dest, opts)
else if (srcStat.isSymbolicLink()) return onLink(destStat, src, dest, opts)
else if (srcStat.isSocket()) throw new Error(`Cannot copy a socket file: ${src}`)
else if (srcStat.isFIFO()) throw new Error(`Cannot copy a FIFO pipe: ${src}`)
throw new Error(`Unknown file: ${src}`)
}
function onFile (srcStat, destStat, src, dest, opts) {
if (!destStat) return copyFile(srcStat, src, dest, opts)
return mayCopyFile(srcStat, src, dest, opts)
}
function mayCopyFile (srcStat, src, dest, opts) {
if (opts.overwrite) {
fs.unlinkSync(dest)
return copyFile(srcStat, src, dest, opts)
} else if (opts.errorOnExist) {
throw new Error(`'${dest}' already exists`)
}
}
function copyFile (srcStat, src, dest, opts) {
fs.copyFileSync(src, dest)
if (opts.preserveTimestamps) handleTimestamps(srcStat.mode, src, dest)
return setDestMode(dest, srcStat.mode)
}
function handleTimestamps (srcMode, src, dest) {
// Make sure the file is writable before setting the timestamp
// otherwise open fails with EPERM when invoked with 'r+'
// (through utimes call)
if (fileIsNotWritable(srcMode)) makeFileWritable(dest, srcMode)
return setDestTimestamps(src, dest)
}
function fileIsNotWritable (srcMode) {
return (srcMode & 0o200) === 0
}
function makeFileWritable (dest, srcMode) {
return setDestMode(dest, srcMode | 0o200)
}
function setDestMode (dest, srcMode) {
return fs.chmodSync(dest, srcMode)
}
function setDestTimestamps (src, dest) {
// The initial srcStat.atime cannot be trusted
// because it is modified by the read(2) system call
// (See https://nodejs.org/api/fs.html#fs_stat_time_values)
const updatedSrcStat = fs.statSync(src)
return utimesMillisSync(dest, updatedSrcStat.atime, updatedSrcStat.mtime)
}
function onDir (srcStat, destStat, src, dest, opts) {
if (!destStat) return mkDirAndCopy(srcStat.mode, src, dest, opts)
return copyDir(src, dest, opts)
}
function mkDirAndCopy (srcMode, src, dest, opts) {
fs.mkdirSync(dest)
copyDir(src, dest, opts)
return setDestMode(dest, srcMode)
}
function copyDir (src, dest, opts) {
fs.readdirSync(src).forEach(item => copyDirItem(item, src, dest, opts))
}
function copyDirItem (item, src, dest, opts) {
const srcItem = path.join(src, item)
const destItem = path.join(dest, item)
const { destStat } = stat.checkPathsSync(srcItem, destItem, 'copy', opts)
return startCopy(destStat, srcItem, destItem, opts)
}
function onLink (destStat, src, dest, opts) {
let resolvedSrc = fs.readlinkSync(src)
if (opts.dereference) {
resolvedSrc = path.resolve(process.cwd(), resolvedSrc)
}
if (!destStat) {
return fs.symlinkSync(resolvedSrc, dest)
} else {
let resolvedDest
try {
resolvedDest = fs.readlinkSync(dest)
} catch (err) {
// dest exists and is a regular file or directory,
// Windows may throw UNKNOWN error. If dest already exists,
// fs throws error anyway, so no need to guard against it here.
if (err.code === 'EINVAL' || err.code === 'UNKNOWN') return fs.symlinkSync(resolvedSrc, dest)
throw err
}
if (opts.dereference) {
resolvedDest = path.resolve(process.cwd(), resolvedDest)
}
if (stat.isSrcSubdir(resolvedSrc, resolvedDest)) {
throw new Error(`Cannot copy '${resolvedSrc}' to a subdirectory of itself, '${resolvedDest}'.`)
}
// prevent copy if src is a subdir of dest since unlinking
// dest in this case would result in removing src contents
// and therefore a broken symlink would be created.
if (fs.statSync(dest).isDirectory() && stat.isSrcSubdir(resolvedDest, resolvedSrc)) {
throw new Error(`Cannot overwrite '${resolvedDest}' with '${resolvedSrc}'.`)
}
return copyLink(resolvedSrc, dest)
}
}
function copyLink (resolvedSrc, dest) {
fs.unlinkSync(dest)
return fs.symlinkSync(resolvedSrc, dest)
}
module.exports = copySync


@@ -0,0 +1,235 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const mkdirs = require('../mkdirs').mkdirs
const pathExists = require('../path-exists').pathExists
const utimesMillis = require('../util/utimes').utimesMillis
const stat = require('../util/stat')
function copy (src, dest, opts, cb) {
if (typeof opts === 'function' && !cb) {
cb = opts
opts = {}
} else if (typeof opts === 'function') {
opts = { filter: opts }
}
cb = cb || function () {}
opts = opts || {}
opts.clobber = 'clobber' in opts ? !!opts.clobber : true // default to true for now
opts.overwrite = 'overwrite' in opts ? !!opts.overwrite : opts.clobber // overwrite falls back to clobber
// Warn about using preserveTimestamps on 32-bit node
if (opts.preserveTimestamps && process.arch === 'ia32') {
process.emitWarning(
'Using the preserveTimestamps option in 32-bit node is not recommended;\n\n' +
'\tsee https://github.com/jprichardson/node-fs-extra/issues/269',
'Warning', 'fs-extra-WARN0001'
)
}
stat.checkPaths(src, dest, 'copy', opts, (err, stats) => {
if (err) return cb(err)
const { srcStat, destStat } = stats
stat.checkParentPaths(src, srcStat, dest, 'copy', err => {
if (err) return cb(err)
if (opts.filter) return handleFilter(checkParentDir, destStat, src, dest, opts, cb)
return checkParentDir(destStat, src, dest, opts, cb)
})
})
}
function checkParentDir (destStat, src, dest, opts, cb) {
const destParent = path.dirname(dest)
pathExists(destParent, (err, dirExists) => {
if (err) return cb(err)
if (dirExists) return getStats(destStat, src, dest, opts, cb)
mkdirs(destParent, err => {
if (err) return cb(err)
return getStats(destStat, src, dest, opts, cb)
})
})
}
function handleFilter (onInclude, destStat, src, dest, opts, cb) {
Promise.resolve(opts.filter(src, dest)).then(include => {
if (include) return onInclude(destStat, src, dest, opts, cb)
return cb()
}, error => cb(error))
}
function startCopy (destStat, src, dest, opts, cb) {
if (opts.filter) return handleFilter(getStats, destStat, src, dest, opts, cb)
return getStats(destStat, src, dest, opts, cb)
}
function getStats (destStat, src, dest, opts, cb) {
const stat = opts.dereference ? fs.stat : fs.lstat
stat(src, (err, srcStat) => {
if (err) return cb(err)
if (srcStat.isDirectory()) return onDir(srcStat, destStat, src, dest, opts, cb)
else if (srcStat.isFile() ||
srcStat.isCharacterDevice() ||
srcStat.isBlockDevice()) return onFile(srcStat, destStat, src, dest, opts, cb)
else if (srcStat.isSymbolicLink()) return onLink(destStat, src, dest, opts, cb)
else if (srcStat.isSocket()) return cb(new Error(`Cannot copy a socket file: ${src}`))
else if (srcStat.isFIFO()) return cb(new Error(`Cannot copy a FIFO pipe: ${src}`))
return cb(new Error(`Unknown file: ${src}`))
})
}
function onFile (srcStat, destStat, src, dest, opts, cb) {
if (!destStat) return copyFile(srcStat, src, dest, opts, cb)
return mayCopyFile(srcStat, src, dest, opts, cb)
}
function mayCopyFile (srcStat, src, dest, opts, cb) {
if (opts.overwrite) {
fs.unlink(dest, err => {
if (err) return cb(err)
return copyFile(srcStat, src, dest, opts, cb)
})
} else if (opts.errorOnExist) {
return cb(new Error(`'${dest}' already exists`))
} else return cb()
}
function copyFile (srcStat, src, dest, opts, cb) {
fs.copyFile(src, dest, err => {
if (err) return cb(err)
if (opts.preserveTimestamps) return handleTimestampsAndMode(srcStat.mode, src, dest, cb)
return setDestMode(dest, srcStat.mode, cb)
})
}
function handleTimestampsAndMode (srcMode, src, dest, cb) {
// Make sure the file is writable before setting the timestamp
// otherwise open fails with EPERM when invoked with 'r+'
// (through utimes call)
if (fileIsNotWritable(srcMode)) {
return makeFileWritable(dest, srcMode, err => {
if (err) return cb(err)
return setDestTimestampsAndMode(srcMode, src, dest, cb)
})
}
return setDestTimestampsAndMode(srcMode, src, dest, cb)
}
function fileIsNotWritable (srcMode) {
return (srcMode & 0o200) === 0
}
function makeFileWritable (dest, srcMode, cb) {
return setDestMode(dest, srcMode | 0o200, cb)
}
function setDestTimestampsAndMode (srcMode, src, dest, cb) {
setDestTimestamps(src, dest, err => {
if (err) return cb(err)
return setDestMode(dest, srcMode, cb)
})
}
function setDestMode (dest, srcMode, cb) {
return fs.chmod(dest, srcMode, cb)
}
function setDestTimestamps (src, dest, cb) {
// The initial srcStat.atime cannot be trusted
// because it is modified by the read(2) system call
// (See https://nodejs.org/api/fs.html#fs_stat_time_values)
fs.stat(src, (err, updatedSrcStat) => {
if (err) return cb(err)
return utimesMillis(dest, updatedSrcStat.atime, updatedSrcStat.mtime, cb)
})
}
function onDir (srcStat, destStat, src, dest, opts, cb) {
if (!destStat) return mkDirAndCopy(srcStat.mode, src, dest, opts, cb)
return copyDir(src, dest, opts, cb)
}
function mkDirAndCopy (srcMode, src, dest, opts, cb) {
fs.mkdir(dest, err => {
if (err) return cb(err)
copyDir(src, dest, opts, err => {
if (err) return cb(err)
return setDestMode(dest, srcMode, cb)
})
})
}
function copyDir (src, dest, opts, cb) {
fs.readdir(src, (err, items) => {
if (err) return cb(err)
return copyDirItems(items, src, dest, opts, cb)
})
}
function copyDirItems (items, src, dest, opts, cb) {
const item = items.pop()
if (!item) return cb()
return copyDirItem(items, item, src, dest, opts, cb)
}
function copyDirItem (items, item, src, dest, opts, cb) {
const srcItem = path.join(src, item)
const destItem = path.join(dest, item)
stat.checkPaths(srcItem, destItem, 'copy', opts, (err, stats) => {
if (err) return cb(err)
const { destStat } = stats
startCopy(destStat, srcItem, destItem, opts, err => {
if (err) return cb(err)
return copyDirItems(items, src, dest, opts, cb)
})
})
}
function onLink (destStat, src, dest, opts, cb) {
fs.readlink(src, (err, resolvedSrc) => {
if (err) return cb(err)
if (opts.dereference) {
resolvedSrc = path.resolve(process.cwd(), resolvedSrc)
}
if (!destStat) {
return fs.symlink(resolvedSrc, dest, cb)
} else {
fs.readlink(dest, (err, resolvedDest) => {
if (err) {
// dest exists and is a regular file or directory,
// Windows may throw UNKNOWN error. If dest already exists,
// fs throws error anyway, so no need to guard against it here.
if (err.code === 'EINVAL' || err.code === 'UNKNOWN') return fs.symlink(resolvedSrc, dest, cb)
return cb(err)
}
if (opts.dereference) {
resolvedDest = path.resolve(process.cwd(), resolvedDest)
}
if (stat.isSrcSubdir(resolvedSrc, resolvedDest)) {
return cb(new Error(`Cannot copy '${resolvedSrc}' to a subdirectory of itself, '${resolvedDest}'.`))
}
// do not copy if src is a subdir of dest since unlinking
// dest in this case would result in removing src contents
// and therefore a broken symlink would be created.
if (destStat.isDirectory() && stat.isSrcSubdir(resolvedDest, resolvedSrc)) {
return cb(new Error(`Cannot overwrite '${resolvedDest}' with '${resolvedSrc}'.`))
}
return copyLink(resolvedSrc, dest, cb)
})
}
})
}
function copyLink (resolvedSrc, dest, cb) {
fs.unlink(dest, err => {
if (err) return cb(err)
return fs.symlink(resolvedSrc, dest, cb)
})
}
module.exports = copy


@@ -0,0 +1,7 @@
'use strict'
const u = require('universalify').fromCallback
module.exports = {
copy: u(require('./copy')),
copySync: require('./copy-sync')
}


@@ -0,0 +1,39 @@
'use strict'
const u = require('universalify').fromPromise
const fs = require('../fs')
const path = require('path')
const mkdir = require('../mkdirs')
const remove = require('../remove')
const emptyDir = u(async function emptyDir (dir) {
let items
try {
items = await fs.readdir(dir)
} catch {
return mkdir.mkdirs(dir)
}
return Promise.all(items.map(item => remove.remove(path.join(dir, item))))
})
function emptyDirSync (dir) {
let items
try {
items = fs.readdirSync(dir)
} catch {
return mkdir.mkdirsSync(dir)
}
items.forEach(item => {
item = path.join(dir, item)
remove.removeSync(item)
})
}
module.exports = {
emptyDirSync,
emptydirSync: emptyDirSync,
emptyDir,
emptydir: emptyDir
}


@@ -0,0 +1,69 @@
'use strict'
const u = require('universalify').fromCallback
const path = require('path')
const fs = require('graceful-fs')
const mkdir = require('../mkdirs')
function createFile (file, callback) {
function makeFile () {
fs.writeFile(file, '', err => {
if (err) return callback(err)
callback()
})
}
fs.stat(file, (err, stats) => { // eslint-disable-line handle-callback-err
if (!err && stats.isFile()) return callback()
const dir = path.dirname(file)
fs.stat(dir, (err, stats) => {
if (err) {
// if the directory doesn't exist, make it
if (err.code === 'ENOENT') {
return mkdir.mkdirs(dir, err => {
if (err) return callback(err)
makeFile()
})
}
return callback(err)
}
if (stats.isDirectory()) makeFile()
else {
// parent is not a directory
// This is just to cause an internal ENOTDIR error to be thrown
fs.readdir(dir, err => {
if (err) return callback(err)
})
}
})
})
}
function createFileSync (file) {
let stats
try {
stats = fs.statSync(file)
} catch {}
if (stats && stats.isFile()) return
const dir = path.dirname(file)
try {
if (!fs.statSync(dir).isDirectory()) {
// parent is not a directory
// This is just to cause an internal ENOTDIR error to be thrown
fs.readdirSync(dir)
}
} catch (err) {
// If the stat call above failed because the directory doesn't exist, create it
if (err && err.code === 'ENOENT') mkdir.mkdirsSync(dir)
else throw err
}
fs.writeFileSync(file, '')
}
module.exports = {
createFile: u(createFile),
createFileSync
}


@@ -0,0 +1,23 @@
'use strict'
const { createFile, createFileSync } = require('./file')
const { createLink, createLinkSync } = require('./link')
const { createSymlink, createSymlinkSync } = require('./symlink')
module.exports = {
// file
createFile,
createFileSync,
ensureFile: createFile,
ensureFileSync: createFileSync,
// link
createLink,
createLinkSync,
ensureLink: createLink,
ensureLinkSync: createLinkSync,
// symlink
createSymlink,
createSymlinkSync,
ensureSymlink: createSymlink,
ensureSymlinkSync: createSymlinkSync
}


@@ -0,0 +1,64 @@
'use strict'
const u = require('universalify').fromCallback
const path = require('path')
const fs = require('graceful-fs')
const mkdir = require('../mkdirs')
const pathExists = require('../path-exists').pathExists
const { areIdentical } = require('../util/stat')
function createLink (srcpath, dstpath, callback) {
function makeLink (srcpath, dstpath) {
fs.link(srcpath, dstpath, err => {
if (err) return callback(err)
callback(null)
})
}
fs.lstat(dstpath, (_, dstStat) => {
fs.lstat(srcpath, (err, srcStat) => {
if (err) {
err.message = err.message.replace('lstat', 'ensureLink')
return callback(err)
}
if (dstStat && areIdentical(srcStat, dstStat)) return callback(null)
const dir = path.dirname(dstpath)
pathExists(dir, (err, dirExists) => {
if (err) return callback(err)
if (dirExists) return makeLink(srcpath, dstpath)
mkdir.mkdirs(dir, err => {
if (err) return callback(err)
makeLink(srcpath, dstpath)
})
})
})
})
}
function createLinkSync (srcpath, dstpath) {
let dstStat
try {
dstStat = fs.lstatSync(dstpath)
} catch {}
try {
const srcStat = fs.lstatSync(srcpath)
if (dstStat && areIdentical(srcStat, dstStat)) return
} catch (err) {
err.message = err.message.replace('lstat', 'ensureLink')
throw err
}
const dir = path.dirname(dstpath)
const dirExists = fs.existsSync(dir)
if (dirExists) return fs.linkSync(srcpath, dstpath)
mkdir.mkdirsSync(dir)
return fs.linkSync(srcpath, dstpath)
}
module.exports = {
createLink: u(createLink),
createLinkSync
}


@@ -0,0 +1,99 @@
'use strict'
const path = require('path')
const fs = require('graceful-fs')
const pathExists = require('../path-exists').pathExists
/**
* Function that returns two types of paths, one relative to symlink, and one
* relative to the current working directory. Checks if path is absolute or
* relative. If the path is relative, this function checks if the path is
* relative to symlink or relative to current working directory. This is an
* initiative to find a smarter `srcpath` to supply when building symlinks.
* This allows you to determine which path to use out of one of three possible
* types of source paths. The first is an absolute path. This is detected by
* `path.isAbsolute()`. When an absolute path is provided, it is checked to
* see if it exists. If it does it's used, if not an error is returned
* (callback) / thrown (sync). The other two options for `srcpath` are a
* relative path. By default Node's `fs.symlink` works by creating a symlink
* using `dstpath` and expects the `srcpath` to be relative to the newly
* created symlink. If you provide a `srcpath` that does not exist on the file
* system it results in a broken symlink. To minimize this, the function
* checks to see if the 'relative to symlink' source file exists, and if it
* does, it will use it. If it does not, it checks if there's a file that
* exists relative to the current working directory, and if so, that is used.
* This preserves the expectations of the original fs.symlink spec and adds
* the ability to pass in `relative to current working directory` paths.
*/
function symlinkPaths (srcpath, dstpath, callback) {
if (path.isAbsolute(srcpath)) {
return fs.lstat(srcpath, (err) => {
if (err) {
err.message = err.message.replace('lstat', 'ensureSymlink')
return callback(err)
}
return callback(null, {
toCwd: srcpath,
toDst: srcpath
})
})
} else {
const dstdir = path.dirname(dstpath)
const relativeToDst = path.join(dstdir, srcpath)
return pathExists(relativeToDst, (err, exists) => {
if (err) return callback(err)
if (exists) {
return callback(null, {
toCwd: relativeToDst,
toDst: srcpath
})
} else {
return fs.lstat(srcpath, (err) => {
if (err) {
err.message = err.message.replace('lstat', 'ensureSymlink')
return callback(err)
}
return callback(null, {
toCwd: srcpath,
toDst: path.relative(dstdir, srcpath)
})
})
}
})
}
}
function symlinkPathsSync (srcpath, dstpath) {
let exists
if (path.isAbsolute(srcpath)) {
exists = fs.existsSync(srcpath)
if (!exists) throw new Error('absolute srcpath does not exist')
return {
toCwd: srcpath,
toDst: srcpath
}
} else {
const dstdir = path.dirname(dstpath)
const relativeToDst = path.join(dstdir, srcpath)
exists = fs.existsSync(relativeToDst)
if (exists) {
return {
toCwd: relativeToDst,
toDst: srcpath
}
} else {
exists = fs.existsSync(srcpath)
if (!exists) throw new Error('relative srcpath does not exist')
return {
toCwd: srcpath,
toDst: path.relative(dstdir, srcpath)
}
}
}
}
module.exports = {
symlinkPaths,
symlinkPathsSync
}


@@ -0,0 +1,31 @@
'use strict'
const fs = require('graceful-fs')
function symlinkType (srcpath, type, callback) {
callback = (typeof type === 'function') ? type : callback
type = (typeof type === 'function') ? false : type
if (type) return callback(null, type)
fs.lstat(srcpath, (err, stats) => {
if (err) return callback(null, 'file')
type = (stats && stats.isDirectory()) ? 'dir' : 'file'
callback(null, type)
})
}
function symlinkTypeSync (srcpath, type) {
let stats
if (type) return type
try {
stats = fs.lstatSync(srcpath)
} catch {
return 'file'
}
return (stats && stats.isDirectory()) ? 'dir' : 'file'
}
module.exports = {
symlinkType,
symlinkTypeSync
}


@@ -0,0 +1,82 @@
'use strict'
const u = require('universalify').fromCallback
const path = require('path')
const fs = require('../fs')
const _mkdirs = require('../mkdirs')
const mkdirs = _mkdirs.mkdirs
const mkdirsSync = _mkdirs.mkdirsSync
const _symlinkPaths = require('./symlink-paths')
const symlinkPaths = _symlinkPaths.symlinkPaths
const symlinkPathsSync = _symlinkPaths.symlinkPathsSync
const _symlinkType = require('./symlink-type')
const symlinkType = _symlinkType.symlinkType
const symlinkTypeSync = _symlinkType.symlinkTypeSync
const pathExists = require('../path-exists').pathExists
const { areIdentical } = require('../util/stat')
function createSymlink (srcpath, dstpath, type, callback) {
callback = (typeof type === 'function') ? type : callback
type = (typeof type === 'function') ? false : type
fs.lstat(dstpath, (err, stats) => {
if (!err && stats.isSymbolicLink()) {
Promise.all([
fs.stat(srcpath),
fs.stat(dstpath)
]).then(([srcStat, dstStat]) => {
if (areIdentical(srcStat, dstStat)) return callback(null)
_createSymlink(srcpath, dstpath, type, callback)
})
} else _createSymlink(srcpath, dstpath, type, callback)
})
}
function _createSymlink (srcpath, dstpath, type, callback) {
symlinkPaths(srcpath, dstpath, (err, relative) => {
if (err) return callback(err)
srcpath = relative.toDst
symlinkType(relative.toCwd, type, (err, type) => {
if (err) return callback(err)
const dir = path.dirname(dstpath)
pathExists(dir, (err, dirExists) => {
if (err) return callback(err)
if (dirExists) return fs.symlink(srcpath, dstpath, type, callback)
mkdirs(dir, err => {
if (err) return callback(err)
fs.symlink(srcpath, dstpath, type, callback)
})
})
})
})
}
function createSymlinkSync (srcpath, dstpath, type) {
let stats
try {
stats = fs.lstatSync(dstpath)
} catch {}
if (stats && stats.isSymbolicLink()) {
const srcStat = fs.statSync(srcpath)
const dstStat = fs.statSync(dstpath)
if (areIdentical(srcStat, dstStat)) return
}
const relative = symlinkPathsSync(srcpath, dstpath)
srcpath = relative.toDst
type = symlinkTypeSync(relative.toCwd, type)
const dir = path.dirname(dstpath)
const exists = fs.existsSync(dir)
if (exists) return fs.symlinkSync(srcpath, dstpath, type)
mkdirsSync(dir)
return fs.symlinkSync(srcpath, dstpath, type)
}
module.exports = {
createSymlink: u(createSymlink),
createSymlinkSync
}


@@ -0,0 +1,128 @@
'use strict'
// This is adapted from https://github.com/normalize/mz
// Copyright (c) 2014-2016 Jonathan Ong me@jongleberry.com and Contributors
const u = require('universalify').fromCallback
const fs = require('graceful-fs')
const api = [
'access',
'appendFile',
'chmod',
'chown',
'close',
'copyFile',
'fchmod',
'fchown',
'fdatasync',
'fstat',
'fsync',
'ftruncate',
'futimes',
'lchmod',
'lchown',
'link',
'lstat',
'mkdir',
'mkdtemp',
'open',
'opendir',
'readdir',
'readFile',
'readlink',
'realpath',
'rename',
'rm',
'rmdir',
'stat',
'symlink',
'truncate',
'unlink',
'utimes',
'writeFile'
].filter(key => {
// Some commands are not available on some systems. Ex:
// fs.opendir was added in Node.js v12.12.0
// fs.rm was added in Node.js v14.14.0
// fs.lchown is not available on at least some Linux
return typeof fs[key] === 'function'
})
// Export cloned fs:
Object.assign(exports, fs)
// Universalify async methods:
api.forEach(method => {
exports[method] = u(fs[method])
})
// We differ from mz/fs in that we still ship the old, broken, fs.exists()
// since we are a drop-in replacement for the native module
exports.exists = function (filename, callback) {
if (typeof callback === 'function') {
return fs.exists(filename, callback)
}
return new Promise(resolve => {
return fs.exists(filename, resolve)
})
}
// fs.read(), fs.write(), & fs.writev() need special treatment due to multiple callback args
exports.read = function (fd, buffer, offset, length, position, callback) {
if (typeof callback === 'function') {
return fs.read(fd, buffer, offset, length, position, callback)
}
return new Promise((resolve, reject) => {
fs.read(fd, buffer, offset, length, position, (err, bytesRead, buffer) => {
if (err) return reject(err)
resolve({ bytesRead, buffer })
})
})
}
// Function signature can be
// fs.write(fd, buffer[, offset[, length[, position]]], callback)
// OR
// fs.write(fd, string[, position[, encoding]], callback)
// We need to handle both cases, so we use ...args
exports.write = function (fd, buffer, ...args) {
if (typeof args[args.length - 1] === 'function') {
return fs.write(fd, buffer, ...args)
}
return new Promise((resolve, reject) => {
fs.write(fd, buffer, ...args, (err, bytesWritten, buffer) => {
if (err) return reject(err)
resolve({ bytesWritten, buffer })
})
})
}
// fs.writev only available in Node v12.9.0+
if (typeof fs.writev === 'function') {
// Function signature is
// fs.writev(fd, buffers[, position], callback)
// We need to handle the optional arg, so we use ...args
exports.writev = function (fd, buffers, ...args) {
if (typeof args[args.length - 1] === 'function') {
return fs.writev(fd, buffers, ...args)
}
return new Promise((resolve, reject) => {
fs.writev(fd, buffers, ...args, (err, bytesWritten, buffers) => {
if (err) return reject(err)
resolve({ bytesWritten, buffers })
})
})
}
}
// fs.realpath.native sometimes not available if fs is monkey-patched
if (typeof fs.realpath.native === 'function') {
exports.realpath.native = u(fs.realpath.native)
} else {
process.emitWarning(
'fs.realpath.native is not a function. Is fs being monkey-patched?',
'Warning', 'fs-extra-WARN0003'
)
}


@@ -0,0 +1,16 @@
'use strict'
module.exports = {
// Export promisified graceful-fs:
...require('./fs'),
// Export extra methods:
...require('./copy'),
...require('./empty'),
...require('./ensure'),
...require('./json'),
...require('./mkdirs'),
...require('./move'),
...require('./output-file'),
...require('./path-exists'),
...require('./remove')
}


@@ -0,0 +1,16 @@
'use strict'
const u = require('universalify').fromPromise
const jsonFile = require('./jsonfile')
jsonFile.outputJson = u(require('./output-json'))
jsonFile.outputJsonSync = require('./output-json-sync')
// aliases
jsonFile.outputJSON = jsonFile.outputJson
jsonFile.outputJSONSync = jsonFile.outputJsonSync
jsonFile.writeJSON = jsonFile.writeJson
jsonFile.writeJSONSync = jsonFile.writeJsonSync
jsonFile.readJSON = jsonFile.readJson
jsonFile.readJSONSync = jsonFile.readJsonSync
module.exports = jsonFile


@@ -0,0 +1,11 @@
'use strict'
const jsonFile = require('jsonfile')
module.exports = {
// jsonfile exports
readJson: jsonFile.readFile,
readJsonSync: jsonFile.readFileSync,
writeJson: jsonFile.writeFile,
writeJsonSync: jsonFile.writeFileSync
}


@@ -0,0 +1,12 @@
'use strict'
const { stringify } = require('jsonfile/utils')
const { outputFileSync } = require('../output-file')
function outputJsonSync (file, data, options) {
const str = stringify(data, options)
outputFileSync(file, str, options)
}
module.exports = outputJsonSync


@@ -0,0 +1,12 @@
'use strict'
const { stringify } = require('jsonfile/utils')
const { outputFile } = require('../output-file')
async function outputJson (file, data, options = {}) {
const str = stringify(data, options)
await outputFile(file, str, options)
}
module.exports = outputJson


@@ -0,0 +1,14 @@
'use strict'
const u = require('universalify').fromPromise
const { makeDir: _makeDir, makeDirSync } = require('./make-dir')
const makeDir = u(_makeDir)
module.exports = {
mkdirs: makeDir,
mkdirsSync: makeDirSync,
// alias
mkdirp: makeDir,
mkdirpSync: makeDirSync,
ensureDir: makeDir,
ensureDirSync: makeDirSync
}


@@ -0,0 +1,27 @@
'use strict'
const fs = require('../fs')
const { checkPath } = require('./utils')
const getMode = options => {
const defaults = { mode: 0o777 }
if (typeof options === 'number') return options
return ({ ...defaults, ...options }).mode
}
module.exports.makeDir = async (dir, options) => {
checkPath(dir)
return fs.mkdir(dir, {
mode: getMode(options),
recursive: true
})
}
module.exports.makeDirSync = (dir, options) => {
checkPath(dir)
return fs.mkdirSync(dir, {
mode: getMode(options),
recursive: true
})
}


@@ -0,0 +1,21 @@
// Adapted from https://github.com/sindresorhus/make-dir
// Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
// Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
'use strict'
const path = require('path')
// https://github.com/nodejs/node/issues/8987
// https://github.com/libuv/libuv/pull/1088
module.exports.checkPath = function checkPath (pth) {
if (process.platform === 'win32') {
const pathHasInvalidWinCharacters = /[<>:"|?*]/.test(pth.replace(path.parse(pth).root, ''))
if (pathHasInvalidWinCharacters) {
const error = new Error(`Path contains invalid characters: ${pth}`)
error.code = 'EINVAL'
throw error
}
}
}


@@ -0,0 +1,7 @@
'use strict'
const u = require('universalify').fromCallback
module.exports = {
move: u(require('./move')),
moveSync: require('./move-sync')
}


@@ -0,0 +1,54 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const copySync = require('../copy').copySync
const removeSync = require('../remove').removeSync
const mkdirpSync = require('../mkdirs').mkdirpSync
const stat = require('../util/stat')
function moveSync (src, dest, opts) {
opts = opts || {}
const overwrite = opts.overwrite || opts.clobber || false
const { srcStat, isChangingCase = false } = stat.checkPathsSync(src, dest, 'move', opts)
stat.checkParentPathsSync(src, srcStat, dest, 'move')
if (!isParentRoot(dest)) mkdirpSync(path.dirname(dest))
return doRename(src, dest, overwrite, isChangingCase)
}
function isParentRoot (dest) {
const parent = path.dirname(dest)
const parsedPath = path.parse(parent)
return parsedPath.root === parent
}
function doRename (src, dest, overwrite, isChangingCase) {
if (isChangingCase) return rename(src, dest, overwrite)
if (overwrite) {
removeSync(dest)
return rename(src, dest, overwrite)
}
if (fs.existsSync(dest)) throw new Error('dest already exists.')
return rename(src, dest, overwrite)
}
function rename (src, dest, overwrite) {
try {
fs.renameSync(src, dest)
} catch (err) {
if (err.code !== 'EXDEV') throw err
return moveAcrossDevice(src, dest, overwrite)
}
}
function moveAcrossDevice (src, dest, overwrite) {
const opts = {
overwrite,
errorOnExist: true
}
copySync(src, dest, opts)
return removeSync(src)
}
module.exports = moveSync


@@ -0,0 +1,75 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const copy = require('../copy').copy
const remove = require('../remove').remove
const mkdirp = require('../mkdirs').mkdirp
const pathExists = require('../path-exists').pathExists
const stat = require('../util/stat')
function move (src, dest, opts, cb) {
if (typeof opts === 'function') {
cb = opts
opts = {}
}
opts = opts || {}
const overwrite = opts.overwrite || opts.clobber || false
stat.checkPaths(src, dest, 'move', opts, (err, stats) => {
if (err) return cb(err)
const { srcStat, isChangingCase = false } = stats
stat.checkParentPaths(src, srcStat, dest, 'move', err => {
if (err) return cb(err)
if (isParentRoot(dest)) return doRename(src, dest, overwrite, isChangingCase, cb)
mkdirp(path.dirname(dest), err => {
if (err) return cb(err)
return doRename(src, dest, overwrite, isChangingCase, cb)
})
})
})
}
function isParentRoot (dest) {
const parent = path.dirname(dest)
const parsedPath = path.parse(parent)
return parsedPath.root === parent
}
function doRename (src, dest, overwrite, isChangingCase, cb) {
if (isChangingCase) return rename(src, dest, overwrite, cb)
if (overwrite) {
return remove(dest, err => {
if (err) return cb(err)
return rename(src, dest, overwrite, cb)
})
}
pathExists(dest, (err, destExists) => {
if (err) return cb(err)
if (destExists) return cb(new Error('dest already exists.'))
return rename(src, dest, overwrite, cb)
})
}
function rename (src, dest, overwrite, cb) {
fs.rename(src, dest, err => {
if (!err) return cb()
if (err.code !== 'EXDEV') return cb(err)
return moveAcrossDevice(src, dest, overwrite, cb)
})
}
function moveAcrossDevice (src, dest, overwrite, cb) {
const opts = {
overwrite,
errorOnExist: true
}
copy(src, dest, opts, err => {
if (err) return cb(err)
return remove(src, cb)
})
}
module.exports = move


@@ -0,0 +1,40 @@
'use strict'
const u = require('universalify').fromCallback
const fs = require('graceful-fs')
const path = require('path')
const mkdir = require('../mkdirs')
const pathExists = require('../path-exists').pathExists
function outputFile (file, data, encoding, callback) {
if (typeof encoding === 'function') {
callback = encoding
encoding = 'utf8'
}
const dir = path.dirname(file)
pathExists(dir, (err, itDoes) => {
if (err) return callback(err)
if (itDoes) return fs.writeFile(file, data, encoding, callback)
mkdir.mkdirs(dir, err => {
if (err) return callback(err)
fs.writeFile(file, data, encoding, callback)
})
})
}
function outputFileSync (file, ...args) {
const dir = path.dirname(file)
if (fs.existsSync(dir)) {
return fs.writeFileSync(file, ...args)
}
mkdir.mkdirsSync(dir)
fs.writeFileSync(file, ...args)
}
module.exports = {
outputFile: u(outputFile),
outputFileSync
}


@@ -0,0 +1,12 @@
'use strict'
const u = require('universalify').fromPromise
const fs = require('../fs')
function pathExists (path) {
return fs.access(path).then(() => true).catch(() => false)
}
module.exports = {
pathExists: u(pathExists),
pathExistsSync: fs.existsSync
}


@@ -0,0 +1,22 @@
'use strict'
const fs = require('graceful-fs')
const u = require('universalify').fromCallback
const rimraf = require('./rimraf')
function remove (path, callback) {
// Node 14.14.0+
if (fs.rm) return fs.rm(path, { recursive: true, force: true }, callback)
rimraf(path, callback)
}
function removeSync (path) {
// Node 14.14.0+
if (fs.rmSync) return fs.rmSync(path, { recursive: true, force: true })
rimraf.sync(path)
}
module.exports = {
remove: u(remove),
removeSync
}


@@ -0,0 +1,302 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const assert = require('assert')
const isWindows = (process.platform === 'win32')
function defaults (options) {
const methods = [
'unlink',
'chmod',
'stat',
'lstat',
'rmdir',
'readdir'
]
methods.forEach(m => {
options[m] = options[m] || fs[m]
m = m + 'Sync'
options[m] = options[m] || fs[m]
})
options.maxBusyTries = options.maxBusyTries || 3
}
function rimraf (p, options, cb) {
let busyTries = 0
if (typeof options === 'function') {
cb = options
options = {}
}
assert(p, 'rimraf: missing path')
assert.strictEqual(typeof p, 'string', 'rimraf: path should be a string')
assert.strictEqual(typeof cb, 'function', 'rimraf: callback function required')
assert(options, 'rimraf: invalid options argument provided')
assert.strictEqual(typeof options, 'object', 'rimraf: options should be object')
defaults(options)
rimraf_(p, options, function CB (er) {
if (er) {
if ((er.code === 'EBUSY' || er.code === 'ENOTEMPTY' || er.code === 'EPERM') &&
busyTries < options.maxBusyTries) {
busyTries++
const time = busyTries * 100
// try again, with the same exact callback as this one.
return setTimeout(() => rimraf_(p, options, CB), time)
}
// already gone
if (er.code === 'ENOENT') er = null
}
cb(er)
})
}
// Two possible strategies.
// 1. Assume it's a file. unlink it, then do the dir stuff on EPERM or EISDIR
// 2. Assume it's a directory. readdir, then do the file stuff on ENOTDIR
//
// Both result in an extra syscall when you guess wrong. However, there
// are likely far more normal files in the world than directories. This
// is based on the assumption that the average number of files per
// directory is >= 1.
//
// If anyone ever complains about this, then I guess the strategy could
// be made configurable somehow. But until then, YAGNI.
function rimraf_ (p, options, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
// sunos lets the root user unlink directories, which is... weird.
// so we have to lstat here and make sure it's not a dir.
options.lstat(p, (er, st) => {
if (er && er.code === 'ENOENT') {
return cb(null)
}
// Windows can EPERM on stat. Life is suffering.
if (er && er.code === 'EPERM' && isWindows) {
return fixWinEPERM(p, options, er, cb)
}
if (st && st.isDirectory()) {
return rmdir(p, options, er, cb)
}
options.unlink(p, er => {
if (er) {
if (er.code === 'ENOENT') {
return cb(null)
}
if (er.code === 'EPERM') {
return (isWindows)
? fixWinEPERM(p, options, er, cb)
: rmdir(p, options, er, cb)
}
if (er.code === 'EISDIR') {
return rmdir(p, options, er, cb)
}
}
return cb(er)
})
})
}
function fixWinEPERM (p, options, er, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
options.chmod(p, 0o666, er2 => {
if (er2) {
cb(er2.code === 'ENOENT' ? null : er)
} else {
options.stat(p, (er3, stats) => {
if (er3) {
cb(er3.code === 'ENOENT' ? null : er)
} else if (stats.isDirectory()) {
rmdir(p, options, er, cb)
} else {
options.unlink(p, cb)
}
})
}
})
}
function fixWinEPERMSync (p, options, er) {
let stats
assert(p)
assert(options)
try {
options.chmodSync(p, 0o666)
} catch (er2) {
if (er2.code === 'ENOENT') {
return
} else {
throw er
}
}
try {
stats = options.statSync(p)
} catch (er3) {
if (er3.code === 'ENOENT') {
return
} else {
throw er
}
}
if (stats.isDirectory()) {
rmdirSync(p, options, er)
} else {
options.unlinkSync(p)
}
}
function rmdir (p, options, originalEr, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
// try to rmdir first, and only readdir on ENOTEMPTY or EEXIST (SunOS)
// if we guessed wrong, and it's not a directory, then
// raise the original error.
options.rmdir(p, er => {
if (er && (er.code === 'ENOTEMPTY' || er.code === 'EEXIST' || er.code === 'EPERM')) {
rmkids(p, options, cb)
} else if (er && er.code === 'ENOTDIR') {
cb(originalEr)
} else {
cb(er)
}
})
}
function rmkids (p, options, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
options.readdir(p, (er, files) => {
if (er) return cb(er)
let n = files.length
let errState
if (n === 0) return options.rmdir(p, cb)
files.forEach(f => {
rimraf(path.join(p, f), options, er => {
if (errState) {
return
}
if (er) return cb(errState = er)
if (--n === 0) {
options.rmdir(p, cb)
}
})
})
})
}
// this looks simpler, and is strictly *faster*, but will
// tie up the JavaScript thread and fail on excessively
// deep directory trees.
function rimrafSync (p, options) {
let st
options = options || {}
defaults(options)
assert(p, 'rimraf: missing path')
assert.strictEqual(typeof p, 'string', 'rimraf: path should be a string')
assert(options, 'rimraf: missing options')
assert.strictEqual(typeof options, 'object', 'rimraf: options should be object')
try {
st = options.lstatSync(p)
} catch (er) {
if (er.code === 'ENOENT') {
return
}
// Windows can EPERM on stat. Life is suffering.
if (er.code === 'EPERM' && isWindows) {
fixWinEPERMSync(p, options, er)
}
}
try {
// sunos lets the root user unlink directories, which is... weird.
if (st && st.isDirectory()) {
rmdirSync(p, options, null)
} else {
options.unlinkSync(p)
}
} catch (er) {
if (er.code === 'ENOENT') {
return
} else if (er.code === 'EPERM') {
return isWindows ? fixWinEPERMSync(p, options, er) : rmdirSync(p, options, er)
} else if (er.code !== 'EISDIR') {
throw er
}
rmdirSync(p, options, er)
}
}
function rmdirSync (p, options, originalEr) {
assert(p)
assert(options)
try {
options.rmdirSync(p)
} catch (er) {
if (er.code === 'ENOTDIR') {
throw originalEr
} else if (er.code === 'ENOTEMPTY' || er.code === 'EEXIST' || er.code === 'EPERM') {
rmkidsSync(p, options)
} else if (er.code !== 'ENOENT') {
throw er
}
}
}
function rmkidsSync (p, options) {
assert(p)
assert(options)
options.readdirSync(p).forEach(f => rimrafSync(path.join(p, f), options))
if (isWindows) {
// We only end up here once we got ENOTEMPTY at least once, and
// at this point, we are guaranteed to have removed all the kids.
// So, we know that it won't be ENOENT or ENOTDIR or anything else.
// try really hard to delete stuff on windows, because it has a
// PROFOUNDLY annoying habit of not closing handles promptly when
// files are deleted, resulting in spurious ENOTEMPTY errors.
const startTime = Date.now()
do {
try {
const ret = options.rmdirSync(p, options)
return ret
} catch {}
} while (Date.now() - startTime < 500) // give up after 500ms
} else {
const ret = options.rmdirSync(p, options)
return ret
}
}
module.exports = rimraf
rimraf.sync = rimrafSync


@@ -0,0 +1,154 @@
'use strict'
const fs = require('../fs')
const path = require('path')
const util = require('util')
function getStats (src, dest, opts) {
const statFunc = opts.dereference
? (file) => fs.stat(file, { bigint: true })
: (file) => fs.lstat(file, { bigint: true })
return Promise.all([
statFunc(src),
statFunc(dest).catch(err => {
if (err.code === 'ENOENT') return null
throw err
})
]).then(([srcStat, destStat]) => ({ srcStat, destStat }))
}
function getStatsSync (src, dest, opts) {
let destStat
const statFunc = opts.dereference
? (file) => fs.statSync(file, { bigint: true })
: (file) => fs.lstatSync(file, { bigint: true })
const srcStat = statFunc(src)
try {
destStat = statFunc(dest)
} catch (err) {
if (err.code === 'ENOENT') return { srcStat, destStat: null }
throw err
}
return { srcStat, destStat }
}
function checkPaths (src, dest, funcName, opts, cb) {
util.callbackify(getStats)(src, dest, opts, (err, stats) => {
if (err) return cb(err)
const { srcStat, destStat } = stats
if (destStat) {
if (areIdentical(srcStat, destStat)) {
const srcBaseName = path.basename(src)
const destBaseName = path.basename(dest)
if (funcName === 'move' &&
srcBaseName !== destBaseName &&
srcBaseName.toLowerCase() === destBaseName.toLowerCase()) {
return cb(null, { srcStat, destStat, isChangingCase: true })
}
return cb(new Error('Source and destination must not be the same.'))
}
if (srcStat.isDirectory() && !destStat.isDirectory()) {
return cb(new Error(`Cannot overwrite non-directory '${dest}' with directory '${src}'.`))
}
if (!srcStat.isDirectory() && destStat.isDirectory()) {
return cb(new Error(`Cannot overwrite directory '${dest}' with non-directory '${src}'.`))
}
}
if (srcStat.isDirectory() && isSrcSubdir(src, dest)) {
return cb(new Error(errMsg(src, dest, funcName)))
}
return cb(null, { srcStat, destStat })
})
}
function checkPathsSync (src, dest, funcName, opts) {
const { srcStat, destStat } = getStatsSync(src, dest, opts)
if (destStat) {
if (areIdentical(srcStat, destStat)) {
const srcBaseName = path.basename(src)
const destBaseName = path.basename(dest)
if (funcName === 'move' &&
srcBaseName !== destBaseName &&
srcBaseName.toLowerCase() === destBaseName.toLowerCase()) {
return { srcStat, destStat, isChangingCase: true }
}
throw new Error('Source and destination must not be the same.')
}
if (srcStat.isDirectory() && !destStat.isDirectory()) {
throw new Error(`Cannot overwrite non-directory '${dest}' with directory '${src}'.`)
}
if (!srcStat.isDirectory() && destStat.isDirectory()) {
throw new Error(`Cannot overwrite directory '${dest}' with non-directory '${src}'.`)
}
}
if (srcStat.isDirectory() && isSrcSubdir(src, dest)) {
throw new Error(errMsg(src, dest, funcName))
}
return { srcStat, destStat }
}
// recursively check if dest parent is a subdirectory of src.
// It works for all file types including symlinks since it
// checks the src and dest inodes. It starts from the deepest
// parent and stops once it reaches the src parent or the root path.
function checkParentPaths (src, srcStat, dest, funcName, cb) {
const srcParent = path.resolve(path.dirname(src))
const destParent = path.resolve(path.dirname(dest))
if (destParent === srcParent || destParent === path.parse(destParent).root) return cb()
fs.stat(destParent, { bigint: true }, (err, destStat) => {
if (err) {
if (err.code === 'ENOENT') return cb()
return cb(err)
}
if (areIdentical(srcStat, destStat)) {
return cb(new Error(errMsg(src, dest, funcName)))
}
return checkParentPaths(src, srcStat, destParent, funcName, cb)
})
}
function checkParentPathsSync (src, srcStat, dest, funcName) {
const srcParent = path.resolve(path.dirname(src))
const destParent = path.resolve(path.dirname(dest))
if (destParent === srcParent || destParent === path.parse(destParent).root) return
let destStat
try {
destStat = fs.statSync(destParent, { bigint: true })
} catch (err) {
if (err.code === 'ENOENT') return
throw err
}
if (areIdentical(srcStat, destStat)) {
throw new Error(errMsg(src, dest, funcName))
}
return checkParentPathsSync(src, srcStat, destParent, funcName)
}
function areIdentical (srcStat, destStat) {
return destStat.ino && destStat.dev && destStat.ino === srcStat.ino && destStat.dev === srcStat.dev
}
// return true if dest is a subdir of src, otherwise false.
// It only checks the path strings.
function isSrcSubdir (src, dest) {
const srcArr = path.resolve(src).split(path.sep).filter(i => i)
const destArr = path.resolve(dest).split(path.sep).filter(i => i)
return srcArr.reduce((acc, cur, i) => acc && destArr[i] === cur, true)
}
function errMsg (src, dest, funcName) {
return `Cannot ${funcName} '${src}' to a subdirectory of itself, '${dest}'.`
}
module.exports = {
checkPaths,
checkPathsSync,
checkParentPaths,
checkParentPathsSync,
isSrcSubdir,
areIdentical
}
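
A short sketch of how these checks behave, assuming this file is fs-extra's `lib/util/stat.js` (its upstream location); the paths are illustrative:

```js
const stat = require('fs-extra/lib/util/stat')

// Pure path-string check: is dest inside src?
stat.isSrcSubdir('/data/project', '/data/project/backup')  // true
stat.isSrcSubdir('/data/project', '/data/project-backup')  // false (sibling, not a child)

// Full check used by copy/move: resolves stats and throws if src and dest are identical,
// if a directory would overwrite a non-directory (or vice versa), or if dest is inside src.
const { srcStat, destStat } = stat.checkPathsSync('/data/project', '/data/copy', 'copy', {})
console.log(srcStat.isDirectory(), destStat === null) // destStat is null when dest does not exist yet
```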


@@ -0,0 +1,26 @@
'use strict'
const fs = require('graceful-fs')
function utimesMillis (path, atime, mtime, callback) {
// if (!HAS_MILLIS_RES) return fs.utimes(path, atime, mtime, callback)
fs.open(path, 'r+', (err, fd) => {
if (err) return callback(err)
fs.futimes(fd, atime, mtime, futimesErr => {
fs.close(fd, closeErr => {
if (callback) callback(futimesErr || closeErr)
})
})
})
}
function utimesMillisSync (path, atime, mtime) {
const fd = fs.openSync(path, 'r+')
fs.futimesSync(fd, atime, mtime)
return fs.closeSync(fd)
}
module.exports = {
utimesMillis,
utimesMillisSync
}
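
A usage sketch, assuming this is fs-extra's `lib/util/utimes.js` (its upstream location); it updates a file's access and modification times through an open file descriptor so millisecond precision is preserved:

```js
const { utimesMillis, utimesMillisSync } = require('fs-extra/lib/util/utimes')

const when = new Date('2024-01-02T03:04:05.678Z')

// Callback form
utimesMillis('/tmp/example.txt', when, when, err => {
  if (err) return console.error(err)
  console.log('timestamps updated')
})

// Sync form
utimesMillisSync('/tmp/example.txt', when, when)
```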


@@ -0,0 +1,67 @@
{
"name": "fs-extra",
"version": "10.1.0",
"description": "fs-extra contains methods that aren't included in the vanilla Node.js fs package. Such as recursive mkdir, copy, and remove.",
"engines": {
"node": ">=12"
},
"homepage": "https://github.com/jprichardson/node-fs-extra",
"repository": {
"type": "git",
"url": "https://github.com/jprichardson/node-fs-extra"
},
"keywords": [
"fs",
"file",
"file system",
"copy",
"directory",
"extra",
"mkdirp",
"mkdir",
"mkdirs",
"recursive",
"json",
"read",
"write",
"extra",
"delete",
"remove",
"touch",
"create",
"text",
"output",
"move",
"promise"
],
"author": "JP Richardson <jprichardson@gmail.com>",
"license": "MIT",
"dependencies": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
"universalify": "^2.0.0"
},
"devDependencies": {
"at-least-node": "^1.0.0",
"klaw": "^2.1.1",
"klaw-sync": "^3.0.2",
"minimist": "^1.1.1",
"mocha": "^5.0.5",
"nyc": "^15.0.0",
"proxyquire": "^2.0.1",
"read-dir-files": "^0.1.1",
"standard": "^16.0.3"
},
"main": "./lib/index.js",
"files": [
"lib/",
"!lib/**/__tests__/"
],
"scripts": {
"lint": "standard",
"test-find": "find ./lib/**/__tests__ -name *.test.js | xargs mocha",
"test": "npm run lint && npm run unit",
"unit": "nyc node test.js"
},
"sideEffects": false
}


@@ -0,0 +1,15 @@
(The MIT License)
Copyright (c) 2012-2015, JP Richardson <jprichardson@gmail.com>
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files
(the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify,
merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -0,0 +1,230 @@
Node.js - jsonfile
================
Easily read/write JSON files in Node.js. _Note: this module cannot be used in the browser._
[![npm Package](https://img.shields.io/npm/v/jsonfile.svg?style=flat-square)](https://www.npmjs.org/package/jsonfile)
[![linux build status](https://img.shields.io/github/actions/workflow/status/jprichardson/node-jsonfile/ci.yml?branch=master)](https://github.com/jprichardson/node-jsonfile/actions?query=branch%3Amaster)
[![windows Build status](https://img.shields.io/appveyor/ci/jprichardson/node-jsonfile/master.svg?label=windows%20build)](https://ci.appveyor.com/project/jprichardson/node-jsonfile/branch/master)
<a href="https://github.com/feross/standard"><img src="https://cdn.rawgit.com/feross/standard/master/sticker.svg" alt="Standard JavaScript" width="100"></a>
Why?
----
Writing `JSON.stringify()` and then `fs.writeFile()` and `JSON.parse()` with `fs.readFile()` enclosed in `try/catch` blocks became annoying.
Installation
------------
npm install --save jsonfile
API
---
* [`readFile(filename, [options], callback)`](#readfilefilename-options-callback)
* [`readFileSync(filename, [options])`](#readfilesyncfilename-options)
* [`writeFile(filename, obj, [options], callback)`](#writefilefilename-obj-options-callback)
* [`writeFileSync(filename, obj, [options])`](#writefilesyncfilename-obj-options)
----
### readFile(filename, [options], callback)
`options` (`object`, default `undefined`): Pass in any [`fs.readFile`](https://nodejs.org/api/fs.html#fs_fs_readfile_path_options_callback) options or set `reviver` for a [JSON reviver](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse).
- `throws` (`boolean`, default: `true`). If `JSON.parse` throws an error, pass this error to the callback.
If `false`, returns `null` for the object.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file, function (err, obj) {
if (err) console.error(err)
console.dir(obj)
})
```
You can also use this method with promises. The `readFile` method will return a promise if you do not pass a callback function.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file)
.then(obj => console.dir(obj))
.catch(error => console.error(error))
```
----
### readFileSync(filename, [options])
`options` (`object`, default `undefined`): Pass in any [`fs.readFileSync`](https://nodejs.org/api/fs.html#fs_fs_readfilesync_path_options) options or set `reviver` for a [JSON reviver](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse).
- `throws` (`boolean`, default: `true`). If an error is encountered reading or parsing the file, throw the error. If `false`, returns `null` for the object.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
console.dir(jsonfile.readFileSync(file))
```
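
**using a `reviver` with `throws: false`** (an illustrative combination, not from the upstream docs: it revives ISO date strings and returns `null` instead of throwing on a missing or malformed file):
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = jsonfile.readFileSync(file, {
  throws: false,
  reviver: (key, value) =>
    typeof value === 'string' && /^\d{4}-\d{2}-\d{2}T/.test(value) ? new Date(value) : value
})
console.dir(obj)
```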
----
### writeFile(filename, obj, [options], callback)
`options`: Pass in any [`fs.writeFile`](https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback) options or set `replacer` for a [JSON replacer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify). Can also pass in `spaces`, or override `EOL` string or set `finalEOL` flag as `false` to not save the file with `EOL` at the end.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, function (err) {
if (err) console.error(err)
})
```
Or use with promises as follows:
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj)
.then(res => {
console.log('Write complete')
})
.catch(error => console.error(error))
```
**formatting with spaces:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2 }, function (err) {
if (err) console.error(err)
})
```
**overriding EOL:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2, EOL: '\r\n' }, function (err) {
if (err) console.error(err)
})
```
**disabling the EOL at the end of file:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2, finalEOL: false }, function (err) {
if (err) console.log(err)
})
```
**appending to an existing JSON file:**
You can use `fs.writeFile` option `{ flag: 'a' }` to achieve this.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { flag: 'a' }, function (err) {
if (err) console.error(err)
})
```
----
### writeFileSync(filename, obj, [options])
`options`: Pass in any [`fs.writeFileSync`](https://nodejs.org/api/fs.html#fs_fs_writefilesync_file_data_options) options or set `replacer` for a [JSON replacer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify). Can also pass in `spaces`, or override `EOL` string or set `finalEOL` flag as `false` to not save the file with `EOL` at the end.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj)
```
**formatting with spaces:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2 })
```
**overriding EOL:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2, EOL: '\r\n' })
```
**disabling the EOL at the end of file:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2, finalEOL: false })
```
**appending to an existing JSON file:**
You can use `fs.writeFileSync` option `{ flag: 'a' }` to achieve this.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { flag: 'a' })
```
License
-------
(MIT License)
Copyright 2012-2016, JP Richardson <jprichardson@gmail.com>


@@ -0,0 +1,88 @@
let _fs
try {
_fs = require('graceful-fs')
} catch (_) {
_fs = require('fs')
}
const universalify = require('universalify')
const { stringify, stripBom } = require('./utils')
async function _readFile (file, options = {}) {
if (typeof options === 'string') {
options = { encoding: options }
}
const fs = options.fs || _fs
const shouldThrow = 'throws' in options ? options.throws : true
let data = await universalify.fromCallback(fs.readFile)(file, options)
data = stripBom(data)
let obj
try {
obj = JSON.parse(data, options ? options.reviver : null)
} catch (err) {
if (shouldThrow) {
err.message = `${file}: ${err.message}`
throw err
} else {
return null
}
}
return obj
}
const readFile = universalify.fromPromise(_readFile)
function readFileSync (file, options = {}) {
if (typeof options === 'string') {
options = { encoding: options }
}
const fs = options.fs || _fs
const shouldThrow = 'throws' in options ? options.throws : true
try {
let content = fs.readFileSync(file, options)
content = stripBom(content)
return JSON.parse(content, options.reviver)
} catch (err) {
if (shouldThrow) {
err.message = `${file}: ${err.message}`
throw err
} else {
return null
}
}
}
async function _writeFile (file, obj, options = {}) {
const fs = options.fs || _fs
const str = stringify(obj, options)
await universalify.fromCallback(fs.writeFile)(file, str, options)
}
const writeFile = universalify.fromPromise(_writeFile)
function writeFileSync (file, obj, options = {}) {
const fs = options.fs || _fs
const str = stringify(obj, options)
// not sure if fs.writeFileSync returns anything, but just in case
return fs.writeFileSync(file, str, options)
}
// NOTE: do not change this export format; required for ESM compat
// see https://github.com/jprichardson/node-jsonfile/pull/162 for details
module.exports = {
readFile,
readFileSync,
writeFile,
writeFileSync
}


@@ -0,0 +1,40 @@
{
"name": "jsonfile",
"version": "6.2.0",
"description": "Easily read/write JSON files.",
"repository": {
"type": "git",
"url": "git@github.com:jprichardson/node-jsonfile.git"
},
"keywords": [
"read",
"write",
"file",
"json",
"fs",
"fs-extra"
],
"author": "JP Richardson <jprichardson@gmail.com>",
"license": "MIT",
"dependencies": {
"universalify": "^2.0.0"
},
"optionalDependencies": {
"graceful-fs": "^4.1.6"
},
"devDependencies": {
"mocha": "^8.2.0",
"rimraf": "^2.4.0",
"standard": "^16.0.1"
},
"main": "index.js",
"files": [
"index.js",
"utils.js"
],
"scripts": {
"lint": "standard",
"test": "npm run lint && npm run unit",
"unit": "mocha"
}
}


@@ -0,0 +1,14 @@
function stringify (obj, { EOL = '\n', finalEOL = true, replacer = null, spaces } = {}) {
const EOF = finalEOL ? EOL : ''
const str = JSON.stringify(obj, replacer, spaces)
return str.replace(/\n/g, EOL) + EOF
}
function stripBom (content) {
// we do this because JSON.parse would convert it to a utf8 string if encoding wasn't specified
if (Buffer.isBuffer(content)) content = content.toString('utf8')
return content.replace(/^\uFEFF/, '')
}
module.exports = { stringify, stripBom }
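
A quick illustration of what `stringify` and `stripBom` produce for a few option combinations; the values in the comments follow directly from the code above, and the require path assumes jsonfile's package layout:

```js
const { stringify, stripBom } = require('jsonfile/utils')

const obj = { name: 'JP' }

stringify(obj)                             // '{"name":"JP"}\n'
stringify(obj, { spaces: 2 })              // '{\n  "name": "JP"\n}\n'
stringify(obj, { spaces: 2, EOL: '\r\n' }) // same, but with CRLF line endings
stringify(obj, { finalEOL: false })        // '{"name":"JP"}' (no trailing newline)

stripBom('\uFEFF{"a":1}')                  // '{"a":1}'
```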


@@ -0,0 +1,20 @@
(The MIT License)
Copyright (c) 2017, Ryan Zimmerman <opensrc@ryanzim.com>
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the 'Software'), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -0,0 +1,76 @@
# universalify
![GitHub Workflow Status (branch)](https://img.shields.io/github/actions/workflow/status/RyanZim/universalify/ci.yml?branch=master)
![Coveralls github branch](https://img.shields.io/coveralls/github/RyanZim/universalify/master.svg)
![npm](https://img.shields.io/npm/dm/universalify.svg)
![npm](https://img.shields.io/npm/l/universalify.svg)
Make a callback- or promise-based function support both promises and callbacks.
Uses the native promise implementation.
## Installation
```bash
npm install universalify
```
## API
### `universalify.fromCallback(fn)`
Takes a callback-based function to universalify, and returns the universalified function.
Function must take a callback as the last parameter that will be called with the signature `(error, result)`. `universalify` does not support calling the callback with three or more arguments, and does not ensure that the callback is only called once.
```js
function callbackFn (n, cb) {
setTimeout(() => cb(null, n), 15)
}
const fn = universalify.fromCallback(callbackFn)
// Works with Promises:
fn('Hello World!')
.then(result => console.log(result)) // -> Hello World!
.catch(error => console.error(error))
// Works with Callbacks:
fn('Hi!', (error, result) => {
if (error) return console.error(error)
console.log(result)
// -> Hi!
})
```
### `universalify.fromPromise(fn)`
Takes a promise-based function to universalify, and returns the universalified function.
Function must return a valid JS promise. `universalify` does not ensure that a valid promise is returned.
```js
function promiseFn (n) {
return new Promise(resolve => {
setTimeout(() => resolve(n), 15)
})
}
const fn = universalify.fromPromise(promiseFn)
// Works with Promises:
fn('Hello World!')
.then(result => console.log(result)) // -> Hello World!
.catch(error => console.error(error))
// Works with Callbacks:
fn('Hi!', (error, result) => {
if (error) return console.error(error)
console.log(result)
// -> Hi!
})
```
## License
MIT


@@ -0,0 +1,24 @@
'use strict'
exports.fromCallback = function (fn) {
return Object.defineProperty(function (...args) {
if (typeof args[args.length - 1] === 'function') fn.apply(this, args)
else {
return new Promise((resolve, reject) => {
args.push((err, res) => (err != null) ? reject(err) : resolve(res))
fn.apply(this, args)
})
}
}, 'name', { value: fn.name })
}
exports.fromPromise = function (fn) {
return Object.defineProperty(function (...args) {
const cb = args[args.length - 1]
if (typeof cb !== 'function') return fn.apply(this, args)
else {
args.pop()
fn.apply(this, args).then(r => cb(null, r), cb)
}
}, 'name', { value: fn.name })
}


@@ -0,0 +1,34 @@
{
"name": "universalify",
"version": "2.0.1",
"description": "Make a callback- or promise-based function support both promises and callbacks.",
"keywords": [
"callback",
"native",
"promise"
],
"homepage": "https://github.com/RyanZim/universalify#readme",
"bugs": "https://github.com/RyanZim/universalify/issues",
"license": "MIT",
"author": "Ryan Zimmerman <opensrc@ryanzim.com>",
"files": [
"index.js"
],
"repository": {
"type": "git",
"url": "git+https://github.com/RyanZim/universalify.git"
},
"scripts": {
"test": "standard && nyc --reporter text --reporter lcovonly tape test/*.js | colortape"
},
"devDependencies": {
"colortape": "^0.1.2",
"coveralls": "^3.0.1",
"nyc": "^15.0.0",
"standard": "^14.3.1",
"tape": "^5.0.1"
},
"engines": {
"node": ">= 10.0.0"
}
}


@@ -0,0 +1,2 @@
export declare function getPath7za(): Promise<string>;
export declare function getPath7x(): Promise<string>;

18
desktop-operator/node_modules/builder-util/out/7za.js generated vendored Normal file

@@ -0,0 +1,18 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.getPath7za = getPath7za;
exports.getPath7x = getPath7x;
const _7zip_bin_1 = require("7zip-bin");
const fs_extra_1 = require("fs-extra");
const fs = require("fs");
async function getPath7za() {
if (fs.existsSync(_7zip_bin_1.path7za)) {
await (0, fs_extra_1.chmod)(_7zip_bin_1.path7za, 0o755);
}
return _7zip_bin_1.path7za;
}
async function getPath7x() {
await (0, fs_extra_1.chmod)(_7zip_bin_1.path7x, 0o755);
return _7zip_bin_1.path7x;
}
//# sourceMappingURL=7za.js.map
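
A usage sketch: `getPath7za` resolves the bundled 7za binary from `7zip-bin` and makes it executable before returning its path. The deep require path mirrors the compiled file shown above; the archive path and 7za arguments are illustrative:

```js
const { getPath7za } = require('builder-util/out/7za')
const { execFileSync } = require('child_process')

async function listArchive(archive) {
  const path7za = await getPath7za()
  // 'l -slt' lists archive entries with technical details
  return execFileSync(path7za, ['l', '-slt', archive], { encoding: 'utf8' })
}

listArchive('/tmp/app.7z').then(console.log).catch(console.error)
```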


@@ -0,0 +1 @@
{"version":3,"file":"7za.js","sourceRoot":"","sources":["../src/7za.ts"],"names":[],"mappings":";;AAIA,gCAKC;AAED,8BAGC;AAdD,wCAA0C;AAC1C,uCAAgC;AAChC,yBAAwB;AAEjB,KAAK,UAAU,UAAU;IAC9B,IAAI,EAAE,CAAC,UAAU,CAAC,mBAAO,CAAC,EAAE,CAAC;QAC3B,MAAM,IAAA,gBAAK,EAAC,mBAAO,EAAE,KAAK,CAAC,CAAA;IAC7B,CAAC;IACD,OAAO,mBAAO,CAAA;AAChB,CAAC;AAEM,KAAK,UAAU,SAAS;IAC7B,MAAM,IAAA,gBAAK,EAAC,kBAAM,EAAE,KAAK,CAAC,CAAA;IAC1B,OAAO,kBAAM,CAAA;AACf,CAAC","sourcesContent":["import { path7x, path7za } from \"7zip-bin\"\nimport { chmod } from \"fs-extra\"\nimport * as fs from \"fs\"\n\nexport async function getPath7za(): Promise<string> {\n if (fs.existsSync(path7za)) {\n await chmod(path7za, 0o755)\n }\n return path7za\n}\n\nexport async function getPath7x(): Promise<string> {\n await chmod(path7x, 0o755)\n return path7x\n}\n"]}


@@ -0,0 +1,7 @@
export declare class DebugLogger {
readonly isEnabled: boolean;
readonly data: any;
constructor(isEnabled?: boolean);
add(key: string, value: any): void;
save(file: string): Promise<void>;
}


@@ -0,0 +1,51 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.DebugLogger = void 0;
const fs_extra_1 = require("fs-extra");
const util_1 = require("./util");
class DebugLogger {
constructor(isEnabled = true) {
this.isEnabled = isEnabled;
this.data = {};
}
add(key, value) {
if (!this.isEnabled) {
return;
}
const dataPath = key.split(".");
let o = this.data;
let lastName = null;
for (const p of dataPath) {
if (p === dataPath[dataPath.length - 1]) {
lastName = p;
break;
}
else {
if (o[p] == null) {
o[p] = Object.create(null);
}
else if (typeof o[p] === "string") {
o[p] = [o[p]];
}
o = o[p];
}
}
if (Array.isArray(o[lastName])) {
o[lastName] = [...o[lastName], value];
}
else {
o[lastName] = value;
}
}
save(file) {
        // toml and json don't correctly output multiline strings as multiline
if (this.isEnabled && Object.keys(this.data).length > 0) {
return (0, fs_extra_1.outputFile)(file, (0, util_1.serializeToYaml)(this.data));
}
else {
return Promise.resolve();
}
}
}
exports.DebugLogger = DebugLogger;
//# sourceMappingURL=DebugLogger.js.map
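
A sketch of how the logger accumulates dot-separated keys and serializes them on save; the require path mirrors the compiled file above and the output path is illustrative:

```js
const { DebugLogger } = require('builder-util/out/DebugLogger')

const logger = new DebugLogger(true)
logger.add('build.targets', ['nsis'])     // stored as { build: { targets: ['nsis'] } }
logger.add('build.targets', 'AppImage')   // appended, because the existing value is an array
logger.add('sign.identity', 'Developer ID')

// Writes the collected data as YAML; resolves immediately when disabled or empty
logger.save('/tmp/electron-builder-debug.yml').catch(console.error)
```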


@@ -0,0 +1 @@
{"version":3,"file":"DebugLogger.js","sourceRoot":"","sources":["../src/DebugLogger.ts"],"names":[],"mappings":";;;AAAA,uCAAqC;AACrC,iCAAwC;AAExC,MAAa,WAAW;IAGtB,YAAqB,YAAY,IAAI;QAAhB,cAAS,GAAT,SAAS,CAAO;QAF5B,SAAI,GAAQ,EAAE,CAAA;IAEiB,CAAC;IAEzC,GAAG,CAAC,GAAW,EAAE,KAAU;QACzB,IAAI,CAAC,IAAI,CAAC,SAAS,EAAE,CAAC;YACpB,OAAM;QACR,CAAC;QAED,MAAM,QAAQ,GAAG,GAAG,CAAC,KAAK,CAAC,GAAG,CAAC,CAAA;QAC/B,IAAI,CAAC,GAAG,IAAI,CAAC,IAAI,CAAA;QACjB,IAAI,QAAQ,GAAkB,IAAI,CAAA;QAClC,KAAK,MAAM,CAAC,IAAI,QAAQ,EAAE,CAAC;YACzB,IAAI,CAAC,KAAK,QAAQ,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,EAAE,CAAC;gBACxC,QAAQ,GAAG,CAAC,CAAA;gBACZ,MAAK;YACP,CAAC;iBAAM,CAAC;gBACN,IAAI,CAAC,CAAC,CAAC,CAAC,IAAI,IAAI,EAAE,CAAC;oBACjB,CAAC,CAAC,CAAC,CAAC,GAAG,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAA;gBAC5B,CAAC;qBAAM,IAAI,OAAO,CAAC,CAAC,CAAC,CAAC,KAAK,QAAQ,EAAE,CAAC;oBACpC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAA;gBACf,CAAC;gBACD,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAA;YACV,CAAC;QACH,CAAC;QAED,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,QAAS,CAAC,CAAC,EAAE,CAAC;YAChC,CAAC,CAAC,QAAS,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC,QAAS,CAAC,EAAE,KAAK,CAAC,CAAA;QACzC,CAAC;aAAM,CAAC;YACN,CAAC,CAAC,QAAS,CAAC,GAAG,KAAK,CAAA;QACtB,CAAC;IACH,CAAC;IAED,IAAI,CAAC,IAAY;QACf,uEAAuE;QACvE,IAAI,IAAI,CAAC,SAAS,IAAI,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;YACxD,OAAO,IAAA,qBAAU,EAAC,IAAI,EAAE,IAAA,sBAAe,EAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAA;QACrD,CAAC;aAAM,CAAC;YACN,OAAO,OAAO,CAAC,OAAO,EAAE,CAAA;QAC1B,CAAC;IACH,CAAC;CACF;AA1CD,kCA0CC","sourcesContent":["import { outputFile } from \"fs-extra\"\nimport { serializeToYaml } from \"./util\"\n\nexport class DebugLogger {\n readonly data: any = {}\n\n constructor(readonly isEnabled = true) {}\n\n add(key: string, value: any) {\n if (!this.isEnabled) {\n return\n }\n\n const dataPath = key.split(\".\")\n let o = this.data\n let lastName: string | null = null\n for (const p of dataPath) {\n if (p === dataPath[dataPath.length - 1]) {\n lastName = p\n break\n } else {\n if (o[p] == null) {\n o[p] = Object.create(null)\n } else if (typeof o[p] === \"string\") {\n o[p] = [o[p]]\n }\n o = o[p]\n }\n }\n\n if (Array.isArray(o[lastName!])) {\n o[lastName!] = [...o[lastName!], value]\n } else {\n o[lastName!] = value\n }\n }\n\n save(file: string) {\n // toml and json doesn't correctly output multiline string as multiline\n if (this.isEnabled && Object.keys(this.data).length > 0) {\n return outputFile(file, serializeToYaml(this.data))\n } else {\n return Promise.resolve()\n }\n }\n}\n"]}


@@ -0,0 +1,14 @@
export declare enum Arch {
ia32 = 0,
x64 = 1,
armv7l = 2,
arm64 = 3,
universal = 4
}
export type ArchType = "x64" | "ia32" | "armv7l" | "arm64" | "universal";
export declare function toLinuxArchString(arch: Arch, targetName: string): string;
export declare function getArchCliNames(): Array<string>;
export declare function getArchSuffix(arch: Arch, defaultArch?: string): string;
export declare function archFromString(name: string): Arch;
export declare function defaultArchFromString(name?: string): Arch;
export declare function getArtifactArchName(arch: Arch, ext: string): string;

92
desktop-operator/node_modules/builder-util/out/arch.js generated vendored Normal file

@@ -0,0 +1,92 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Arch = void 0;
exports.toLinuxArchString = toLinuxArchString;
exports.getArchCliNames = getArchCliNames;
exports.getArchSuffix = getArchSuffix;
exports.archFromString = archFromString;
exports.defaultArchFromString = defaultArchFromString;
exports.getArtifactArchName = getArtifactArchName;
var Arch;
(function (Arch) {
Arch[Arch["ia32"] = 0] = "ia32";
Arch[Arch["x64"] = 1] = "x64";
Arch[Arch["armv7l"] = 2] = "armv7l";
Arch[Arch["arm64"] = 3] = "arm64";
Arch[Arch["universal"] = 4] = "universal";
})(Arch || (exports.Arch = Arch = {}));
function toLinuxArchString(arch, targetName) {
switch (arch) {
case Arch.x64:
return targetName === "flatpak" ? "x86_64" : "amd64";
case Arch.ia32:
return targetName === "pacman" ? "i686" : "i386";
case Arch.armv7l:
return targetName === "snap" || targetName === "deb" ? "armhf" : targetName === "flatpak" ? "arm" : "armv7l";
case Arch.arm64:
return targetName === "pacman" || targetName === "rpm" || targetName === "flatpak" ? "aarch64" : "arm64";
default:
throw new Error(`Unsupported arch ${arch}`);
}
}
function getArchCliNames() {
return [Arch[Arch.ia32], Arch[Arch.x64], Arch[Arch.armv7l], Arch[Arch.arm64]];
}
function getArchSuffix(arch, defaultArch) {
return arch === defaultArchFromString(defaultArch) ? "" : `-${Arch[arch]}`;
}
function archFromString(name) {
switch (name) {
case "x64":
return Arch.x64;
case "ia32":
return Arch.ia32;
case "arm64":
return Arch.arm64;
case "arm":
case "armv7l":
return Arch.armv7l;
case "universal":
return Arch.universal;
default:
throw new Error(`Unsupported arch ${name}`);
}
}
function defaultArchFromString(name) {
return name ? archFromString(name) : Arch.x64;
}
function getArtifactArchName(arch, ext) {
let archName = Arch[arch];
const isAppImage = ext === "AppImage" || ext === "appimage";
if (arch === Arch.x64) {
if (isAppImage || ext === "rpm" || ext === "flatpak") {
archName = "x86_64";
}
else if (ext === "deb" || ext === "snap") {
archName = "amd64";
}
}
else if (arch === Arch.ia32) {
if (ext === "deb" || isAppImage || ext === "snap" || ext === "flatpak") {
archName = "i386";
}
else if (ext === "pacman" || ext === "rpm") {
archName = "i686";
}
}
else if (arch === Arch.armv7l) {
if (ext === "snap") {
archName = "armhf";
}
else if (ext === "flatpak") {
archName = "arm";
}
}
else if (arch === Arch.arm64) {
if (ext === "pacman" || ext === "rpm" || ext === "flatpak") {
archName = "aarch64";
}
}
return archName;
}
//# sourceMappingURL=arch.js.map
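
The mapping is easiest to see with a few concrete calls; the return values in the comments follow directly from the code above:

```js
const { Arch, archFromString, getArchSuffix, getArtifactArchName, toLinuxArchString } = require('builder-util/out/arch')

archFromString('arm')                      // Arch.armv7l ("arm" is accepted as an alias)
getArchSuffix(Arch.x64)                    // ""       (x64 is the default arch, so no suffix)
getArchSuffix(Arch.arm64)                  // "-arm64"
getArtifactArchName(Arch.x64, 'deb')       // "amd64"
getArtifactArchName(Arch.x64, 'AppImage')  // "x86_64"
getArtifactArchName(Arch.ia32, 'rpm')      // "i686"
toLinuxArchString(Arch.arm64, 'rpm')       // "aarch64"
```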

File diff suppressed because one or more lines are too long


@@ -0,0 +1,11 @@
import { CancellationToken } from "builder-util-runtime";
export declare class AsyncTaskManager {
private readonly cancellationToken;
readonly tasks: Array<Promise<any>>;
private readonly errors;
constructor(cancellationToken: CancellationToken);
add(task: () => Promise<any>): void;
addTask(promise: Promise<any>): void;
cancelTasks(): void;
awaitTasks(): Promise<Array<any>>;
}


@@ -0,0 +1,86 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.AsyncTaskManager = void 0;
const log_1 = require("./log");
const promise_1 = require("./promise");
class AsyncTaskManager {
constructor(cancellationToken) {
this.cancellationToken = cancellationToken;
this.tasks = [];
this.errors = [];
}
add(task) {
if (this.cancellationToken == null || !this.cancellationToken.cancelled) {
this.addTask(task());
}
}
addTask(promise) {
if (this.cancellationToken.cancelled) {
log_1.log.debug({ reason: "cancelled", stack: new Error().stack }, "async task not added");
if ("cancel" in promise) {
;
promise.cancel();
}
return;
}
this.tasks.push(promise.catch(it => {
log_1.log.debug({ error: it.message || it.toString() }, "async task error");
this.errors.push(it);
return Promise.resolve(null);
}));
}
cancelTasks() {
for (const task of this.tasks) {
if ("cancel" in task) {
;
task.cancel();
}
}
this.tasks.length = 0;
}
async awaitTasks() {
if (this.cancellationToken.cancelled) {
this.cancelTasks();
return [];
}
const checkErrors = () => {
if (this.errors.length > 0) {
this.cancelTasks();
throwError(this.errors);
return;
}
};
checkErrors();
let result = null;
const tasks = this.tasks;
let list = tasks.slice();
tasks.length = 0;
while (list.length > 0) {
const subResult = await Promise.all(list);
result = result == null ? subResult : result.concat(subResult);
checkErrors();
if (tasks.length === 0) {
break;
}
else {
if (this.cancellationToken.cancelled) {
this.cancelTasks();
return [];
}
list = tasks.slice();
tasks.length = 0;
}
}
return result || [];
}
}
exports.AsyncTaskManager = AsyncTaskManager;
function throwError(errors) {
if (errors.length === 1) {
throw errors[0];
}
else if (errors.length > 1) {
throw new promise_1.NestedError(errors, "Cannot cleanup: ");
}
}
//# sourceMappingURL=asyncTaskManager.js.map
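
A usage sketch; `CancellationToken` comes from `builder-util-runtime` as declared in the typings above, and the two tasks are placeholders:

```js
const { CancellationToken } = require('builder-util-runtime')
const { AsyncTaskManager } = require('builder-util/out/asyncTaskManager')

const manager = new AsyncTaskManager(new CancellationToken())

// Tasks are created lazily and skipped entirely once cancellation has happened
manager.add(() => Promise.resolve('packaged'))
manager.add(() => Promise.resolve('signed'))

manager.awaitTasks()
  .then(results => console.log(results)) // e.g. [ 'packaged', 'signed' ]
  .catch(console.error)                  // collected task errors are rethrown (as NestedError if several)
```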


@@ -0,0 +1 @@
{"version":3,"file":"asyncTaskManager.js","sourceRoot":"","sources":["../src/asyncTaskManager.ts"],"names":[],"mappings":";;;AACA,+BAA2B;AAC3B,uCAAuC;AAEvC,MAAa,gBAAgB;IAI3B,YAA6B,iBAAoC;QAApC,sBAAiB,GAAjB,iBAAiB,CAAmB;QAHxD,UAAK,GAAwB,EAAE,CAAA;QACvB,WAAM,GAAiB,EAAE,CAAA;IAE0B,CAAC;IAErE,GAAG,CAAC,IAAwB;QAC1B,IAAI,IAAI,CAAC,iBAAiB,IAAI,IAAI,IAAI,CAAC,IAAI,CAAC,iBAAiB,CAAC,SAAS,EAAE,CAAC;YACxE,IAAI,CAAC,OAAO,CAAC,IAAI,EAAE,CAAC,CAAA;QACtB,CAAC;IACH,CAAC;IAED,OAAO,CAAC,OAAqB;QAC3B,IAAI,IAAI,CAAC,iBAAiB,CAAC,SAAS,EAAE,CAAC;YACrC,SAAG,CAAC,KAAK,CAAC,EAAE,MAAM,EAAE,WAAW,EAAE,KAAK,EAAE,IAAI,KAAK,EAAE,CAAC,KAAK,EAAE,EAAE,sBAAsB,CAAC,CAAA;YACpF,IAAI,QAAQ,IAAI,OAAO,EAAE,CAAC;gBACxB,CAAC;gBAAC,OAAe,CAAC,MAAM,EAAE,CAAA;YAC5B,CAAC;YACD,OAAM;QACR,CAAC;QAED,IAAI,CAAC,KAAK,CAAC,IAAI,CACb,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC,EAAE;YACjB,SAAG,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,EAAE,CAAC,OAAO,IAAI,EAAE,CAAC,QAAQ,EAAE,EAAE,EAAE,kBAAkB,CAAC,CAAA;YACrE,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE,CAAC,CAAA;YACpB,OAAO,OAAO,CAAC,OAAO,CAAC,IAAI,CAAC,CAAA;QAC9B,CAAC,CAAC,CACH,CAAA;IACH,CAAC;IAED,WAAW;QACT,KAAK,MAAM,IAAI,IAAI,IAAI,CAAC,KAAK,EAAE,CAAC;YAC9B,IAAI,QAAQ,IAAI,IAAI,EAAE,CAAC;gBACrB,CAAC;gBAAC,IAAY,CAAC,MAAM,EAAE,CAAA;YACzB,CAAC;QACH,CAAC;QACD,IAAI,CAAC,KAAK,CAAC,MAAM,GAAG,CAAC,CAAA;IACvB,CAAC;IAED,KAAK,CAAC,UAAU;QACd,IAAI,IAAI,CAAC,iBAAiB,CAAC,SAAS,EAAE,CAAC;YACrC,IAAI,CAAC,WAAW,EAAE,CAAA;YAClB,OAAO,EAAE,CAAA;QACX,CAAC;QAED,MAAM,WAAW,GAAG,GAAG,EAAE;YACvB,IAAI,IAAI,CAAC,MAAM,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBAC3B,IAAI,CAAC,WAAW,EAAE,CAAA;gBAClB,UAAU,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;gBACvB,OAAM;YACR,CAAC;QACH,CAAC,CAAA;QAED,WAAW,EAAE,CAAA;QAEb,IAAI,MAAM,GAAsB,IAAI,CAAA;QACpC,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,CAAA;QACxB,IAAI,IAAI,GAAG,KAAK,CAAC,KAAK,EAAE,CAAA;QACxB,KAAK,CAAC,MAAM,GAAG,CAAC,CAAA;QAChB,OAAO,IAAI,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;YACvB,MAAM,SAAS,GAAG,MAAM,OAAO,CAAC,GAAG,CAAC,IAAI,CAAC,CAAA;YACzC,MAAM,GAAG,MAAM,IAAI,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,SAAS,CAAC,CAAA;YAC9D,WAAW,EAAE,CAAA;YACb,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;gBACvB,MAAK;YACP,CAAC;iBAAM,CAAC;gBACN,IAAI,IAAI,CAAC,iBAAiB,CAAC,SAAS,EAAE,CAAC;oBACrC,IAAI,CAAC,WAAW,EAAE,CAAA;oBAClB,OAAO,EAAE,CAAA;gBACX,CAAC;gBAED,IAAI,GAAG,KAAK,CAAC,KAAK,EAAE,CAAA;gBACpB,KAAK,CAAC,MAAM,GAAG,CAAC,CAAA;YAClB,CAAC;QACH,CAAC;QACD,OAAO,MAAM,IAAI,EAAE,CAAA;IACrB,CAAC;CACF;AA7ED,4CA6EC;AAED,SAAS,UAAU,CAAC,MAAoB;IACtC,IAAI,MAAM,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;QACxB,MAAM,MAAM,CAAC,CAAC,CAAC,CAAA;IACjB,CAAC;SAAM,IAAI,MAAM,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;QAC7B,MAAM,IAAI,qBAAW,CAAC,MAAM,EAAE,kBAAkB,CAAC,CAAA;IACnD,CAAC;AACH,CAAC","sourcesContent":["import { CancellationToken } from \"builder-util-runtime\"\nimport { log } from \"./log\"\nimport { NestedError } from \"./promise\"\n\nexport class AsyncTaskManager {\n readonly tasks: Array<Promise<any>> = []\n private readonly errors: Array<Error> = []\n\n constructor(private readonly cancellationToken: CancellationToken) {}\n\n add(task: () => Promise<any>) {\n if (this.cancellationToken == null || !this.cancellationToken.cancelled) {\n this.addTask(task())\n }\n }\n\n addTask(promise: Promise<any>) {\n if (this.cancellationToken.cancelled) {\n log.debug({ reason: \"cancelled\", stack: new Error().stack }, \"async task not added\")\n if (\"cancel\" in promise) {\n ;(promise as any).cancel()\n }\n return\n }\n\n this.tasks.push(\n promise.catch(it => {\n log.debug({ error: it.message || it.toString() }, \"async task error\")\n this.errors.push(it)\n return Promise.resolve(null)\n })\n )\n }\n\n cancelTasks() {\n for (const 
task of this.tasks) {\n if (\"cancel\" in task) {\n ;(task as any).cancel()\n }\n }\n this.tasks.length = 0\n }\n\n async awaitTasks(): Promise<Array<any>> {\n if (this.cancellationToken.cancelled) {\n this.cancelTasks()\n return []\n }\n\n const checkErrors = () => {\n if (this.errors.length > 0) {\n this.cancelTasks()\n throwError(this.errors)\n return\n }\n }\n\n checkErrors()\n\n let result: Array<any> | null = null\n const tasks = this.tasks\n let list = tasks.slice()\n tasks.length = 0\n while (list.length > 0) {\n const subResult = await Promise.all(list)\n result = result == null ? subResult : result.concat(subResult)\n checkErrors()\n if (tasks.length === 0) {\n break\n } else {\n if (this.cancellationToken.cancelled) {\n this.cancelTasks()\n return []\n }\n\n list = tasks.slice()\n tasks.length = 0\n }\n }\n return result || []\n }\n}\n\nfunction throwError(errors: Array<Error>) {\n if (errors.length === 1) {\n throw errors[0]\n } else if (errors.length > 1) {\n throw new NestedError(errors, \"Cannot cleanup: \")\n }\n}\n"]}


@@ -0,0 +1 @@
export declare function deepAssign<T>(target: T, ...objects: Array<any>): T;


@@ -0,0 +1,47 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.deepAssign = deepAssign;
function isObject(x) {
if (Array.isArray(x)) {
return false;
}
const type = typeof x;
return type === "object" || type === "function";
}
function assignKey(target, from, key) {
const value = from[key];
// https://github.com/electron-userland/electron-builder/pull/562
if (value === undefined) {
return;
}
const prevValue = target[key];
if (prevValue == null || value == null || !isObject(prevValue) || !isObject(value)) {
// Merge arrays.
if (Array.isArray(prevValue) && Array.isArray(value)) {
target[key] = Array.from(new Set(prevValue.concat(value)));
}
else {
target[key] = value;
}
}
else {
target[key] = assign(prevValue, value);
}
}
function assign(to, from) {
if (to !== from) {
for (const key of Object.getOwnPropertyNames(from)) {
assignKey(to, from, key);
}
}
return to;
}
function deepAssign(target, ...objects) {
for (const o of objects) {
if (o != null) {
assign(target, o);
}
}
return target;
}
//# sourceMappingURL=deepAssign.js.map
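
A concrete example of the merge rules above: nested objects are merged recursively, arrays are concatenated and de-duplicated, and `undefined` values in the source are ignored:

```js
const { deepAssign } = require('builder-util/out/deepAssign')

const base = { mac: { target: ['dmg'], hardenedRuntime: true }, files: ['dist'] }
const override = { mac: { target: ['zip', 'dmg'] }, files: undefined, appId: 'com.example.app' }

deepAssign(base, override)
// => {
//   mac: { target: ['dmg', 'zip'], hardenedRuntime: true }, // arrays merged via a Set, previous values first
//   files: ['dist'],                                        // undefined in the override is skipped
//   appId: 'com.example.app'
// }
```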


@@ -0,0 +1 @@
{"version":3,"file":"deepAssign.js","sourceRoot":"","sources":["../src/deepAssign.ts"],"names":[],"mappings":";;AAsCA,gCAOC;AA7CD,SAAS,QAAQ,CAAC,CAAM;IACtB,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE,CAAC;QACrB,OAAO,KAAK,CAAA;IACd,CAAC;IAED,MAAM,IAAI,GAAG,OAAO,CAAC,CAAA;IACrB,OAAO,IAAI,KAAK,QAAQ,IAAI,IAAI,KAAK,UAAU,CAAA;AACjD,CAAC;AAED,SAAS,SAAS,CAAC,MAAW,EAAE,IAAS,EAAE,GAAW;IACpD,MAAM,KAAK,GAAG,IAAI,CAAC,GAAG,CAAC,CAAA;IACvB,iEAAiE;IACjE,IAAI,KAAK,KAAK,SAAS,EAAE,CAAC;QACxB,OAAM;IACR,CAAC;IAED,MAAM,SAAS,GAAG,MAAM,CAAC,GAAG,CAAC,CAAA;IAC7B,IAAI,SAAS,IAAI,IAAI,IAAI,KAAK,IAAI,IAAI,IAAI,CAAC,QAAQ,CAAC,SAAS,CAAC,IAAI,CAAC,QAAQ,CAAC,KAAK,CAAC,EAAE,CAAC;QACnF,gBAAgB;QAChB,IAAI,KAAK,CAAC,OAAO,CAAC,SAAS,CAAC,IAAI,KAAK,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC;YACrD,MAAM,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC,IAAI,CAAC,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC,CAAA;QAC5D,CAAC;aAAM,CAAC;YACN,MAAM,CAAC,GAAG,CAAC,GAAG,KAAK,CAAA;QACrB,CAAC;IACH,CAAC;SAAM,CAAC;QACN,MAAM,CAAC,GAAG,CAAC,GAAG,MAAM,CAAC,SAAS,EAAE,KAAK,CAAC,CAAA;IACxC,CAAC;AACH,CAAC;AAED,SAAS,MAAM,CAAC,EAAO,EAAE,IAAS;IAChC,IAAI,EAAE,KAAK,IAAI,EAAE,CAAC;QAChB,KAAK,MAAM,GAAG,IAAI,MAAM,CAAC,mBAAmB,CAAC,IAAI,CAAC,EAAE,CAAC;YACnD,SAAS,CAAC,EAAE,EAAE,IAAI,EAAE,GAAG,CAAC,CAAA;QAC1B,CAAC;IACH,CAAC;IACD,OAAO,EAAE,CAAA;AACX,CAAC;AAED,SAAgB,UAAU,CAAI,MAAS,EAAE,GAAG,OAAmB;IAC7D,KAAK,MAAM,CAAC,IAAI,OAAO,EAAE,CAAC;QACxB,IAAI,CAAC,IAAI,IAAI,EAAE,CAAC;YACd,MAAM,CAAC,MAAM,EAAE,CAAC,CAAC,CAAA;QACnB,CAAC;IACH,CAAC;IACD,OAAO,MAAM,CAAA;AACf,CAAC","sourcesContent":["function isObject(x: any) {\n if (Array.isArray(x)) {\n return false\n }\n\n const type = typeof x\n return type === \"object\" || type === \"function\"\n}\n\nfunction assignKey(target: any, from: any, key: string) {\n const value = from[key]\n // https://github.com/electron-userland/electron-builder/pull/562\n if (value === undefined) {\n return\n }\n\n const prevValue = target[key]\n if (prevValue == null || value == null || !isObject(prevValue) || !isObject(value)) {\n // Merge arrays.\n if (Array.isArray(prevValue) && Array.isArray(value)) {\n target[key] = Array.from(new Set(prevValue.concat(value)))\n } else {\n target[key] = value\n }\n } else {\n target[key] = assign(prevValue, value)\n }\n}\n\nfunction assign(to: any, from: any) {\n if (to !== from) {\n for (const key of Object.getOwnPropertyNames(from)) {\n assignKey(to, from, key)\n }\n }\n return to\n}\n\nexport function deepAssign<T>(target: T, ...objects: Array<any>): T {\n for (const o of objects) {\n if (o != null) {\n assign(target, o)\n }\n }\n return target\n}\n"]}

58
desktop-operator/node_modules/builder-util/out/fs.d.ts generated vendored Normal file

@@ -0,0 +1,58 @@
import { Stats } from "fs";
export declare const MAX_FILE_REQUESTS = 8;
export declare const CONCURRENCY: {
concurrency: number;
};
export type AfterCopyFileTransformer = (file: string) => Promise<boolean>;
export declare class CopyFileTransformer {
readonly afterCopyTransformer: AfterCopyFileTransformer;
constructor(afterCopyTransformer: AfterCopyFileTransformer);
}
export type FileTransformer = (file: string) => Promise<null | string | Buffer | CopyFileTransformer> | null | string | Buffer | CopyFileTransformer;
export type Filter = (file: string, stat: Stats) => boolean;
export declare function unlinkIfExists(file: string): Promise<void>;
export declare function statOrNull(file: string): Promise<Stats | null>;
export declare function exists(file: string): Promise<boolean>;
export interface FileConsumer {
consume(file: string, fileStat: Stats, parent: string, siblingNames: Array<string>): any;
/**
* @default false
*/
isIncludeDir?: boolean;
}
/**
 * Returns a list of file paths (using the system-dependent file separator)
*/
export declare function walk(initialDirPath: string, filter?: Filter | null, consumer?: FileConsumer): Promise<Array<string>>;
export declare function copyFile(src: string, dest: string, isEnsureDir?: boolean): Promise<any>;
/**
 * Hard links are used if supported and allowed.
 * File permissions are fixed: execute is allowed for all if the owner can execute, and read is allowed for all if the owner can read.
 *
 * ensureDir is not called; the dest parent dir must already exist.
*/
export declare function copyOrLinkFile(src: string, dest: string, stats?: Stats | null, isUseHardLink?: boolean, exDevErrorHandler?: (() => boolean) | null): Promise<any>;
export declare class FileCopier {
private readonly isUseHardLinkFunction?;
private readonly transformer?;
isUseHardLink: boolean;
constructor(isUseHardLinkFunction?: (((file: string) => boolean) | null) | undefined, transformer?: (FileTransformer | null) | undefined);
copy(src: string, dest: string, stat: Stats | undefined): Promise<void>;
}
export interface CopyDirOptions {
filter?: Filter | null;
transformer?: FileTransformer | null;
isUseHardLink?: ((file: string) => boolean) | null;
}
/**
 * Empty directories are never created.
 * Hard links are used if supported and allowed.
*/
export declare function copyDir(src: string, destination: string, options?: CopyDirOptions): Promise<any>;
export declare function dirSize(dirPath: string): Promise<number>;
export declare const DO_NOT_USE_HARD_LINKS: (file: string) => boolean;
export declare const USE_HARD_LINKS: (file: string) => boolean;
export interface Link {
readonly link: string;
readonly file: string;
}

287
desktop-operator/node_modules/builder-util/out/fs.js generated vendored Normal file

@@ -0,0 +1,287 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.USE_HARD_LINKS = exports.DO_NOT_USE_HARD_LINKS = exports.FileCopier = exports.CopyFileTransformer = exports.CONCURRENCY = exports.MAX_FILE_REQUESTS = void 0;
exports.unlinkIfExists = unlinkIfExists;
exports.statOrNull = statOrNull;
exports.exists = exists;
exports.walk = walk;
exports.copyFile = copyFile;
exports.copyOrLinkFile = copyOrLinkFile;
exports.copyDir = copyDir;
exports.dirSize = dirSize;
const bluebird_lst_1 = require("bluebird-lst");
const fs_extra_1 = require("fs-extra");
const os_1 = require("os");
const promises_1 = require("fs/promises");
const path = require("path");
const stat_mode_1 = require("stat-mode");
const log_1 = require("./log");
const promise_1 = require("./promise");
const isCI = require("is-ci");
exports.MAX_FILE_REQUESTS = 8;
exports.CONCURRENCY = { concurrency: exports.MAX_FILE_REQUESTS };
class CopyFileTransformer {
constructor(afterCopyTransformer) {
this.afterCopyTransformer = afterCopyTransformer;
}
}
exports.CopyFileTransformer = CopyFileTransformer;
function unlinkIfExists(file) {
return (0, promises_1.unlink)(file).catch(() => {
/* ignore */
});
}
async function statOrNull(file) {
return (0, promise_1.orNullIfFileNotExist)((0, promises_1.stat)(file));
}
async function exists(file) {
try {
await (0, promises_1.access)(file);
return true;
}
catch (_e) {
return false;
}
}
/**
 * Returns a list of file paths (using the system-dependent file separator)
*/
async function walk(initialDirPath, filter, consumer) {
let result = [];
const queue = [initialDirPath];
let addDirToResult = false;
const isIncludeDir = consumer == null ? false : consumer.isIncludeDir === true;
while (queue.length > 0) {
const dirPath = queue.pop();
if (isIncludeDir) {
if (addDirToResult) {
result.push(dirPath);
}
else {
addDirToResult = true;
}
}
const childNames = await (0, promise_1.orIfFileNotExist)((0, promises_1.readdir)(dirPath), []);
childNames.sort();
let nodeModuleContent = null;
const dirs = [];
        // our handler is async, but files must be added in sorted order, so we add them to the result after the map instead of inside the mapper
const sortedFilePaths = await bluebird_lst_1.default.map(childNames, name => {
if (name === ".DS_Store" || name === ".gitkeep") {
return null;
}
const filePath = dirPath + path.sep + name;
return (0, promises_1.lstat)(filePath).then(stat => {
if (filter != null && !filter(filePath, stat)) {
return null;
}
const consumerResult = consumer == null ? null : consumer.consume(filePath, stat, dirPath, childNames);
if (consumerResult === false) {
return null;
}
else if (consumerResult == null || !("then" in consumerResult)) {
if (stat.isDirectory()) {
dirs.push(name);
return null;
}
else {
return filePath;
}
}
else {
return consumerResult.then((it) => {
if (it != null && Array.isArray(it)) {
nodeModuleContent = it;
return null;
}
// asarUtil can return modified stat (symlink handling)
if ((it != null && "isDirectory" in it ? it : stat).isDirectory()) {
dirs.push(name);
return null;
}
else {
return filePath;
}
});
}
});
}, exports.CONCURRENCY);
for (const child of sortedFilePaths) {
if (child != null) {
result.push(child);
}
}
dirs.sort();
for (const child of dirs) {
queue.push(dirPath + path.sep + child);
}
if (nodeModuleContent != null) {
result = result.concat(nodeModuleContent);
}
}
return result;
}
const _isUseHardLink = process.platform !== "win32" && process.env.USE_HARD_LINKS !== "false" && (isCI || process.env.USE_HARD_LINKS === "true");
function copyFile(src, dest, isEnsureDir = true) {
return (isEnsureDir ? (0, promises_1.mkdir)(path.dirname(dest), { recursive: true }) : Promise.resolve()).then(() => copyOrLinkFile(src, dest, null, false));
}
/**
 * Hard links are used if supported and allowed.
 * File permissions are fixed: execute is allowed for all if the owner can execute, and read is allowed for all if the owner can read.
 *
 * ensureDir is not called; the dest parent dir must already exist.
*/
function copyOrLinkFile(src, dest, stats, isUseHardLink, exDevErrorHandler) {
if (isUseHardLink === undefined) {
isUseHardLink = _isUseHardLink;
}
if (stats != null) {
const originalModeNumber = stats.mode;
const mode = new stat_mode_1.Mode(stats);
if (mode.owner.execute) {
mode.group.execute = true;
mode.others.execute = true;
}
mode.group.read = true;
mode.others.read = true;
mode.setuid = false;
mode.setgid = false;
if (originalModeNumber !== stats.mode) {
if (log_1.log.isDebugEnabled) {
const oldMode = new stat_mode_1.Mode({ mode: originalModeNumber });
log_1.log.debug({ file: dest, oldMode, mode }, "permissions fixed from");
}
// https://helgeklein.com/blog/2009/05/hard-links-and-permissions-acls/
// Permissions on all hard links to the same data on disk are always identical. The same applies to attributes.
// That means if you change the permissions/owner/attributes on one hard link, you will immediately see the changes on all other hard links.
if (isUseHardLink) {
isUseHardLink = false;
log_1.log.debug({ dest }, "copied, but not linked, because file permissions need to be fixed");
}
}
}
if (isUseHardLink) {
return (0, promises_1.link)(src, dest).catch((e) => {
if (e.code === "EXDEV") {
const isLog = exDevErrorHandler == null ? true : exDevErrorHandler();
if (isLog && log_1.log.isDebugEnabled) {
log_1.log.debug({ error: e.message }, "cannot copy using hard link");
}
return doCopyFile(src, dest, stats);
}
else {
throw e;
}
});
}
return doCopyFile(src, dest, stats);
}
function doCopyFile(src, dest, stats) {
const promise = (0, fs_extra_1.copyFile)(src, dest);
if (stats == null) {
return promise;
}
return promise.then(() => (0, promises_1.chmod)(dest, stats.mode));
}
class FileCopier {
constructor(isUseHardLinkFunction, transformer) {
this.isUseHardLinkFunction = isUseHardLinkFunction;
this.transformer = transformer;
if (isUseHardLinkFunction === exports.USE_HARD_LINKS) {
this.isUseHardLink = true;
}
else {
this.isUseHardLink = _isUseHardLink && isUseHardLinkFunction !== exports.DO_NOT_USE_HARD_LINKS;
}
}
async copy(src, dest, stat) {
let afterCopyTransformer = null;
if (this.transformer != null && stat != null && stat.isFile()) {
let data = this.transformer(src);
if (data != null) {
if (typeof data === "object" && "then" in data) {
data = await data;
}
if (data != null) {
if (data instanceof CopyFileTransformer) {
afterCopyTransformer = data.afterCopyTransformer;
}
else {
await (0, promises_1.writeFile)(dest, data);
return;
}
}
}
}
const isUseHardLink = afterCopyTransformer == null && (!this.isUseHardLink || this.isUseHardLinkFunction == null ? this.isUseHardLink : this.isUseHardLinkFunction(dest));
await copyOrLinkFile(src, dest, stat, isUseHardLink, isUseHardLink
? () => {
            // files are copied concurrently, so we must not re-check this.isUseHardLink here: this handler can run after another one has already set it to false
if (this.isUseHardLink) {
this.isUseHardLink = false;
return true;
}
else {
return false;
}
}
: null);
if (afterCopyTransformer != null) {
await afterCopyTransformer(dest);
}
}
}
exports.FileCopier = FileCopier;
/**
 * Empty directories are never created.
 * Hard links are used if supported and allowed.
*/
function copyDir(src, destination, options = {}) {
const fileCopier = new FileCopier(options.isUseHardLink, options.transformer);
if (log_1.log.isDebugEnabled) {
log_1.log.debug({ src, destination }, `copying${fileCopier.isUseHardLink ? " using hard links" : ""}`);
}
const createdSourceDirs = new Set();
const links = [];
const symlinkType = (0, os_1.platform)() === "win32" ? "junction" : "file";
return walk(src, options.filter, {
consume: async (file, stat, parent) => {
if (!stat.isFile() && !stat.isSymbolicLink()) {
return;
}
if (!createdSourceDirs.has(parent)) {
await (0, promises_1.mkdir)(parent.replace(src, destination), { recursive: true });
createdSourceDirs.add(parent);
}
const destFile = file.replace(src, destination);
if (stat.isFile()) {
await fileCopier.copy(file, destFile, stat);
}
else {
links.push({ file: destFile, link: await (0, promises_1.readlink)(file) });
}
},
}).then(() => bluebird_lst_1.default.map(links, it => (0, promises_1.symlink)(it.link, it.file, symlinkType), exports.CONCURRENCY));
}
async function dirSize(dirPath) {
const entries = await (0, promises_1.readdir)(dirPath, { withFileTypes: true });
const entrySizes = entries.map(async (entry) => {
const entryPath = path.join(dirPath, entry.name);
if (entry.isDirectory()) {
return await dirSize(entryPath);
}
if (entry.isFile()) {
const { size } = await (0, promises_1.stat)(entryPath);
return size;
}
return 0;
});
return (await Promise.all(entrySizes)).reduce((entrySize, totalSize) => entrySize + totalSize, 0);
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const DO_NOT_USE_HARD_LINKS = (file) => false;
exports.DO_NOT_USE_HARD_LINKS = DO_NOT_USE_HARD_LINKS;
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const USE_HARD_LINKS = (file) => true;
exports.USE_HARD_LINKS = USE_HARD_LINKS;
//# sourceMappingURL=fs.js.map
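
A usage sketch of the directory helpers; the paths and the filter are illustrative:

```js
const { walk, copyDir, exists, DO_NOT_USE_HARD_LINKS } = require('builder-util/out/fs')

async function main() {
  // List all files under ./app, skipping anything inside node_modules
  const files = await walk('./app', (file, stat) => !file.includes('node_modules'))
  console.log(`${files.length} files`)

  // Copy the tree; hard links are explicitly disabled here
  if (await exists('./app')) {
    await copyDir('./app', './staging', { isUseHardLink: DO_NOT_USE_HARD_LINKS })
  }
}

main().catch(console.error)
```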

File diff suppressed because one or more lines are too long


@@ -0,0 +1,25 @@
import _debug from "debug";
import WritableStream = NodeJS.WritableStream;
export declare const debug: _debug.Debugger;
export interface Fields {
[index: string]: any;
}
export declare function setPrinter(value: ((message: string) => void) | null): void;
export type LogLevel = "info" | "warn" | "debug" | "notice" | "error";
export declare const PADDING = 2;
export declare class Logger {
protected readonly stream: WritableStream;
constructor(stream: WritableStream);
messageTransformer: (message: string, level: LogLevel) => string;
filePath(file: string): string;
get isDebugEnabled(): boolean;
info(messageOrFields: Fields | null | string, message?: string): void;
error(messageOrFields: Fields | null | string, message?: string): void;
warn(messageOrFields: Fields | null | string, message?: string): void;
debug(fields: Fields | null, message: string): void;
private doLog;
private _doLog;
static createMessage(message: string, fields: Fields | null, level: LogLevel, color: (it: string) => string, messagePadding?: number): string;
log(message: string): void;
}
export declare const log: Logger;

110
desktop-operator/node_modules/builder-util/out/log.js generated vendored Normal file

@@ -0,0 +1,110 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.log = exports.Logger = exports.PADDING = exports.debug = void 0;
exports.setPrinter = setPrinter;
const chalk = require("chalk");
const debug_1 = require("debug");
let printer = null;
exports.debug = (0, debug_1.default)("electron-builder");
function setPrinter(value) {
printer = value;
}
exports.PADDING = 2;
class Logger {
constructor(stream) {
this.stream = stream;
this.messageTransformer = it => it;
}
filePath(file) {
const cwd = process.cwd();
return file.startsWith(cwd) ? file.substring(cwd.length + 1) : file;
}
// noinspection JSMethodCanBeStatic
get isDebugEnabled() {
return exports.debug.enabled;
}
info(messageOrFields, message) {
this.doLog(message, messageOrFields, "info");
}
error(messageOrFields, message) {
this.doLog(message, messageOrFields, "error");
}
warn(messageOrFields, message) {
this.doLog(message, messageOrFields, "warn");
}
debug(fields, message) {
if (exports.debug.enabled) {
this._doLog(message, fields, "debug");
}
}
doLog(message, messageOrFields, level) {
if (message === undefined) {
this._doLog(messageOrFields, null, level);
}
else {
this._doLog(message, messageOrFields, level);
}
}
_doLog(message, fields, level) {
// noinspection SuspiciousInstanceOfGuard
if (message instanceof Error) {
message = message.stack || message.toString();
}
else {
message = message.toString();
}
const levelIndicator = level === "error" ? "" : "•";
const color = LEVEL_TO_COLOR[level];
this.stream.write(`${" ".repeat(exports.PADDING)}${color(levelIndicator)} `);
this.stream.write(Logger.createMessage(this.messageTransformer(message, level), fields, level, color, exports.PADDING + 2 /* level indicator and space */));
this.stream.write("\n");
}
static createMessage(message, fields, level, color, messagePadding = 0) {
if (fields == null) {
return message;
}
const fieldPadding = " ".repeat(Math.max(2, 16 - message.length));
let text = (level === "error" ? color(message) : message) + fieldPadding;
const fieldNames = Object.keys(fields);
let counter = 0;
for (const name of fieldNames) {
let fieldValue = fields[name];
let valuePadding = null;
// Remove unnecessary line breaks
if (fieldValue != null && typeof fieldValue === "string" && fieldValue.includes("\n")) {
valuePadding = " ".repeat(messagePadding + message.length + fieldPadding.length + 2);
fieldValue = fieldValue.replace(/\n\s*\n/g, `\n${valuePadding}`);
}
else if (Array.isArray(fieldValue)) {
fieldValue = JSON.stringify(fieldValue);
}
text += `${color(name)}=${fieldValue}`;
if (++counter !== fieldNames.length) {
if (valuePadding == null) {
text += " ";
}
else {
text += "\n" + valuePadding;
}
}
}
return text;
}
log(message) {
if (printer == null) {
this.stream.write(`${message}\n`);
}
else {
printer(message);
}
}
}
exports.Logger = Logger;
const LEVEL_TO_COLOR = {
info: chalk.blue,
warn: chalk.yellow,
error: chalk.red,
debug: chalk.white,
};
exports.log = new Logger(process.stdout);
//# sourceMappingURL=log.js.map
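
The exported `log` singleton writes to stdout and renders fields as `name=value` pairs after the padded message. A small sketch:

```js
const { log } = require('builder-util/out/log')

log.info('packaging started')                                // message only
log.info({ platform: 'darwin', arch: 'arm64' }, 'building')  // "building  platform=darwin arch=arm64"
log.warn({ file: log.filePath(__filename) }, 'skipped')      // filePath() strips the cwd prefix

// debug() only prints when DEBUG=electron-builder is enabled
log.debug({ step: 'sign' }, 'signing details')
```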

File diff suppressed because one or more lines are too long


@@ -0,0 +1,6 @@
import { HttpExecutor } from "builder-util-runtime";
import { ClientRequest } from "http";
export declare class NodeHttpExecutor extends HttpExecutor<ClientRequest> {
createRequest(options: any, callback: (response: any) => void): ClientRequest;
}
export declare const httpExecutor: NodeHttpExecutor;


@@ -0,0 +1,24 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.httpExecutor = exports.NodeHttpExecutor = void 0;
const builder_util_runtime_1 = require("builder-util-runtime");
const http_1 = require("http");
const http_proxy_agent_1 = require("http-proxy-agent");
const https = require("https");
const https_proxy_agent_1 = require("https-proxy-agent");
class NodeHttpExecutor extends builder_util_runtime_1.HttpExecutor {
// noinspection JSMethodCanBeStatic
// noinspection JSUnusedGlobalSymbols
createRequest(options, callback) {
if (process.env["https_proxy"] !== undefined && options.protocol === "https:") {
options.agent = new https_proxy_agent_1.HttpsProxyAgent(process.env["https_proxy"]);
}
else if (process.env["http_proxy"] !== undefined && options.protocol === "http:") {
options.agent = new http_proxy_agent_1.HttpProxyAgent(process.env["http_proxy"]);
}
return (options.protocol === "http:" ? http_1.request : https.request)(options, callback);
}
}
exports.NodeHttpExecutor = NodeHttpExecutor;
exports.httpExecutor = new NodeHttpExecutor();
//# sourceMappingURL=nodeHttpExecutor.js.map
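
`createRequest` only builds a Node `ClientRequest` (attaching a proxy agent when `http_proxy`/`https_proxy` is set), so it can be driven like a plain `http.request`; the richer download helpers live in `builder-util-runtime`'s `HttpExecutor` base class. A sketch with an illustrative host:

```js
const { httpExecutor } = require('builder-util/out/nodeHttpExecutor')

const request = httpExecutor.createRequest(
  { protocol: 'https:', hostname: 'example.com', path: '/', method: 'GET' },
  response => {
    console.log('status', response.statusCode)
    response.resume() // drain the body
  }
)
request.on('error', console.error)
request.end()
```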


@@ -0,0 +1 @@
{"version":3,"file":"nodeHttpExecutor.js","sourceRoot":"","sources":["../src/nodeHttpExecutor.ts"],"names":[],"mappings":";;;AAAA,+DAAmD;AACnD,+BAA4D;AAC5D,uDAAiD;AACjD,+BAA8B;AAC9B,yDAAmD;AAEnD,MAAa,gBAAiB,SAAQ,mCAA2B;IAC/D,mCAAmC;IACnC,qCAAqC;IACrC,aAAa,CAAC,OAAY,EAAE,QAAiC;QAC3D,IAAI,OAAO,CAAC,GAAG,CAAC,aAAa,CAAC,KAAK,SAAS,IAAI,OAAO,CAAC,QAAQ,KAAK,QAAQ,EAAE,CAAC;YAC9E,OAAO,CAAC,KAAK,GAAG,IAAI,mCAAe,CAAC,OAAO,CAAC,GAAG,CAAC,aAAa,CAAC,CAAC,CAAA;QACjE,CAAC;aAAM,IAAI,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,KAAK,SAAS,IAAI,OAAO,CAAC,QAAQ,KAAK,OAAO,EAAE,CAAC;YACnF,OAAO,CAAC,KAAK,GAAG,IAAI,iCAAc,CAAC,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,CAAC,CAAA;QAC/D,CAAC;QACD,OAAO,CAAC,OAAO,CAAC,QAAQ,KAAK,OAAO,CAAC,CAAC,CAAC,cAAW,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAA;IACxF,CAAC;CACF;AAXD,4CAWC;AAEY,QAAA,YAAY,GAAG,IAAI,gBAAgB,EAAE,CAAA","sourcesContent":["import { HttpExecutor } from \"builder-util-runtime\"\nimport { ClientRequest, request as httpRequest } from \"http\"\nimport { HttpProxyAgent } from \"http-proxy-agent\"\nimport * as https from \"https\"\nimport { HttpsProxyAgent } from \"https-proxy-agent\"\n\nexport class NodeHttpExecutor extends HttpExecutor<ClientRequest> {\n // noinspection JSMethodCanBeStatic\n // noinspection JSUnusedGlobalSymbols\n createRequest(options: any, callback: (response: any) => void): ClientRequest {\n if (process.env[\"https_proxy\"] !== undefined && options.protocol === \"https:\") {\n options.agent = new HttpsProxyAgent(process.env[\"https_proxy\"])\n } else if (process.env[\"http_proxy\"] !== undefined && options.protocol === \"http:\") {\n options.agent = new HttpProxyAgent(process.env[\"http_proxy\"])\n }\n return (options.protocol === \"http:\" ? httpRequest : https.request)(options, callback)\n }\n}\n\nexport const httpExecutor = new NodeHttpExecutor()\n"]}

View File

@@ -0,0 +1,7 @@
export declare function printErrorAndExit(error: Error): void;
export declare function executeFinally<T>(promise: Promise<T>, task: (isErrorOccurred: boolean) => Promise<any>): Promise<T>;
export declare class NestedError extends Error {
constructor(errors: Array<Error>, message?: string);
}
export declare function orNullIfFileNotExist<T>(promise: Promise<T>): Promise<T | null>;
export declare function orIfFileNotExist<T>(promise: Promise<T>, fallbackValue: T): Promise<T>;

View File

@@ -0,0 +1,54 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.NestedError = void 0;
exports.printErrorAndExit = printErrorAndExit;
exports.executeFinally = executeFinally;
exports.orNullIfFileNotExist = orNullIfFileNotExist;
exports.orIfFileNotExist = orIfFileNotExist;
const chalk = require("chalk");
function printErrorAndExit(error) {
console.error(chalk.red((error.stack || error).toString()));
process.exit(1);
}
// you don't need to handle the error in your task - it is passed only to indicate the status of the promise
async function executeFinally(promise, task) {
let result = null;
try {
result = await promise;
}
catch (originalError) {
try {
await task(true);
}
catch (taskError) {
throw new NestedError([originalError, taskError]);
}
throw originalError;
}
await task(false);
return result;
}
class NestedError extends Error {
constructor(errors, message = "Compound error: ") {
let m = message;
let i = 1;
for (const error of errors) {
const prefix = `Error #${i++} `;
m += `\n\n${prefix}${"-".repeat(80)}\n${error.stack}`;
}
super(m);
}
}
exports.NestedError = NestedError;
function orNullIfFileNotExist(promise) {
return orIfFileNotExist(promise, null);
}
function orIfFileNotExist(promise, fallbackValue) {
return promise.catch((e) => {
if (e.code === "ENOENT" || e.code === "ENOTDIR") {
return fallbackValue;
}
throw e;
});
}
//# sourceMappingURL=promise.js.map
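
For context, `orIfFileNotExist` resolves to a fallback value when the wrapped promise rejects with `ENOENT` or `ENOTDIR`, and `executeFinally` runs a cleanup task whether the promise settles successfully or not. A minimal sketch, assuming `fs-extra` (a dependency of this package) is available; the config path and fallback object are illustrative:

```ts
import { executeFinally, orIfFileNotExist } from "builder-util"
import { readJson } from "fs-extra"

// Resolve to a default object when ./app-config.json is missing (ENOENT/ENOTDIR).
async function readConfigOrDefault(): Promise<{ name: string }> {
  return orIfFileNotExist(readJson("./app-config.json"), { name: "fallback" })
}

async function main(): Promise<void> {
  const config = await executeFinally(readConfigOrDefault(), async isErrorOccurred => {
    // Cleanup hook: runs whether the wrapped promise resolved or rejected.
    console.log(`cleanup, error occurred: ${isErrorOccurred}`)
  })
  console.log(config.name)
}

main().catch(console.error)
```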

View File

@@ -0,0 +1 @@
{"version":3,"file":"promise.js","sourceRoot":"","sources":["../src/promise.ts"],"names":[],"mappings":";;;AAEA,8CAGC;AAGD,wCAgBC;AAcD,oDAEC;AAED,4CAOC;AAjDD,+BAA8B;AAE9B,SAAgB,iBAAiB,CAAC,KAAY;IAC5C,OAAO,CAAC,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,KAAK,CAAC,KAAK,IAAI,KAAK,CAAC,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAA;IAC3D,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAA;AACjB,CAAC;AAED,6FAA6F;AACtF,KAAK,UAAU,cAAc,CAAI,OAAmB,EAAE,IAAgD;IAC3G,IAAI,MAAM,GAAa,IAAI,CAAA;IAC3B,IAAI,CAAC;QACH,MAAM,GAAG,MAAM,OAAO,CAAA;IACxB,CAAC;IAAC,OAAO,aAAkB,EAAE,CAAC;QAC5B,IAAI,CAAC;YACH,MAAM,IAAI,CAAC,IAAI,CAAC,CAAA;QAClB,CAAC;QAAC,OAAO,SAAc,EAAE,CAAC;YACxB,MAAM,IAAI,WAAW,CAAC,CAAC,aAAa,EAAE,SAAS,CAAC,CAAC,CAAA;QACnD,CAAC;QAED,MAAM,aAAa,CAAA;IACrB,CAAC;IAED,MAAM,IAAI,CAAC,KAAK,CAAC,CAAA;IACjB,OAAO,MAAM,CAAA;AACf,CAAC;AAED,MAAa,WAAY,SAAQ,KAAK;IACpC,YAAY,MAAoB,EAAE,OAAO,GAAG,kBAAkB;QAC5D,IAAI,CAAC,GAAG,OAAO,CAAA;QACf,IAAI,CAAC,GAAG,CAAC,CAAA;QACT,KAAK,MAAM,KAAK,IAAI,MAAM,EAAE,CAAC;YAC3B,MAAM,MAAM,GAAG,UAAU,CAAC,EAAE,GAAG,CAAA;YAC/B,CAAC,IAAI,OAAO,MAAM,GAAG,GAAG,CAAC,MAAM,CAAC,EAAE,CAAC,KAAK,KAAK,CAAC,KAAK,EAAE,CAAA;QACvD,CAAC;QACD,KAAK,CAAC,CAAC,CAAC,CAAA;IACV,CAAC;CACF;AAVD,kCAUC;AAED,SAAgB,oBAAoB,CAAI,OAAmB;IACzD,OAAO,gBAAgB,CAAC,OAAO,EAAE,IAAI,CAAC,CAAA;AACxC,CAAC;AAED,SAAgB,gBAAgB,CAAI,OAAmB,EAAE,aAAgB;IACvE,OAAO,OAAO,CAAC,KAAK,CAAC,CAAC,CAAM,EAAE,EAAE;QAC9B,IAAI,CAAC,CAAC,IAAI,KAAK,QAAQ,IAAI,CAAC,CAAC,IAAI,KAAK,SAAS,EAAE,CAAC;YAChD,OAAO,aAAa,CAAA;QACtB,CAAC;QACD,MAAM,CAAC,CAAA;IACT,CAAC,CAAC,CAAA;AACJ,CAAC","sourcesContent":["import * as chalk from \"chalk\"\n\nexport function printErrorAndExit(error: Error) {\n console.error(chalk.red((error.stack || error).toString()))\n process.exit(1)\n}\n\n// you don't need to handle error in your task - it is passed only indicate status of promise\nexport async function executeFinally<T>(promise: Promise<T>, task: (isErrorOccurred: boolean) => Promise<any>): Promise<T> {\n let result: T | null = null\n try {\n result = await promise\n } catch (originalError: any) {\n try {\n await task(true)\n } catch (taskError: any) {\n throw new NestedError([originalError, taskError])\n }\n\n throw originalError\n }\n\n await task(false)\n return result\n}\n\nexport class NestedError extends Error {\n constructor(errors: Array<Error>, message = \"Compound error: \") {\n let m = message\n let i = 1\n for (const error of errors) {\n const prefix = `Error #${i++} `\n m += `\\n\\n${prefix}${\"-\".repeat(80)}\\n${error.stack}`\n }\n super(m)\n }\n}\n\nexport function orNullIfFileNotExist<T>(promise: Promise<T>): Promise<T | null> {\n return orIfFileNotExist(promise, null)\n}\n\nexport function orIfFileNotExist<T>(promise: Promise<T>, fallbackValue: T): Promise<T> {\n return promise.catch((e: any) => {\n if (e.code === \"ENOENT\" || e.code === \"ENOTDIR\") {\n return fallbackValue\n }\n throw e\n })\n}\n"]}

View File

@@ -0,0 +1,44 @@
import { ChildProcess, ExecFileOptions, SpawnOptions } from "child_process";
import _debug from "debug";
export { safeStringifyJson } from "builder-util-runtime";
export { TmpDir } from "temp-file";
export * from "./log";
export { Arch, getArchCliNames, toLinuxArchString, getArchSuffix, ArchType, archFromString, defaultArchFromString } from "./arch";
export { AsyncTaskManager } from "./asyncTaskManager";
export { DebugLogger } from "./DebugLogger";
export { httpExecutor, NodeHttpExecutor } from "./nodeHttpExecutor";
export * from "./promise";
export * from "./arch";
export * from "./fs";
export { asArray } from "builder-util-runtime";
export { deepAssign } from "./deepAssign";
export { getPath7za, getPath7x } from "./7za";
export declare const debug7z: _debug.Debugger;
export declare function serializeToYaml(object: any, skipInvalid?: boolean, noRefs?: boolean): string;
export declare function removePassword(input: string): string;
export declare function exec(file: string, args?: Array<string> | null, options?: ExecFileOptions, isLogOutIfDebug?: boolean): Promise<string>;
export interface ExtraSpawnOptions {
isPipeInput?: boolean;
}
export declare function doSpawn(command: string, args: Array<string>, options?: SpawnOptions, extraOptions?: ExtraSpawnOptions): ChildProcess;
export declare function spawnAndWrite(command: string, args: Array<string>, data: string, options?: SpawnOptions): Promise<any>;
export declare function spawn(command: string, args?: Array<string> | null, options?: SpawnOptions, extraOptions?: ExtraSpawnOptions): Promise<any>;
export declare class ExecError extends Error {
readonly exitCode: number;
alreadyLogged: boolean;
constructor(command: string, exitCode: number, out: string, errorOut: string, code?: string);
}
type Nullish = null | undefined;
export declare function use<T, R>(value: T | Nullish, task: (value: T) => R): R | null;
export declare function isEmptyOrSpaces(s: string | null | undefined): s is "" | null | undefined;
export declare function isTokenCharValid(token: string): boolean;
export declare function addValue<K, T>(map: Map<K, Array<T>>, key: K, value: T): void;
export declare function replaceDefault(inList: Array<string> | null | undefined, defaultList: Array<string>): Array<string>;
export declare function getPlatformIconFileName(value: string | null | undefined, isMac: boolean): string | null | undefined;
export declare function isPullRequest(): boolean | "" | undefined;
export declare function isEnvTrue(value: string | null | undefined): boolean;
export declare class InvalidConfigurationError extends Error {
constructor(message: string, code?: string);
}
export declare function executeAppBuilder(args: Array<string>, childProcessConsumer?: (childProcess: ChildProcess) => void, extraOptions?: SpawnOptions, maxRetries?: number): Promise<string>;
export declare function retry<T>(task: () => Promise<T>, retryCount: number, interval: number, backoff?: number, attempt?: number, shouldRetry?: (e: any) => boolean): Promise<T>;
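
Among the declarations above, `retry` wraps an async task and re-runs it on failure. A small sketch of how it might be called, using a deliberately flaky stand-in task rather than real project code:

```ts
import { retry } from "builder-util"

let attempts = 0

// Deliberately flaky stand-in: fails twice, then succeeds.
async function flakyTask(): Promise<string> {
  attempts++
  if (attempts < 3) {
    throw new Error(`transient failure #${attempts}`)
  }
  return "ok"
}

// Up to 3 retries with a 500 ms interval and a 250 ms backoff (per the declared signature above).
retry(flakyTask, 3, 500, 250)
  .then(result => console.log(result)) // prints "ok" once the task finally succeeds
  .catch(console.error)
```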

395
desktop-operator/node_modules/builder-util/out/util.js generated vendored Normal file
View File

@@ -0,0 +1,395 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __exportStar = (this && this.__exportStar) || function(m, exports) {
for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.InvalidConfigurationError = exports.ExecError = exports.debug7z = exports.getPath7x = exports.getPath7za = exports.deepAssign = exports.asArray = exports.NodeHttpExecutor = exports.httpExecutor = exports.DebugLogger = exports.AsyncTaskManager = exports.defaultArchFromString = exports.archFromString = exports.getArchSuffix = exports.toLinuxArchString = exports.getArchCliNames = exports.Arch = exports.TmpDir = exports.safeStringifyJson = void 0;
exports.serializeToYaml = serializeToYaml;
exports.removePassword = removePassword;
exports.exec = exec;
exports.doSpawn = doSpawn;
exports.spawnAndWrite = spawnAndWrite;
exports.spawn = spawn;
exports.use = use;
exports.isEmptyOrSpaces = isEmptyOrSpaces;
exports.isTokenCharValid = isTokenCharValid;
exports.addValue = addValue;
exports.replaceDefault = replaceDefault;
exports.getPlatformIconFileName = getPlatformIconFileName;
exports.isPullRequest = isPullRequest;
exports.isEnvTrue = isEnvTrue;
exports.executeAppBuilder = executeAppBuilder;
exports.retry = retry;
const app_builder_bin_1 = require("app-builder-bin");
const builder_util_runtime_1 = require("builder-util-runtime");
const chalk = require("chalk");
const child_process_1 = require("child_process");
const cross_spawn_1 = require("cross-spawn");
const crypto_1 = require("crypto");
const debug_1 = require("debug");
const js_yaml_1 = require("js-yaml");
const path = require("path");
const log_1 = require("./log");
const source_map_support_1 = require("source-map-support");
const _7za_1 = require("./7za");
if (process.env.JEST_WORKER_ID == null) {
(0, source_map_support_1.install)();
}
var builder_util_runtime_2 = require("builder-util-runtime");
Object.defineProperty(exports, "safeStringifyJson", { enumerable: true, get: function () { return builder_util_runtime_2.safeStringifyJson; } });
var temp_file_1 = require("temp-file");
Object.defineProperty(exports, "TmpDir", { enumerable: true, get: function () { return temp_file_1.TmpDir; } });
__exportStar(require("./log"), exports);
var arch_1 = require("./arch");
Object.defineProperty(exports, "Arch", { enumerable: true, get: function () { return arch_1.Arch; } });
Object.defineProperty(exports, "getArchCliNames", { enumerable: true, get: function () { return arch_1.getArchCliNames; } });
Object.defineProperty(exports, "toLinuxArchString", { enumerable: true, get: function () { return arch_1.toLinuxArchString; } });
Object.defineProperty(exports, "getArchSuffix", { enumerable: true, get: function () { return arch_1.getArchSuffix; } });
Object.defineProperty(exports, "archFromString", { enumerable: true, get: function () { return arch_1.archFromString; } });
Object.defineProperty(exports, "defaultArchFromString", { enumerable: true, get: function () { return arch_1.defaultArchFromString; } });
var asyncTaskManager_1 = require("./asyncTaskManager");
Object.defineProperty(exports, "AsyncTaskManager", { enumerable: true, get: function () { return asyncTaskManager_1.AsyncTaskManager; } });
var DebugLogger_1 = require("./DebugLogger");
Object.defineProperty(exports, "DebugLogger", { enumerable: true, get: function () { return DebugLogger_1.DebugLogger; } });
var nodeHttpExecutor_1 = require("./nodeHttpExecutor");
Object.defineProperty(exports, "httpExecutor", { enumerable: true, get: function () { return nodeHttpExecutor_1.httpExecutor; } });
Object.defineProperty(exports, "NodeHttpExecutor", { enumerable: true, get: function () { return nodeHttpExecutor_1.NodeHttpExecutor; } });
__exportStar(require("./promise"), exports);
__exportStar(require("./arch"), exports);
__exportStar(require("./fs"), exports);
var builder_util_runtime_3 = require("builder-util-runtime");
Object.defineProperty(exports, "asArray", { enumerable: true, get: function () { return builder_util_runtime_3.asArray; } });
var deepAssign_1 = require("./deepAssign");
Object.defineProperty(exports, "deepAssign", { enumerable: true, get: function () { return deepAssign_1.deepAssign; } });
var _7za_2 = require("./7za");
Object.defineProperty(exports, "getPath7za", { enumerable: true, get: function () { return _7za_2.getPath7za; } });
Object.defineProperty(exports, "getPath7x", { enumerable: true, get: function () { return _7za_2.getPath7x; } });
exports.debug7z = (0, debug_1.default)("electron-builder:7z");
function serializeToYaml(object, skipInvalid = false, noRefs = false) {
return (0, js_yaml_1.dump)(object, {
lineWidth: 8000,
skipInvalid,
noRefs,
});
}
function removePassword(input) {
return input.replace(/(-String |-P |pass:| \/p |-pass |--secretKey |--accessKey |-p )([^ ]+)/g, (match, p1, p2) => {
if (p1.trim() === "/p" && p2.startsWith("\\\\Mac\\Host\\\\")) {
// appx /p
return `${p1}${p2}`;
}
return `${p1}${(0, crypto_1.createHash)("sha256").update(p2).digest("hex")} (sha256 hash)`;
});
}
function getProcessEnv(env) {
if (process.platform === "win32") {
return env == null ? undefined : env;
}
const finalEnv = {
...(env || process.env),
};
    // without LC_CTYPE dpkg can return encoded unicode symbols
    // set LC_CTYPE to avoid a crash (https://github.com/electron-userland/electron-builder/issues/503); even "en_DE.UTF-8" leads to an error.
const locale = process.platform === "linux" ? process.env.LANG || "C.UTF-8" : "en_US.UTF-8";
finalEnv.LANG = locale;
finalEnv.LC_CTYPE = locale;
finalEnv.LC_ALL = locale;
return finalEnv;
}
function exec(file, args, options, isLogOutIfDebug = true) {
if (log_1.log.isDebugEnabled) {
const logFields = {
file,
args: args == null ? "" : removePassword(args.join(" ")),
};
if (options != null) {
if (options.cwd != null) {
logFields.cwd = options.cwd;
}
if (options.env != null) {
const diffEnv = { ...options.env };
for (const name of Object.keys(process.env)) {
if (process.env[name] === options.env[name]) {
delete diffEnv[name];
}
}
logFields.env = (0, builder_util_runtime_1.safeStringifyJson)(diffEnv);
}
}
log_1.log.debug(logFields, "executing");
}
return new Promise((resolve, reject) => {
(0, child_process_1.execFile)(file, args, {
...options,
maxBuffer: 1000 * 1024 * 1024,
env: getProcessEnv(options == null ? null : options.env),
}, (error, stdout, stderr) => {
if (error == null) {
if (isLogOutIfDebug && log_1.log.isDebugEnabled) {
const logFields = {
file,
};
if (stdout.length > 0) {
logFields.stdout = stdout;
}
if (stderr.length > 0) {
logFields.stderr = stderr;
}
log_1.log.debug(logFields, "executed");
}
resolve(stdout.toString());
}
else {
let message = chalk.red(removePassword(`Exit code: ${error.code}. ${error.message}`));
if (stdout.length !== 0) {
if (file.endsWith("wine")) {
stdout = stdout.toString();
}
message += `\n${chalk.yellow(stdout.toString())}`;
}
if (stderr.length !== 0) {
if (file.endsWith("wine")) {
stderr = stderr.toString();
}
message += `\n${chalk.red(stderr.toString())}`;
}
reject(new Error(message));
}
});
});
}
function logSpawn(command, args, options) {
    // use the general debug.enabled flag to log spawn - it produces only a single line of output, but that line is important in any case
if (!log_1.log.isDebugEnabled) {
return;
}
const argsString = removePassword(args.join(" "));
const logFields = {
command: command + " " + (command === "docker" ? argsString : removePassword(argsString)),
};
if (options != null && options.cwd != null) {
logFields.cwd = options.cwd;
}
log_1.log.debug(logFields, "spawning");
}
function doSpawn(command, args, options, extraOptions) {
if (options == null) {
options = {};
}
options.env = getProcessEnv(options.env);
if (options.stdio == null) {
const isDebugEnabled = log_1.debug.enabled;
        // do not ignore stdout/stderr when debug is disabled, because in that case we read them into buffers and print them on error
options.stdio = [extraOptions != null && extraOptions.isPipeInput ? "pipe" : "ignore", isDebugEnabled ? "inherit" : "pipe", isDebugEnabled ? "inherit" : "pipe"];
}
logSpawn(command, args, options);
try {
return (0, cross_spawn_1.spawn)(command, args, options);
}
catch (e) {
throw new Error(`Cannot spawn ${command}: ${e.stack || e}`);
}
}
function spawnAndWrite(command, args, data, options) {
const childProcess = doSpawn(command, args, options, { isPipeInput: true });
const timeout = setTimeout(() => childProcess.kill(), 4 * 60 * 1000);
return new Promise((resolve, reject) => {
handleProcess("close", childProcess, command, () => {
try {
clearTimeout(timeout);
}
finally {
resolve(undefined);
}
}, error => {
try {
clearTimeout(timeout);
}
finally {
reject(error);
}
});
childProcess.stdin.end(data);
});
}
function spawn(command, args, options, extraOptions) {
return new Promise((resolve, reject) => {
handleProcess("close", doSpawn(command, args || [], options, extraOptions), command, resolve, reject);
});
}
function handleProcess(event, childProcess, command, resolve, reject) {
childProcess.on("error", reject);
let out = "";
if (childProcess.stdout != null) {
childProcess.stdout.on("data", (data) => {
out += data;
});
}
let errorOut = "";
if (childProcess.stderr != null) {
childProcess.stderr.on("data", (data) => {
errorOut += data;
});
}
childProcess.once(event, (code) => {
if (log_1.log.isDebugEnabled) {
const fields = {
command: path.basename(command),
code,
pid: childProcess.pid,
};
if (out.length > 0) {
fields.out = out;
}
log_1.log.debug(fields, "exited");
}
if (code === 0) {
if (resolve != null) {
resolve(out);
}
}
else {
reject(new ExecError(command, code, out, errorOut));
}
});
}
function formatOut(text, title) {
return text.length === 0 ? "" : `\n${title}:\n${text}`;
}
class ExecError extends Error {
constructor(command, exitCode, out, errorOut, code = "ERR_ELECTRON_BUILDER_CANNOT_EXECUTE") {
super(`${command} process failed ${code}${formatOut(String(exitCode), "Exit code")}${formatOut(out, "Output")}${formatOut(errorOut, "Error output")}`);
this.exitCode = exitCode;
this.alreadyLogged = false;
this.code = code;
}
}
exports.ExecError = ExecError;
function use(value, task) {
return value == null ? null : task(value);
}
function isEmptyOrSpaces(s) {
return s == null || s.trim().length === 0;
}
function isTokenCharValid(token) {
return /^[.\w/=+-]+$/.test(token);
}
function addValue(map, key, value) {
const list = map.get(key);
if (list == null) {
map.set(key, [value]);
}
else if (!list.includes(value)) {
list.push(value);
}
}
function replaceDefault(inList, defaultList) {
if (inList == null || (inList.length === 1 && inList[0] === "default")) {
return defaultList;
}
const index = inList.indexOf("default");
if (index >= 0) {
const list = inList.slice(0, index);
list.push(...defaultList);
if (index !== inList.length - 1) {
list.push(...inList.slice(index + 1));
}
inList = list;
}
return inList;
}
function getPlatformIconFileName(value, isMac) {
if (value === undefined) {
return undefined;
}
if (value === null) {
return null;
}
if (!value.includes(".")) {
return `${value}.${isMac ? "icns" : "ico"}`;
}
return value.replace(isMac ? ".ico" : ".icns", isMac ? ".icns" : ".ico");
}
function isPullRequest() {
    // TRAVIS_PULL_REQUEST is set to the pull request number if the current job is a pull request build, or to "false" if it is not.
function isSet(value) {
        // value can be null or an empty string
return value && value !== "false";
}
return (isSet(process.env.TRAVIS_PULL_REQUEST) ||
isSet(process.env.CIRCLE_PULL_REQUEST) ||
isSet(process.env.BITRISE_PULL_REQUEST) ||
isSet(process.env.APPVEYOR_PULL_REQUEST_NUMBER) ||
isSet(process.env.GITHUB_BASE_REF));
}
function isEnvTrue(value) {
if (value != null) {
value = value.trim();
}
return value === "true" || value === "" || value === "1";
}
class InvalidConfigurationError extends Error {
constructor(message, code = "ERR_ELECTRON_BUILDER_INVALID_CONFIGURATION") {
super(message);
this.code = code;
}
}
exports.InvalidConfigurationError = InvalidConfigurationError;
async function executeAppBuilder(args, childProcessConsumer, extraOptions = {}, maxRetries = 0) {
const command = app_builder_bin_1.appBuilderPath;
const env = {
...process.env,
SZA_PATH: await (0, _7za_1.getPath7za)(),
FORCE_COLOR: chalk.level === 0 ? "0" : "1",
};
const cacheEnv = process.env.ELECTRON_BUILDER_CACHE;
if (cacheEnv != null && cacheEnv.length > 0) {
env.ELECTRON_BUILDER_CACHE = path.resolve(cacheEnv);
}
if (extraOptions.env != null) {
Object.assign(env, extraOptions.env);
}
function runCommand() {
return new Promise((resolve, reject) => {
const childProcess = doSpawn(command, args, {
stdio: ["ignore", "pipe", process.stdout],
...extraOptions,
env,
});
if (childProcessConsumer != null) {
childProcessConsumer(childProcess);
}
handleProcess("close", childProcess, command, resolve, error => {
if (error instanceof ExecError && error.exitCode === 2) {
error.alreadyLogged = true;
}
reject(error);
});
});
}
if (maxRetries === 0) {
return runCommand();
}
else {
return retry(runCommand, maxRetries, 1000);
}
}
async function retry(task, retryCount, interval, backoff = 0, attempt = 0, shouldRetry) {
return await (0, builder_util_runtime_1.retry)(task, retryCount, interval, backoff, attempt, e => {
var _a;
log_1.log.info(`Above command failed, retrying ${retryCount} more times`);
return (_a = shouldRetry === null || shouldRetry === void 0 ? void 0 : shouldRetry(e)) !== null && _a !== void 0 ? _a : true;
});
}
//# sourceMappingURL=util.js.map
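
Two of the helpers implemented above are easy to demonstrate in isolation: `exec` runs an executable and resolves with its captured stdout, and `removePassword` masks secrets that follow known flags with a sha256 hash before they reach the logs. A hedged sketch, assuming a POSIX `echo` binary on the PATH; the signtool command line is illustrative only:

```ts
import { exec, removePassword } from "builder-util"

async function main(): Promise<void> {
  // exec resolves with the captured stdout of the spawned executable.
  const out = await exec("echo", ["hello"])
  console.log(out.trim()) // "hello"

  // Values following known secret flags (here "-p ") are replaced by a sha256 hash.
  console.log(removePassword("signtool -p hunter2 app.exe"))
}

main().catch(console.error)
```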

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,43 @@
{
"name": "builder-util",
"version": "25.1.7",
"main": "out/util.js",
"author": "Vladimir Krivosheev",
"license": "MIT",
"repository": {
"type": "git",
"url": "git+https://github.com/electron-userland/electron-builder.git",
"directory": "packages/builder-util"
},
"bugs": "https://github.com/electron-userland/electron-builder/issues",
"homepage": "https://github.com/electron-userland/electron-builder",
"files": [
"out"
],
"dependencies": {
"7zip-bin": "~5.2.0",
"@types/debug": "^4.1.6",
"app-builder-bin": "5.0.0-alpha.10",
"bluebird-lst": "^1.0.9",
"chalk": "^4.1.2",
"cross-spawn": "^7.0.3",
"debug": "^4.3.4",
"fs-extra": "^10.1.0",
"http-proxy-agent": "^7.0.0",
"https-proxy-agent": "^7.0.0",
"is-ci": "^3.0.0",
"js-yaml": "^4.1.0",
"source-map-support": "^0.5.19",
"stat-mode": "^1.0.0",
"temp-file": "^3.4.0",
"builder-util-runtime": "9.2.10"
},
"typings": "./out/util.d.ts",
"devDependencies": {
"@types/cross-spawn": "6.0.2",
"@types/fs-extra": "^9.0.11",
"@types/is-ci": "3.0.0",
"@types/js-yaml": "4.0.3",
"@types/source-map-support": "0.5.4"
}
}

3
desktop-operator/node_modules/builder-util/readme.md generated vendored Normal file
View File

@@ -0,0 +1,3 @@
# builder-util
Various utilities used by [electron-builder](https://github.com/electron-userland/electron-builder).
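
The readme itself is terse, so here is a hedged quick-start sketch (not part of the upstream readme); the field names and values are illustrative:

```ts
import { log, serializeToYaml, isEmptyOrSpaces } from "builder-util"

// Structured logging: a fields object first, the message second (the pattern used throughout the compiled sources above).
log.info({ app: "my-app", arch: "x64" }, "packaging")

// Small helpers re-exported from out/util.js.
console.log(serializeToYaml({ productName: "My App", asar: true }))
console.log(isEmptyOrSpaces("   ")) // true
```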