main functions fixes

This commit is contained in:
2025-09-29 22:06:11 +09:00
parent 40e016e128
commit c8c3274527
7995 changed files with 1517998 additions and 1057 deletions

desktop-operator/node_modules/electron-builder/LICENSE generated vendored Normal file
View File

@@ -0,0 +1,22 @@
The MIT License (MIT)
Copyright (c) 2015 Loopline Systems
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

desktop-operator/node_modules/electron-builder/cli.js generated vendored Executable file
View File

@@ -0,0 +1,4 @@
#!/usr/bin/env node
// https://github.com/pnpm/pnpm/issues/1801
require("./out/cli/cli")

View File

@@ -0,0 +1,4 @@
#!/usr/bin/env node
// https://github.com/pnpm/pnpm/issues/1801
require("./out/cli/install-app-deps")

View File

@@ -0,0 +1,15 @@
(The MIT License)
Copyright (c) 2011-2017 JP Richardson
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files
(the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify,
merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@@ -0,0 +1,262 @@
Node.js: fs-extra
=================
`fs-extra` adds file system methods that aren't included in the native `fs` module and adds promise support to the `fs` methods. It also uses [`graceful-fs`](https://github.com/isaacs/node-graceful-fs) to prevent `EMFILE` errors. It should be a drop-in replacement for `fs`.
[![npm Package](https://img.shields.io/npm/v/fs-extra.svg)](https://www.npmjs.org/package/fs-extra)
[![License](https://img.shields.io/npm/l/fs-extra.svg)](https://github.com/jprichardson/node-fs-extra/blob/master/LICENSE)
[![build status](https://img.shields.io/github/workflow/status/jprichardson/node-fs-extra/Node.js%20CI/master)](https://github.com/jprichardson/node-fs-extra/actions/workflows/ci.yml?query=branch%3Amaster)
[![downloads per month](http://img.shields.io/npm/dm/fs-extra.svg)](https://www.npmjs.org/package/fs-extra)
[![JavaScript Style Guide](https://img.shields.io/badge/code_style-standard-brightgreen.svg)](https://standardjs.com)
Why?
----
I got tired of including `mkdirp`, `rimraf`, and `ncp` in most of my projects.
Installation
------------
npm install fs-extra
Usage
-----
`fs-extra` is a drop-in replacement for native `fs`. All methods in `fs` are attached to `fs-extra`. All `fs` methods return promises if the callback isn't passed.
You don't ever need to include the original `fs` module again:
```js
const fs = require('fs') // this is no longer necessary
```
you can now do this:
```js
const fs = require('fs-extra')
```
or if you prefer to make it clear that you're using `fs-extra` and not `fs`, you may want
to name your `fs` variable `fse` like so:
```js
const fse = require('fs-extra')
```
you can also keep both, but it's redundant:
```js
const fs = require('fs')
const fse = require('fs-extra')
```
Sync vs Async vs Async/Await
-------------
Most methods are async by default. All async methods will return a promise if the callback isn't passed.
Sync methods, on the other hand, will throw if an error occurs.
Async/await usage will also throw if an error occurs.
Example:
```js
const fs = require('fs-extra')

// Async with promises:
fs.copy('/tmp/myfile', '/tmp/mynewfile')
  .then(() => console.log('success!'))
  .catch(err => console.error(err))

// Async with callbacks:
fs.copy('/tmp/myfile', '/tmp/mynewfile', err => {
  if (err) return console.error(err)
  console.log('success!')
})

// Sync:
try {
  fs.copySync('/tmp/myfile', '/tmp/mynewfile')
  console.log('success!')
} catch (err) {
  console.error(err)
}

// Async/Await:
async function copyFiles () {
  try {
    await fs.copy('/tmp/myfile', '/tmp/mynewfile')
    console.log('success!')
  } catch (err) {
    console.error(err)
  }
}

copyFiles()
```
Methods
-------
### Async
- [copy](docs/copy.md)
- [emptyDir](docs/emptyDir.md)
- [ensureFile](docs/ensureFile.md)
- [ensureDir](docs/ensureDir.md)
- [ensureLink](docs/ensureLink.md)
- [ensureSymlink](docs/ensureSymlink.md)
- [mkdirp](docs/ensureDir.md)
- [mkdirs](docs/ensureDir.md)
- [move](docs/move.md)
- [outputFile](docs/outputFile.md)
- [outputJson](docs/outputJson.md)
- [pathExists](docs/pathExists.md)
- [readJson](docs/readJson.md)
- [remove](docs/remove.md)
- [writeJson](docs/writeJson.md)
### Sync
- [copySync](docs/copy-sync.md)
- [emptyDirSync](docs/emptyDir-sync.md)
- [ensureFileSync](docs/ensureFile-sync.md)
- [ensureDirSync](docs/ensureDir-sync.md)
- [ensureLinkSync](docs/ensureLink-sync.md)
- [ensureSymlinkSync](docs/ensureSymlink-sync.md)
- [mkdirpSync](docs/ensureDir-sync.md)
- [mkdirsSync](docs/ensureDir-sync.md)
- [moveSync](docs/move-sync.md)
- [outputFileSync](docs/outputFile-sync.md)
- [outputJsonSync](docs/outputJson-sync.md)
- [pathExistsSync](docs/pathExists-sync.md)
- [readJsonSync](docs/readJson-sync.md)
- [removeSync](docs/remove-sync.md)
- [writeJsonSync](docs/writeJson-sync.md)
**NOTE:** You can still use the native Node.js methods. They are promisified and copied over to `fs-extra`. See [notes on `fs.read()`, `fs.write()`, & `fs.writev()`](docs/fs-read-write-writev.md)
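For example, a minimal sketch of calling the promisified `fs.read()` through `fs-extra` (the file path and buffer size here are placeholders); as with the wrapper shipped in this package, the promise resolves with an object containing both `bytesRead` and `buffer`:
```js
const fse = require('fs-extra')

async function readFirstBytes () {
  // '/tmp/myfile' is a placeholder path
  const fd = await fse.open('/tmp/myfile', 'r')
  try {
    const buffer = Buffer.alloc(16)
    // The promisified fs.read() resolves with { bytesRead, buffer }
    const { bytesRead } = await fse.read(fd, buffer, 0, buffer.length, 0)
    console.log(`read ${bytesRead} bytes`)
  } finally {
    await fse.close(fd)
  }
}

readFirstBytes().catch(console.error)
```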
### What happened to `walk()` and `walkSync()`?
They were removed from `fs-extra` in v2.0.0. If you need the functionality, `walk` and `walkSync` are available as separate packages, [`klaw`](https://github.com/jprichardson/node-klaw) and [`klaw-sync`](https://github.com/manidlou/node-klaw-sync).
Third Party
-----------
### CLI
[fse-cli](https://www.npmjs.com/package/@atao60/fse-cli) allows you to run `fs-extra` from a console or from [npm](https://www.npmjs.com) scripts.
### TypeScript
If you like TypeScript, you can use `fs-extra` with it: https://github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/fs-extra
### File / Directory Watching
If you want to watch for changes to files or directories, then you should use [chokidar](https://github.com/paulmillr/chokidar).
### Obtain Filesystem (Devices, Partitions) Information
[fs-filesystem](https://github.com/arthurintelligence/node-fs-filesystem) allows you to read the state of the filesystem of the host on which it is run. It returns information about both the devices and the partitions (volumes) of the system.
### Misc.
- [fs-extra-debug](https://github.com/jdxcode/fs-extra-debug) - Send your fs-extra calls to [debug](https://npmjs.org/package/debug).
- [mfs](https://github.com/cadorn/mfs) - Monitor your fs-extra calls.
Hacking on fs-extra
-------------------
Wanna hack on `fs-extra`? Great! Your help is needed! [fs-extra is one of the most depended upon Node.js packages](http://nodei.co/npm/fs-extra.png?downloads=true&downloadRank=true&stars=true). This project
uses [JavaScript Standard Style](https://github.com/feross/standard) - if the name or style choices bother you,
you're gonna have to get over it :) If `standard` is good enough for `npm`, it's good enough for `fs-extra`.
[![js-standard-style](https://cdn.rawgit.com/feross/standard/master/badge.svg)](https://github.com/feross/standard)
What's needed?
- First, take a look at existing issues. Those are probably going to be where the priority lies.
- More tests for edge cases. Specifically on different platforms. There can never be enough tests.
- Improve test coverage.
Note: If you make any big changes, **you should definitely file an issue for discussion first.**
### Running the Test Suite
fs-extra contains hundreds of tests.
- `npm run lint`: runs the linter ([standard](http://standardjs.com/))
- `npm run unit`: runs the unit tests
- `npm test`: runs both the linter and the tests
### Windows
If you run the tests on Windows and receive a lot of symbolic link `EPERM` permission errors, it's
because on Windows you need elevated privileges to create symbolic links. You can add this privilege to your Windows
account by following the instructions here: http://superuser.com/questions/104845/permission-to-make-symbolic-links-in-windows-7
However, I didn't have much luck doing this.
Since I develop on Mac OS X, I use VMWare Fusion for Windows testing. I create a shared folder that I map to a drive on Windows.
I open the `Node.js command prompt` and run as `Administrator`. I then map the network drive by running the following command:
net use z: "\\vmware-host\Shared Folders"
I can then navigate to my `fs-extra` directory and run the tests.
Naming
------
I put a lot of thought into the naming of these functions, inspired by @coolaj86's request, so he deserves much of the credit for raising the issue. See the discussion(s) here:
* https://github.com/jprichardson/node-fs-extra/issues/2
* https://github.com/flatiron/utile/issues/11
* https://github.com/ryanmcgrath/wrench-js/issues/29
* https://github.com/substack/node-mkdirp/issues/17
First, I believe that in as many cases as possible, the [Node.js naming schemes](http://nodejs.org/api/fs.html) should be chosen. However, there are problems with Node.js's own naming schemes.
For example, `fs.readFile()` and `fs.readdir()`: the **F** is capitalized in *File* and the **d** is not capitalized in *dir*. Perhaps a bit pedantic, but they should still be consistent. Also, Node.js has chosen a lot of POSIX naming schemes, which I believe is great. See: `fs.mkdir()`, `fs.rmdir()`, `fs.chown()`, etc.
We have a dilemma though. How do you consistently name methods that perform the following POSIX commands: `cp`, `cp -r`, `mkdir -p`, and `rm -rf`?
My perspective: when in doubt, err on the side of simplicity. A directory is just a hierarchical grouping of directories and files. Consider that for a moment. So when you want to copy it or remove it, in most cases you'll want to copy or remove all of its contents. When you want to create a directory, if the directory that it's supposed to be contained in does not exist, then in most cases you'll want to create that too.
So, if you want to remove a file or a directory regardless of whether it has contents, just call `fs.remove(path)`. If you want to copy a file or a directory regardless of whether it has contents, just call `fs.copy(source, destination)`. If you want to create a directory regardless of whether its parent directories exist, just call `fs.mkdirs(path)` or `fs.mkdirp(path)`.
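For example, a minimal sketch of those three calls (the paths are placeholders):
```js
const fse = require('fs-extra')

async function example () {
  // Creates /tmp/a/b/c even if /tmp/a does not exist yet (like `mkdir -p`)
  await fse.mkdirs('/tmp/a/b/c')
  // Copies a directory and all of its contents (like `cp -r`)
  await fse.copy('/tmp/a', '/tmp/a-copy')
  // Removes a file or directory whether or not it has contents (like `rm -rf`)
  await fse.remove('/tmp/a-copy')
}

example().catch(console.error)
```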
Credit
------
`fs-extra` wouldn't be possible without using the modules from the following authors:
- [Isaac Schlueter](https://github.com/isaacs)
- [Charlie McConnell](https://github.com/avianflu)
- [James Halliday](https://github.com/substack)
- [Andrew Kelley](https://github.com/andrewrk)
License
-------
Licensed under MIT
Copyright (c) 2011-2017 [JP Richardson](https://github.com/jprichardson)
[1]: http://nodejs.org/docs/latest/api/fs.html
[jsonfile]: https://github.com/jprichardson/node-jsonfile

View File

@@ -0,0 +1,169 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const mkdirsSync = require('../mkdirs').mkdirsSync
const utimesMillisSync = require('../util/utimes').utimesMillisSync
const stat = require('../util/stat')
function copySync (src, dest, opts) {
if (typeof opts === 'function') {
opts = { filter: opts }
}
opts = opts || {}
opts.clobber = 'clobber' in opts ? !!opts.clobber : true // default to true for now
opts.overwrite = 'overwrite' in opts ? !!opts.overwrite : opts.clobber // overwrite falls back to clobber
// Warn about using preserveTimestamps on 32-bit node
if (opts.preserveTimestamps && process.arch === 'ia32') {
process.emitWarning(
'Using the preserveTimestamps option in 32-bit node is not recommended;\n\n' +
'\tsee https://github.com/jprichardson/node-fs-extra/issues/269',
'Warning', 'fs-extra-WARN0002'
)
}
const { srcStat, destStat } = stat.checkPathsSync(src, dest, 'copy', opts)
stat.checkParentPathsSync(src, srcStat, dest, 'copy')
return handleFilterAndCopy(destStat, src, dest, opts)
}
function handleFilterAndCopy (destStat, src, dest, opts) {
if (opts.filter && !opts.filter(src, dest)) return
const destParent = path.dirname(dest)
if (!fs.existsSync(destParent)) mkdirsSync(destParent)
return getStats(destStat, src, dest, opts)
}
function startCopy (destStat, src, dest, opts) {
if (opts.filter && !opts.filter(src, dest)) return
return getStats(destStat, src, dest, opts)
}
function getStats (destStat, src, dest, opts) {
const statSync = opts.dereference ? fs.statSync : fs.lstatSync
const srcStat = statSync(src)
if (srcStat.isDirectory()) return onDir(srcStat, destStat, src, dest, opts)
else if (srcStat.isFile() ||
srcStat.isCharacterDevice() ||
srcStat.isBlockDevice()) return onFile(srcStat, destStat, src, dest, opts)
else if (srcStat.isSymbolicLink()) return onLink(destStat, src, dest, opts)
else if (srcStat.isSocket()) throw new Error(`Cannot copy a socket file: ${src}`)
else if (srcStat.isFIFO()) throw new Error(`Cannot copy a FIFO pipe: ${src}`)
throw new Error(`Unknown file: ${src}`)
}
function onFile (srcStat, destStat, src, dest, opts) {
if (!destStat) return copyFile(srcStat, src, dest, opts)
return mayCopyFile(srcStat, src, dest, opts)
}
function mayCopyFile (srcStat, src, dest, opts) {
if (opts.overwrite) {
fs.unlinkSync(dest)
return copyFile(srcStat, src, dest, opts)
} else if (opts.errorOnExist) {
throw new Error(`'${dest}' already exists`)
}
}
function copyFile (srcStat, src, dest, opts) {
fs.copyFileSync(src, dest)
if (opts.preserveTimestamps) handleTimestamps(srcStat.mode, src, dest)
return setDestMode(dest, srcStat.mode)
}
function handleTimestamps (srcMode, src, dest) {
// Make sure the file is writable before setting the timestamp
// otherwise open fails with EPERM when invoked with 'r+'
// (through utimes call)
if (fileIsNotWritable(srcMode)) makeFileWritable(dest, srcMode)
return setDestTimestamps(src, dest)
}
function fileIsNotWritable (srcMode) {
return (srcMode & 0o200) === 0
}
function makeFileWritable (dest, srcMode) {
return setDestMode(dest, srcMode | 0o200)
}
function setDestMode (dest, srcMode) {
return fs.chmodSync(dest, srcMode)
}
function setDestTimestamps (src, dest) {
// The initial srcStat.atime cannot be trusted
// because it is modified by the read(2) system call
// (See https://nodejs.org/api/fs.html#fs_stat_time_values)
const updatedSrcStat = fs.statSync(src)
return utimesMillisSync(dest, updatedSrcStat.atime, updatedSrcStat.mtime)
}
function onDir (srcStat, destStat, src, dest, opts) {
if (!destStat) return mkDirAndCopy(srcStat.mode, src, dest, opts)
return copyDir(src, dest, opts)
}
function mkDirAndCopy (srcMode, src, dest, opts) {
fs.mkdirSync(dest)
copyDir(src, dest, opts)
return setDestMode(dest, srcMode)
}
function copyDir (src, dest, opts) {
fs.readdirSync(src).forEach(item => copyDirItem(item, src, dest, opts))
}
function copyDirItem (item, src, dest, opts) {
const srcItem = path.join(src, item)
const destItem = path.join(dest, item)
const { destStat } = stat.checkPathsSync(srcItem, destItem, 'copy', opts)
return startCopy(destStat, srcItem, destItem, opts)
}
function onLink (destStat, src, dest, opts) {
let resolvedSrc = fs.readlinkSync(src)
if (opts.dereference) {
resolvedSrc = path.resolve(process.cwd(), resolvedSrc)
}
if (!destStat) {
return fs.symlinkSync(resolvedSrc, dest)
} else {
let resolvedDest
try {
resolvedDest = fs.readlinkSync(dest)
} catch (err) {
// dest exists and is a regular file or directory,
// Windows may throw UNKNOWN error. If dest already exists,
// fs throws error anyway, so no need to guard against it here.
if (err.code === 'EINVAL' || err.code === 'UNKNOWN') return fs.symlinkSync(resolvedSrc, dest)
throw err
}
if (opts.dereference) {
resolvedDest = path.resolve(process.cwd(), resolvedDest)
}
if (stat.isSrcSubdir(resolvedSrc, resolvedDest)) {
throw new Error(`Cannot copy '${resolvedSrc}' to a subdirectory of itself, '${resolvedDest}'.`)
}
// prevent copy if src is a subdir of dest since unlinking
// dest in this case would result in removing src contents
// and therefore a broken symlink would be created.
if (fs.statSync(dest).isDirectory() && stat.isSrcSubdir(resolvedDest, resolvedSrc)) {
throw new Error(`Cannot overwrite '${resolvedDest}' with '${resolvedSrc}'.`)
}
return copyLink(resolvedSrc, dest)
}
}
function copyLink (resolvedSrc, dest) {
fs.unlinkSync(dest)
return fs.symlinkSync(resolvedSrc, dest)
}
module.exports = copySync

View File

@@ -0,0 +1,235 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const mkdirs = require('../mkdirs').mkdirs
const pathExists = require('../path-exists').pathExists
const utimesMillis = require('../util/utimes').utimesMillis
const stat = require('../util/stat')
function copy (src, dest, opts, cb) {
if (typeof opts === 'function' && !cb) {
cb = opts
opts = {}
} else if (typeof opts === 'function') {
opts = { filter: opts }
}
cb = cb || function () {}
opts = opts || {}
opts.clobber = 'clobber' in opts ? !!opts.clobber : true // default to true for now
opts.overwrite = 'overwrite' in opts ? !!opts.overwrite : opts.clobber // overwrite falls back to clobber
// Warn about using preserveTimestamps on 32-bit node
if (opts.preserveTimestamps && process.arch === 'ia32') {
process.emitWarning(
'Using the preserveTimestamps option in 32-bit node is not recommended;\n\n' +
'\tsee https://github.com/jprichardson/node-fs-extra/issues/269',
'Warning', 'fs-extra-WARN0001'
)
}
stat.checkPaths(src, dest, 'copy', opts, (err, stats) => {
if (err) return cb(err)
const { srcStat, destStat } = stats
stat.checkParentPaths(src, srcStat, dest, 'copy', err => {
if (err) return cb(err)
if (opts.filter) return handleFilter(checkParentDir, destStat, src, dest, opts, cb)
return checkParentDir(destStat, src, dest, opts, cb)
})
})
}
function checkParentDir (destStat, src, dest, opts, cb) {
const destParent = path.dirname(dest)
pathExists(destParent, (err, dirExists) => {
if (err) return cb(err)
if (dirExists) return getStats(destStat, src, dest, opts, cb)
mkdirs(destParent, err => {
if (err) return cb(err)
return getStats(destStat, src, dest, opts, cb)
})
})
}
function handleFilter (onInclude, destStat, src, dest, opts, cb) {
Promise.resolve(opts.filter(src, dest)).then(include => {
if (include) return onInclude(destStat, src, dest, opts, cb)
return cb()
}, error => cb(error))
}
function startCopy (destStat, src, dest, opts, cb) {
if (opts.filter) return handleFilter(getStats, destStat, src, dest, opts, cb)
return getStats(destStat, src, dest, opts, cb)
}
function getStats (destStat, src, dest, opts, cb) {
const stat = opts.dereference ? fs.stat : fs.lstat
stat(src, (err, srcStat) => {
if (err) return cb(err)
if (srcStat.isDirectory()) return onDir(srcStat, destStat, src, dest, opts, cb)
else if (srcStat.isFile() ||
srcStat.isCharacterDevice() ||
srcStat.isBlockDevice()) return onFile(srcStat, destStat, src, dest, opts, cb)
else if (srcStat.isSymbolicLink()) return onLink(destStat, src, dest, opts, cb)
else if (srcStat.isSocket()) return cb(new Error(`Cannot copy a socket file: ${src}`))
else if (srcStat.isFIFO()) return cb(new Error(`Cannot copy a FIFO pipe: ${src}`))
return cb(new Error(`Unknown file: ${src}`))
})
}
function onFile (srcStat, destStat, src, dest, opts, cb) {
if (!destStat) return copyFile(srcStat, src, dest, opts, cb)
return mayCopyFile(srcStat, src, dest, opts, cb)
}
function mayCopyFile (srcStat, src, dest, opts, cb) {
if (opts.overwrite) {
fs.unlink(dest, err => {
if (err) return cb(err)
return copyFile(srcStat, src, dest, opts, cb)
})
} else if (opts.errorOnExist) {
return cb(new Error(`'${dest}' already exists`))
} else return cb()
}
function copyFile (srcStat, src, dest, opts, cb) {
fs.copyFile(src, dest, err => {
if (err) return cb(err)
if (opts.preserveTimestamps) return handleTimestampsAndMode(srcStat.mode, src, dest, cb)
return setDestMode(dest, srcStat.mode, cb)
})
}
function handleTimestampsAndMode (srcMode, src, dest, cb) {
// Make sure the file is writable before setting the timestamp
// otherwise open fails with EPERM when invoked with 'r+'
// (through utimes call)
if (fileIsNotWritable(srcMode)) {
return makeFileWritable(dest, srcMode, err => {
if (err) return cb(err)
return setDestTimestampsAndMode(srcMode, src, dest, cb)
})
}
return setDestTimestampsAndMode(srcMode, src, dest, cb)
}
function fileIsNotWritable (srcMode) {
return (srcMode & 0o200) === 0
}
function makeFileWritable (dest, srcMode, cb) {
return setDestMode(dest, srcMode | 0o200, cb)
}
function setDestTimestampsAndMode (srcMode, src, dest, cb) {
setDestTimestamps(src, dest, err => {
if (err) return cb(err)
return setDestMode(dest, srcMode, cb)
})
}
function setDestMode (dest, srcMode, cb) {
return fs.chmod(dest, srcMode, cb)
}
function setDestTimestamps (src, dest, cb) {
// The initial srcStat.atime cannot be trusted
// because it is modified by the read(2) system call
// (See https://nodejs.org/api/fs.html#fs_stat_time_values)
fs.stat(src, (err, updatedSrcStat) => {
if (err) return cb(err)
return utimesMillis(dest, updatedSrcStat.atime, updatedSrcStat.mtime, cb)
})
}
function onDir (srcStat, destStat, src, dest, opts, cb) {
if (!destStat) return mkDirAndCopy(srcStat.mode, src, dest, opts, cb)
return copyDir(src, dest, opts, cb)
}
function mkDirAndCopy (srcMode, src, dest, opts, cb) {
fs.mkdir(dest, err => {
if (err) return cb(err)
copyDir(src, dest, opts, err => {
if (err) return cb(err)
return setDestMode(dest, srcMode, cb)
})
})
}
function copyDir (src, dest, opts, cb) {
fs.readdir(src, (err, items) => {
if (err) return cb(err)
return copyDirItems(items, src, dest, opts, cb)
})
}
function copyDirItems (items, src, dest, opts, cb) {
const item = items.pop()
if (!item) return cb()
return copyDirItem(items, item, src, dest, opts, cb)
}
function copyDirItem (items, item, src, dest, opts, cb) {
const srcItem = path.join(src, item)
const destItem = path.join(dest, item)
stat.checkPaths(srcItem, destItem, 'copy', opts, (err, stats) => {
if (err) return cb(err)
const { destStat } = stats
startCopy(destStat, srcItem, destItem, opts, err => {
if (err) return cb(err)
return copyDirItems(items, src, dest, opts, cb)
})
})
}
function onLink (destStat, src, dest, opts, cb) {
fs.readlink(src, (err, resolvedSrc) => {
if (err) return cb(err)
if (opts.dereference) {
resolvedSrc = path.resolve(process.cwd(), resolvedSrc)
}
if (!destStat) {
return fs.symlink(resolvedSrc, dest, cb)
} else {
fs.readlink(dest, (err, resolvedDest) => {
if (err) {
// dest exists and is a regular file or directory,
// Windows may throw UNKNOWN error. If dest already exists,
// fs throws error anyway, so no need to guard against it here.
if (err.code === 'EINVAL' || err.code === 'UNKNOWN') return fs.symlink(resolvedSrc, dest, cb)
return cb(err)
}
if (opts.dereference) {
resolvedDest = path.resolve(process.cwd(), resolvedDest)
}
if (stat.isSrcSubdir(resolvedSrc, resolvedDest)) {
return cb(new Error(`Cannot copy '${resolvedSrc}' to a subdirectory of itself, '${resolvedDest}'.`))
}
// do not copy if src is a subdir of dest since unlinking
// dest in this case would result in removing src contents
// and therefore a broken symlink would be created.
if (destStat.isDirectory() && stat.isSrcSubdir(resolvedDest, resolvedSrc)) {
return cb(new Error(`Cannot overwrite '${resolvedDest}' with '${resolvedSrc}'.`))
}
return copyLink(resolvedSrc, dest, cb)
})
}
})
}
function copyLink (resolvedSrc, dest, cb) {
fs.unlink(dest, err => {
if (err) return cb(err)
return fs.symlink(resolvedSrc, dest, cb)
})
}
module.exports = copy

View File

@@ -0,0 +1,7 @@
'use strict'
const u = require('universalify').fromCallback
module.exports = {
  copy: u(require('./copy')),
  copySync: require('./copy-sync')
}

View File

@@ -0,0 +1,39 @@
'use strict'
const u = require('universalify').fromPromise
const fs = require('../fs')
const path = require('path')
const mkdir = require('../mkdirs')
const remove = require('../remove')
const emptyDir = u(async function emptyDir (dir) {
  let items
  try {
    items = await fs.readdir(dir)
  } catch {
    return mkdir.mkdirs(dir)
  }
  return Promise.all(items.map(item => remove.remove(path.join(dir, item))))
})
function emptyDirSync (dir) {
  let items
  try {
    items = fs.readdirSync(dir)
  } catch {
    return mkdir.mkdirsSync(dir)
  }
  items.forEach(item => {
    item = path.join(dir, item)
    remove.removeSync(item)
  })
}
module.exports = {
  emptyDirSync,
  emptydirSync: emptyDirSync,
  emptyDir,
  emptydir: emptyDir
}

View File

@@ -0,0 +1,69 @@
'use strict'
const u = require('universalify').fromCallback
const path = require('path')
const fs = require('graceful-fs')
const mkdir = require('../mkdirs')
function createFile (file, callback) {
function makeFile () {
fs.writeFile(file, '', err => {
if (err) return callback(err)
callback()
})
}
fs.stat(file, (err, stats) => { // eslint-disable-line handle-callback-err
if (!err && stats.isFile()) return callback()
const dir = path.dirname(file)
fs.stat(dir, (err, stats) => {
if (err) {
// if the directory doesn't exist, make it
if (err.code === 'ENOENT') {
return mkdir.mkdirs(dir, err => {
if (err) return callback(err)
makeFile()
})
}
return callback(err)
}
if (stats.isDirectory()) makeFile()
else {
// parent is not a directory
// This is just to cause an internal ENOTDIR error to be thrown
fs.readdir(dir, err => {
if (err) return callback(err)
})
}
})
})
}
function createFileSync (file) {
let stats
try {
stats = fs.statSync(file)
} catch {}
if (stats && stats.isFile()) return
const dir = path.dirname(file)
try {
if (!fs.statSync(dir).isDirectory()) {
// parent is not a directory
// This is just to cause an internal ENOTDIR error to be thrown
fs.readdirSync(dir)
}
} catch (err) {
// If the stat call above failed because the directory doesn't exist, create it
if (err && err.code === 'ENOENT') mkdir.mkdirsSync(dir)
else throw err
}
fs.writeFileSync(file, '')
}
module.exports = {
createFile: u(createFile),
createFileSync
}

View File

@@ -0,0 +1,23 @@
'use strict'
const { createFile, createFileSync } = require('./file')
const { createLink, createLinkSync } = require('./link')
const { createSymlink, createSymlinkSync } = require('./symlink')
module.exports = {
  // file
  createFile,
  createFileSync,
  ensureFile: createFile,
  ensureFileSync: createFileSync,
  // link
  createLink,
  createLinkSync,
  ensureLink: createLink,
  ensureLinkSync: createLinkSync,
  // symlink
  createSymlink,
  createSymlinkSync,
  ensureSymlink: createSymlink,
  ensureSymlinkSync: createSymlinkSync
}

View File

@@ -0,0 +1,64 @@
'use strict'
const u = require('universalify').fromCallback
const path = require('path')
const fs = require('graceful-fs')
const mkdir = require('../mkdirs')
const pathExists = require('../path-exists').pathExists
const { areIdentical } = require('../util/stat')
function createLink (srcpath, dstpath, callback) {
function makeLink (srcpath, dstpath) {
fs.link(srcpath, dstpath, err => {
if (err) return callback(err)
callback(null)
})
}
fs.lstat(dstpath, (_, dstStat) => {
fs.lstat(srcpath, (err, srcStat) => {
if (err) {
err.message = err.message.replace('lstat', 'ensureLink')
return callback(err)
}
if (dstStat && areIdentical(srcStat, dstStat)) return callback(null)
const dir = path.dirname(dstpath)
pathExists(dir, (err, dirExists) => {
if (err) return callback(err)
if (dirExists) return makeLink(srcpath, dstpath)
mkdir.mkdirs(dir, err => {
if (err) return callback(err)
makeLink(srcpath, dstpath)
})
})
})
})
}
function createLinkSync (srcpath, dstpath) {
let dstStat
try {
dstStat = fs.lstatSync(dstpath)
} catch {}
try {
const srcStat = fs.lstatSync(srcpath)
if (dstStat && areIdentical(srcStat, dstStat)) return
} catch (err) {
err.message = err.message.replace('lstat', 'ensureLink')
throw err
}
const dir = path.dirname(dstpath)
const dirExists = fs.existsSync(dir)
if (dirExists) return fs.linkSync(srcpath, dstpath)
mkdir.mkdirsSync(dir)
return fs.linkSync(srcpath, dstpath)
}
module.exports = {
createLink: u(createLink),
createLinkSync
}

View File

@@ -0,0 +1,99 @@
'use strict'
const path = require('path')
const fs = require('graceful-fs')
const pathExists = require('../path-exists').pathExists
/**
* Function that returns two types of paths, one relative to symlink, and one
* relative to the current working directory. Checks if path is absolute or
* relative. If the path is relative, this function checks if the path is
* relative to symlink or relative to current working directory. This is an
* initiative to find a smarter `srcpath` to supply when building symlinks.
* This allows you to determine which path to use out of one of three possible
* types of source paths. The first is an absolute path. This is detected by
* `path.isAbsolute()`. When an absolute path is provided, it is checked to
see if it exists. If it does, it is used; if not, an error is returned
(callback) / thrown (sync). The other two options for `srcpath` are a
* relative url. By default Node's `fs.symlink` works by creating a symlink
* using `dstpath` and expects the `srcpath` to be relative to the newly
* created symlink. If you provide a `srcpath` that does not exist on the file
* system it results in a broken symlink. To minimize this, the function
* checks to see if the 'relative to symlink' source file exists, and if it
does, it will use it. If it does not, it checks whether a file exists
relative to the current working directory; if so, that one is used.
This preserves the expectations of the original fs.symlink spec and adds
the ability to pass in `relative to current working directory` paths.
*/
function symlinkPaths (srcpath, dstpath, callback) {
if (path.isAbsolute(srcpath)) {
return fs.lstat(srcpath, (err) => {
if (err) {
err.message = err.message.replace('lstat', 'ensureSymlink')
return callback(err)
}
return callback(null, {
toCwd: srcpath,
toDst: srcpath
})
})
} else {
const dstdir = path.dirname(dstpath)
const relativeToDst = path.join(dstdir, srcpath)
return pathExists(relativeToDst, (err, exists) => {
if (err) return callback(err)
if (exists) {
return callback(null, {
toCwd: relativeToDst,
toDst: srcpath
})
} else {
return fs.lstat(srcpath, (err) => {
if (err) {
err.message = err.message.replace('lstat', 'ensureSymlink')
return callback(err)
}
return callback(null, {
toCwd: srcpath,
toDst: path.relative(dstdir, srcpath)
})
})
}
})
}
}
function symlinkPathsSync (srcpath, dstpath) {
let exists
if (path.isAbsolute(srcpath)) {
exists = fs.existsSync(srcpath)
if (!exists) throw new Error('absolute srcpath does not exist')
return {
toCwd: srcpath,
toDst: srcpath
}
} else {
const dstdir = path.dirname(dstpath)
const relativeToDst = path.join(dstdir, srcpath)
exists = fs.existsSync(relativeToDst)
if (exists) {
return {
toCwd: relativeToDst,
toDst: srcpath
}
} else {
exists = fs.existsSync(srcpath)
if (!exists) throw new Error('relative srcpath does not exist')
return {
toCwd: srcpath,
toDst: path.relative(dstdir, srcpath)
}
}
}
}
module.exports = {
symlinkPaths,
symlinkPathsSync
}

View File

@@ -0,0 +1,31 @@
'use strict'
const fs = require('graceful-fs')
function symlinkType (srcpath, type, callback) {
callback = (typeof type === 'function') ? type : callback
type = (typeof type === 'function') ? false : type
if (type) return callback(null, type)
fs.lstat(srcpath, (err, stats) => {
if (err) return callback(null, 'file')
type = (stats && stats.isDirectory()) ? 'dir' : 'file'
callback(null, type)
})
}
function symlinkTypeSync (srcpath, type) {
let stats
if (type) return type
try {
stats = fs.lstatSync(srcpath)
} catch {
return 'file'
}
return (stats && stats.isDirectory()) ? 'dir' : 'file'
}
module.exports = {
symlinkType,
symlinkTypeSync
}

View File

@@ -0,0 +1,82 @@
'use strict'
const u = require('universalify').fromCallback
const path = require('path')
const fs = require('../fs')
const _mkdirs = require('../mkdirs')
const mkdirs = _mkdirs.mkdirs
const mkdirsSync = _mkdirs.mkdirsSync
const _symlinkPaths = require('./symlink-paths')
const symlinkPaths = _symlinkPaths.symlinkPaths
const symlinkPathsSync = _symlinkPaths.symlinkPathsSync
const _symlinkType = require('./symlink-type')
const symlinkType = _symlinkType.symlinkType
const symlinkTypeSync = _symlinkType.symlinkTypeSync
const pathExists = require('../path-exists').pathExists
const { areIdentical } = require('../util/stat')
function createSymlink (srcpath, dstpath, type, callback) {
callback = (typeof type === 'function') ? type : callback
type = (typeof type === 'function') ? false : type
fs.lstat(dstpath, (err, stats) => {
if (!err && stats.isSymbolicLink()) {
Promise.all([
fs.stat(srcpath),
fs.stat(dstpath)
]).then(([srcStat, dstStat]) => {
if (areIdentical(srcStat, dstStat)) return callback(null)
_createSymlink(srcpath, dstpath, type, callback)
})
} else _createSymlink(srcpath, dstpath, type, callback)
})
}
function _createSymlink (srcpath, dstpath, type, callback) {
symlinkPaths(srcpath, dstpath, (err, relative) => {
if (err) return callback(err)
srcpath = relative.toDst
symlinkType(relative.toCwd, type, (err, type) => {
if (err) return callback(err)
const dir = path.dirname(dstpath)
pathExists(dir, (err, dirExists) => {
if (err) return callback(err)
if (dirExists) return fs.symlink(srcpath, dstpath, type, callback)
mkdirs(dir, err => {
if (err) return callback(err)
fs.symlink(srcpath, dstpath, type, callback)
})
})
})
})
}
function createSymlinkSync (srcpath, dstpath, type) {
let stats
try {
stats = fs.lstatSync(dstpath)
} catch {}
if (stats && stats.isSymbolicLink()) {
const srcStat = fs.statSync(srcpath)
const dstStat = fs.statSync(dstpath)
if (areIdentical(srcStat, dstStat)) return
}
const relative = symlinkPathsSync(srcpath, dstpath)
srcpath = relative.toDst
type = symlinkTypeSync(relative.toCwd, type)
const dir = path.dirname(dstpath)
const exists = fs.existsSync(dir)
if (exists) return fs.symlinkSync(srcpath, dstpath, type)
mkdirsSync(dir)
return fs.symlinkSync(srcpath, dstpath, type)
}
module.exports = {
createSymlink: u(createSymlink),
createSymlinkSync
}

View File

@@ -0,0 +1,128 @@
'use strict'
// This is adapted from https://github.com/normalize/mz
// Copyright (c) 2014-2016 Jonathan Ong me@jongleberry.com and Contributors
const u = require('universalify').fromCallback
const fs = require('graceful-fs')
const api = [
'access',
'appendFile',
'chmod',
'chown',
'close',
'copyFile',
'fchmod',
'fchown',
'fdatasync',
'fstat',
'fsync',
'ftruncate',
'futimes',
'lchmod',
'lchown',
'link',
'lstat',
'mkdir',
'mkdtemp',
'open',
'opendir',
'readdir',
'readFile',
'readlink',
'realpath',
'rename',
'rm',
'rmdir',
'stat',
'symlink',
'truncate',
'unlink',
'utimes',
'writeFile'
].filter(key => {
// Some commands are not available on some systems. Ex:
// fs.opendir was added in Node.js v12.12.0
// fs.rm was added in Node.js v14.14.0
// fs.lchown is not available on at least some Linux
return typeof fs[key] === 'function'
})
// Export cloned fs:
Object.assign(exports, fs)
// Universalify async methods:
api.forEach(method => {
exports[method] = u(fs[method])
})
// We differ from mz/fs in that we still ship the old, broken, fs.exists()
// since we are a drop-in replacement for the native module
exports.exists = function (filename, callback) {
if (typeof callback === 'function') {
return fs.exists(filename, callback)
}
return new Promise(resolve => {
return fs.exists(filename, resolve)
})
}
// fs.read(), fs.write(), & fs.writev() need special treatment due to multiple callback args
exports.read = function (fd, buffer, offset, length, position, callback) {
if (typeof callback === 'function') {
return fs.read(fd, buffer, offset, length, position, callback)
}
return new Promise((resolve, reject) => {
fs.read(fd, buffer, offset, length, position, (err, bytesRead, buffer) => {
if (err) return reject(err)
resolve({ bytesRead, buffer })
})
})
}
// Function signature can be
// fs.write(fd, buffer[, offset[, length[, position]]], callback)
// OR
// fs.write(fd, string[, position[, encoding]], callback)
// We need to handle both cases, so we use ...args
exports.write = function (fd, buffer, ...args) {
if (typeof args[args.length - 1] === 'function') {
return fs.write(fd, buffer, ...args)
}
return new Promise((resolve, reject) => {
fs.write(fd, buffer, ...args, (err, bytesWritten, buffer) => {
if (err) return reject(err)
resolve({ bytesWritten, buffer })
})
})
}
// fs.writev only available in Node v12.9.0+
if (typeof fs.writev === 'function') {
// Function signature is
// fs.writev(fd, buffers[, position], callback)
// We need to handle the optional arg, so we use ...args
exports.writev = function (fd, buffers, ...args) {
if (typeof args[args.length - 1] === 'function') {
return fs.writev(fd, buffers, ...args)
}
return new Promise((resolve, reject) => {
fs.writev(fd, buffers, ...args, (err, bytesWritten, buffers) => {
if (err) return reject(err)
resolve({ bytesWritten, buffers })
})
})
}
}
// fs.realpath.native sometimes not available if fs is monkey-patched
if (typeof fs.realpath.native === 'function') {
exports.realpath.native = u(fs.realpath.native)
} else {
process.emitWarning(
'fs.realpath.native is not a function. Is fs being monkey-patched?',
'Warning', 'fs-extra-WARN0003'
)
}

View File

@@ -0,0 +1,16 @@
'use strict'
module.exports = {
  // Export promisified graceful-fs:
  ...require('./fs'),
  // Export extra methods:
  ...require('./copy'),
  ...require('./empty'),
  ...require('./ensure'),
  ...require('./json'),
  ...require('./mkdirs'),
  ...require('./move'),
  ...require('./output-file'),
  ...require('./path-exists'),
  ...require('./remove')
}

View File

@@ -0,0 +1,16 @@
'use strict'
const u = require('universalify').fromPromise
const jsonFile = require('./jsonfile')
jsonFile.outputJson = u(require('./output-json'))
jsonFile.outputJsonSync = require('./output-json-sync')
// aliases
jsonFile.outputJSON = jsonFile.outputJson
jsonFile.outputJSONSync = jsonFile.outputJsonSync
jsonFile.writeJSON = jsonFile.writeJson
jsonFile.writeJSONSync = jsonFile.writeJsonSync
jsonFile.readJSON = jsonFile.readJson
jsonFile.readJSONSync = jsonFile.readJsonSync
module.exports = jsonFile

View File

@@ -0,0 +1,11 @@
'use strict'
const jsonFile = require('jsonfile')
module.exports = {
  // jsonfile exports
  readJson: jsonFile.readFile,
  readJsonSync: jsonFile.readFileSync,
  writeJson: jsonFile.writeFile,
  writeJsonSync: jsonFile.writeFileSync
}

View File

@@ -0,0 +1,12 @@
'use strict'
const { stringify } = require('jsonfile/utils')
const { outputFileSync } = require('../output-file')
function outputJsonSync (file, data, options) {
  const str = stringify(data, options)
  outputFileSync(file, str, options)
}
module.exports = outputJsonSync

View File

@@ -0,0 +1,12 @@
'use strict'
const { stringify } = require('jsonfile/utils')
const { outputFile } = require('../output-file')
async function outputJson (file, data, options = {}) {
  const str = stringify(data, options)
  await outputFile(file, str, options)
}
module.exports = outputJson

View File

@@ -0,0 +1,14 @@
'use strict'
const u = require('universalify').fromPromise
const { makeDir: _makeDir, makeDirSync } = require('./make-dir')
const makeDir = u(_makeDir)
module.exports = {
  mkdirs: makeDir,
  mkdirsSync: makeDirSync,
  // alias
  mkdirp: makeDir,
  mkdirpSync: makeDirSync,
  ensureDir: makeDir,
  ensureDirSync: makeDirSync
}

View File

@@ -0,0 +1,27 @@
'use strict'
const fs = require('../fs')
const { checkPath } = require('./utils')
const getMode = options => {
  const defaults = { mode: 0o777 }
  if (typeof options === 'number') return options
  return ({ ...defaults, ...options }).mode
}
module.exports.makeDir = async (dir, options) => {
  checkPath(dir)
  return fs.mkdir(dir, {
    mode: getMode(options),
    recursive: true
  })
}
module.exports.makeDirSync = (dir, options) => {
  checkPath(dir)
  return fs.mkdirSync(dir, {
    mode: getMode(options),
    recursive: true
  })
}

View File

@@ -0,0 +1,21 @@
// Adapted from https://github.com/sindresorhus/make-dir
// Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
// Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
'use strict'
const path = require('path')
// https://github.com/nodejs/node/issues/8987
// https://github.com/libuv/libuv/pull/1088
module.exports.checkPath = function checkPath (pth) {
  if (process.platform === 'win32') {
    const pathHasInvalidWinCharacters = /[<>:"|?*]/.test(pth.replace(path.parse(pth).root, ''))
    if (pathHasInvalidWinCharacters) {
      const error = new Error(`Path contains invalid characters: ${pth}`)
      error.code = 'EINVAL'
      throw error
    }
  }
}

View File

@@ -0,0 +1,7 @@
'use strict'
const u = require('universalify').fromCallback
module.exports = {
  move: u(require('./move')),
  moveSync: require('./move-sync')
}

View File

@@ -0,0 +1,54 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const copySync = require('../copy').copySync
const removeSync = require('../remove').removeSync
const mkdirpSync = require('../mkdirs').mkdirpSync
const stat = require('../util/stat')
function moveSync (src, dest, opts) {
opts = opts || {}
const overwrite = opts.overwrite || opts.clobber || false
const { srcStat, isChangingCase = false } = stat.checkPathsSync(src, dest, 'move', opts)
stat.checkParentPathsSync(src, srcStat, dest, 'move')
if (!isParentRoot(dest)) mkdirpSync(path.dirname(dest))
return doRename(src, dest, overwrite, isChangingCase)
}
function isParentRoot (dest) {
const parent = path.dirname(dest)
const parsedPath = path.parse(parent)
return parsedPath.root === parent
}
function doRename (src, dest, overwrite, isChangingCase) {
if (isChangingCase) return rename(src, dest, overwrite)
if (overwrite) {
removeSync(dest)
return rename(src, dest, overwrite)
}
if (fs.existsSync(dest)) throw new Error('dest already exists.')
return rename(src, dest, overwrite)
}
function rename (src, dest, overwrite) {
try {
fs.renameSync(src, dest)
} catch (err) {
if (err.code !== 'EXDEV') throw err
return moveAcrossDevice(src, dest, overwrite)
}
}
function moveAcrossDevice (src, dest, overwrite) {
const opts = {
overwrite,
errorOnExist: true
}
copySync(src, dest, opts)
return removeSync(src)
}
module.exports = moveSync

View File

@@ -0,0 +1,75 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const copy = require('../copy').copy
const remove = require('../remove').remove
const mkdirp = require('../mkdirs').mkdirp
const pathExists = require('../path-exists').pathExists
const stat = require('../util/stat')
function move (src, dest, opts, cb) {
if (typeof opts === 'function') {
cb = opts
opts = {}
}
opts = opts || {}
const overwrite = opts.overwrite || opts.clobber || false
stat.checkPaths(src, dest, 'move', opts, (err, stats) => {
if (err) return cb(err)
const { srcStat, isChangingCase = false } = stats
stat.checkParentPaths(src, srcStat, dest, 'move', err => {
if (err) return cb(err)
if (isParentRoot(dest)) return doRename(src, dest, overwrite, isChangingCase, cb)
mkdirp(path.dirname(dest), err => {
if (err) return cb(err)
return doRename(src, dest, overwrite, isChangingCase, cb)
})
})
})
}
function isParentRoot (dest) {
const parent = path.dirname(dest)
const parsedPath = path.parse(parent)
return parsedPath.root === parent
}
function doRename (src, dest, overwrite, isChangingCase, cb) {
if (isChangingCase) return rename(src, dest, overwrite, cb)
if (overwrite) {
return remove(dest, err => {
if (err) return cb(err)
return rename(src, dest, overwrite, cb)
})
}
pathExists(dest, (err, destExists) => {
if (err) return cb(err)
if (destExists) return cb(new Error('dest already exists.'))
return rename(src, dest, overwrite, cb)
})
}
function rename (src, dest, overwrite, cb) {
fs.rename(src, dest, err => {
if (!err) return cb()
if (err.code !== 'EXDEV') return cb(err)
return moveAcrossDevice(src, dest, overwrite, cb)
})
}
function moveAcrossDevice (src, dest, overwrite, cb) {
const opts = {
overwrite,
errorOnExist: true
}
copy(src, dest, opts, err => {
if (err) return cb(err)
return remove(src, cb)
})
}
module.exports = move

View File

@@ -0,0 +1,40 @@
'use strict'
const u = require('universalify').fromCallback
const fs = require('graceful-fs')
const path = require('path')
const mkdir = require('../mkdirs')
const pathExists = require('../path-exists').pathExists
function outputFile (file, data, encoding, callback) {
if (typeof encoding === 'function') {
callback = encoding
encoding = 'utf8'
}
const dir = path.dirname(file)
pathExists(dir, (err, itDoes) => {
if (err) return callback(err)
if (itDoes) return fs.writeFile(file, data, encoding, callback)
mkdir.mkdirs(dir, err => {
if (err) return callback(err)
fs.writeFile(file, data, encoding, callback)
})
})
}
function outputFileSync (file, ...args) {
const dir = path.dirname(file)
if (fs.existsSync(dir)) {
return fs.writeFileSync(file, ...args)
}
mkdir.mkdirsSync(dir)
fs.writeFileSync(file, ...args)
}
module.exports = {
outputFile: u(outputFile),
outputFileSync
}

View File

@@ -0,0 +1,12 @@
'use strict'
const u = require('universalify').fromPromise
const fs = require('../fs')
function pathExists (path) {
  return fs.access(path).then(() => true).catch(() => false)
}
module.exports = {
  pathExists: u(pathExists),
  pathExistsSync: fs.existsSync
}

View File

@@ -0,0 +1,22 @@
'use strict'
const fs = require('graceful-fs')
const u = require('universalify').fromCallback
const rimraf = require('./rimraf')
function remove (path, callback) {
  // Node 14.14.0+
  if (fs.rm) return fs.rm(path, { recursive: true, force: true }, callback)
  rimraf(path, callback)
}
function removeSync (path) {
  // Node 14.14.0+
  if (fs.rmSync) return fs.rmSync(path, { recursive: true, force: true })
  rimraf.sync(path)
}
module.exports = {
  remove: u(remove),
  removeSync
}

View File

@@ -0,0 +1,302 @@
'use strict'
const fs = require('graceful-fs')
const path = require('path')
const assert = require('assert')
const isWindows = (process.platform === 'win32')
function defaults (options) {
const methods = [
'unlink',
'chmod',
'stat',
'lstat',
'rmdir',
'readdir'
]
methods.forEach(m => {
options[m] = options[m] || fs[m]
m = m + 'Sync'
options[m] = options[m] || fs[m]
})
options.maxBusyTries = options.maxBusyTries || 3
}
function rimraf (p, options, cb) {
let busyTries = 0
if (typeof options === 'function') {
cb = options
options = {}
}
assert(p, 'rimraf: missing path')
assert.strictEqual(typeof p, 'string', 'rimraf: path should be a string')
assert.strictEqual(typeof cb, 'function', 'rimraf: callback function required')
assert(options, 'rimraf: invalid options argument provided')
assert.strictEqual(typeof options, 'object', 'rimraf: options should be object')
defaults(options)
rimraf_(p, options, function CB (er) {
if (er) {
if ((er.code === 'EBUSY' || er.code === 'ENOTEMPTY' || er.code === 'EPERM') &&
busyTries < options.maxBusyTries) {
busyTries++
const time = busyTries * 100
// try again, with the same exact callback as this one.
return setTimeout(() => rimraf_(p, options, CB), time)
}
// already gone
if (er.code === 'ENOENT') er = null
}
cb(er)
})
}
// Two possible strategies.
// 1. Assume it's a file. unlink it, then do the dir stuff on EPERM or EISDIR
// 2. Assume it's a directory. readdir, then do the file stuff on ENOTDIR
//
// Both result in an extra syscall when you guess wrong. However, there
// are likely far more normal files in the world than directories. This
// is based on the assumption that the average number of files per
// directory is >= 1.
//
// If anyone ever complains about this, then I guess the strategy could
// be made configurable somehow. But until then, YAGNI.
function rimraf_ (p, options, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
// sunos lets the root user unlink directories, which is... weird.
// so we have to lstat here and make sure it's not a dir.
options.lstat(p, (er, st) => {
if (er && er.code === 'ENOENT') {
return cb(null)
}
// Windows can EPERM on stat. Life is suffering.
if (er && er.code === 'EPERM' && isWindows) {
return fixWinEPERM(p, options, er, cb)
}
if (st && st.isDirectory()) {
return rmdir(p, options, er, cb)
}
options.unlink(p, er => {
if (er) {
if (er.code === 'ENOENT') {
return cb(null)
}
if (er.code === 'EPERM') {
return (isWindows)
? fixWinEPERM(p, options, er, cb)
: rmdir(p, options, er, cb)
}
if (er.code === 'EISDIR') {
return rmdir(p, options, er, cb)
}
}
return cb(er)
})
})
}
function fixWinEPERM (p, options, er, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
options.chmod(p, 0o666, er2 => {
if (er2) {
cb(er2.code === 'ENOENT' ? null : er)
} else {
options.stat(p, (er3, stats) => {
if (er3) {
cb(er3.code === 'ENOENT' ? null : er)
} else if (stats.isDirectory()) {
rmdir(p, options, er, cb)
} else {
options.unlink(p, cb)
}
})
}
})
}
function fixWinEPERMSync (p, options, er) {
let stats
assert(p)
assert(options)
try {
options.chmodSync(p, 0o666)
} catch (er2) {
if (er2.code === 'ENOENT') {
return
} else {
throw er
}
}
try {
stats = options.statSync(p)
} catch (er3) {
if (er3.code === 'ENOENT') {
return
} else {
throw er
}
}
if (stats.isDirectory()) {
rmdirSync(p, options, er)
} else {
options.unlinkSync(p)
}
}
function rmdir (p, options, originalEr, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
// try to rmdir first, and only readdir on ENOTEMPTY or EEXIST (SunOS)
// if we guessed wrong, and it's not a directory, then
// raise the original error.
options.rmdir(p, er => {
if (er && (er.code === 'ENOTEMPTY' || er.code === 'EEXIST' || er.code === 'EPERM')) {
rmkids(p, options, cb)
} else if (er && er.code === 'ENOTDIR') {
cb(originalEr)
} else {
cb(er)
}
})
}
function rmkids (p, options, cb) {
assert(p)
assert(options)
assert(typeof cb === 'function')
options.readdir(p, (er, files) => {
if (er) return cb(er)
let n = files.length
let errState
if (n === 0) return options.rmdir(p, cb)
files.forEach(f => {
rimraf(path.join(p, f), options, er => {
if (errState) {
return
}
if (er) return cb(errState = er)
if (--n === 0) {
options.rmdir(p, cb)
}
})
})
})
}
// this looks simpler, and is strictly *faster*, but will
// tie up the JavaScript thread and fail on excessively
// deep directory trees.
function rimrafSync (p, options) {
let st
options = options || {}
defaults(options)
assert(p, 'rimraf: missing path')
assert.strictEqual(typeof p, 'string', 'rimraf: path should be a string')
assert(options, 'rimraf: missing options')
assert.strictEqual(typeof options, 'object', 'rimraf: options should be object')
try {
st = options.lstatSync(p)
} catch (er) {
if (er.code === 'ENOENT') {
return
}
// Windows can EPERM on stat. Life is suffering.
if (er.code === 'EPERM' && isWindows) {
fixWinEPERMSync(p, options, er)
}
}
try {
// sunos lets the root user unlink directories, which is... weird.
if (st && st.isDirectory()) {
rmdirSync(p, options, null)
} else {
options.unlinkSync(p)
}
} catch (er) {
if (er.code === 'ENOENT') {
return
} else if (er.code === 'EPERM') {
return isWindows ? fixWinEPERMSync(p, options, er) : rmdirSync(p, options, er)
} else if (er.code !== 'EISDIR') {
throw er
}
rmdirSync(p, options, er)
}
}
function rmdirSync (p, options, originalEr) {
assert(p)
assert(options)
try {
options.rmdirSync(p)
} catch (er) {
if (er.code === 'ENOTDIR') {
throw originalEr
} else if (er.code === 'ENOTEMPTY' || er.code === 'EEXIST' || er.code === 'EPERM') {
rmkidsSync(p, options)
} else if (er.code !== 'ENOENT') {
throw er
}
}
}
function rmkidsSync (p, options) {
assert(p)
assert(options)
options.readdirSync(p).forEach(f => rimrafSync(path.join(p, f), options))
if (isWindows) {
// We only end up here once we got ENOTEMPTY at least once, and
// at this point, we are guaranteed to have removed all the kids.
// So, we know that it won't be ENOENT or ENOTDIR or anything else.
// try really hard to delete stuff on windows, because it has a
// PROFOUNDLY annoying habit of not closing handles promptly when
// files are deleted, resulting in spurious ENOTEMPTY errors.
const startTime = Date.now()
do {
try {
const ret = options.rmdirSync(p, options)
return ret
} catch {}
} while (Date.now() - startTime < 500) // give up after 500ms
} else {
const ret = options.rmdirSync(p, options)
return ret
}
}
module.exports = rimraf
rimraf.sync = rimrafSync

View File

@@ -0,0 +1,154 @@
'use strict'
const fs = require('../fs')
const path = require('path')
const util = require('util')
function getStats (src, dest, opts) {
const statFunc = opts.dereference
? (file) => fs.stat(file, { bigint: true })
: (file) => fs.lstat(file, { bigint: true })
return Promise.all([
statFunc(src),
statFunc(dest).catch(err => {
if (err.code === 'ENOENT') return null
throw err
})
]).then(([srcStat, destStat]) => ({ srcStat, destStat }))
}
function getStatsSync (src, dest, opts) {
let destStat
const statFunc = opts.dereference
? (file) => fs.statSync(file, { bigint: true })
: (file) => fs.lstatSync(file, { bigint: true })
const srcStat = statFunc(src)
try {
destStat = statFunc(dest)
} catch (err) {
if (err.code === 'ENOENT') return { srcStat, destStat: null }
throw err
}
return { srcStat, destStat }
}
function checkPaths (src, dest, funcName, opts, cb) {
util.callbackify(getStats)(src, dest, opts, (err, stats) => {
if (err) return cb(err)
const { srcStat, destStat } = stats
if (destStat) {
if (areIdentical(srcStat, destStat)) {
const srcBaseName = path.basename(src)
const destBaseName = path.basename(dest)
if (funcName === 'move' &&
srcBaseName !== destBaseName &&
srcBaseName.toLowerCase() === destBaseName.toLowerCase()) {
return cb(null, { srcStat, destStat, isChangingCase: true })
}
return cb(new Error('Source and destination must not be the same.'))
}
if (srcStat.isDirectory() && !destStat.isDirectory()) {
return cb(new Error(`Cannot overwrite non-directory '${dest}' with directory '${src}'.`))
}
if (!srcStat.isDirectory() && destStat.isDirectory()) {
return cb(new Error(`Cannot overwrite directory '${dest}' with non-directory '${src}'.`))
}
}
if (srcStat.isDirectory() && isSrcSubdir(src, dest)) {
return cb(new Error(errMsg(src, dest, funcName)))
}
return cb(null, { srcStat, destStat })
})
}
function checkPathsSync (src, dest, funcName, opts) {
const { srcStat, destStat } = getStatsSync(src, dest, opts)
if (destStat) {
if (areIdentical(srcStat, destStat)) {
const srcBaseName = path.basename(src)
const destBaseName = path.basename(dest)
if (funcName === 'move' &&
srcBaseName !== destBaseName &&
srcBaseName.toLowerCase() === destBaseName.toLowerCase()) {
return { srcStat, destStat, isChangingCase: true }
}
throw new Error('Source and destination must not be the same.')
}
if (srcStat.isDirectory() && !destStat.isDirectory()) {
throw new Error(`Cannot overwrite non-directory '${dest}' with directory '${src}'.`)
}
if (!srcStat.isDirectory() && destStat.isDirectory()) {
throw new Error(`Cannot overwrite directory '${dest}' with non-directory '${src}'.`)
}
}
if (srcStat.isDirectory() && isSrcSubdir(src, dest)) {
throw new Error(errMsg(src, dest, funcName))
}
return { srcStat, destStat }
}
// recursively check if dest parent is a subdirectory of src.
// It works for all file types including symlinks since it
// checks the src and dest inodes. It starts from the deepest
// parent and stops once it reaches the src parent or the root path.
function checkParentPaths (src, srcStat, dest, funcName, cb) {
const srcParent = path.resolve(path.dirname(src))
const destParent = path.resolve(path.dirname(dest))
if (destParent === srcParent || destParent === path.parse(destParent).root) return cb()
fs.stat(destParent, { bigint: true }, (err, destStat) => {
if (err) {
if (err.code === 'ENOENT') return cb()
return cb(err)
}
if (areIdentical(srcStat, destStat)) {
return cb(new Error(errMsg(src, dest, funcName)))
}
return checkParentPaths(src, srcStat, destParent, funcName, cb)
})
}
function checkParentPathsSync (src, srcStat, dest, funcName) {
const srcParent = path.resolve(path.dirname(src))
const destParent = path.resolve(path.dirname(dest))
if (destParent === srcParent || destParent === path.parse(destParent).root) return
let destStat
try {
destStat = fs.statSync(destParent, { bigint: true })
} catch (err) {
if (err.code === 'ENOENT') return
throw err
}
if (areIdentical(srcStat, destStat)) {
throw new Error(errMsg(src, dest, funcName))
}
return checkParentPathsSync(src, srcStat, destParent, funcName)
}
function areIdentical (srcStat, destStat) {
return destStat.ino && destStat.dev && destStat.ino === srcStat.ino && destStat.dev === srcStat.dev
}
// return true if dest is a subdir of src, otherwise false.
// It only checks the path strings.
function isSrcSubdir (src, dest) {
const srcArr = path.resolve(src).split(path.sep).filter(i => i)
const destArr = path.resolve(dest).split(path.sep).filter(i => i)
return srcArr.reduce((acc, cur, i) => acc && destArr[i] === cur, true)
}
function errMsg (src, dest, funcName) {
return `Cannot ${funcName} '${src}' to a subdirectory of itself, '${dest}'.`
}
module.exports = {
checkPaths,
checkPathsSync,
checkParentPaths,
checkParentPathsSync,
isSrcSubdir,
areIdentical
}
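
A quick sketch of the path-string check described in the comments above; the relative require path is only illustrative:

```js
// isSrcSubdir only compares path segments, no filesystem access needed.
const { isSrcSubdir } = require('./stat') // illustrative path inside fs-extra/lib/util

isSrcSubdir('/a/b', '/a/b/c') // true  -> copy/move into a subdirectory of src is rejected
isSrcSubdir('/a/b', '/a/bc')  // false -> '/a/bc' is a sibling, not a subdirectory
isSrcSubdir('/a/b', '/a/b')   // true  -> identical paths also match segment by segment
```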

View File

@@ -0,0 +1,26 @@
'use strict'
const fs = require('graceful-fs')
function utimesMillis (path, atime, mtime, callback) {
// if (!HAS_MILLIS_RES) return fs.utimes(path, atime, mtime, callback)
fs.open(path, 'r+', (err, fd) => {
if (err) return callback(err)
fs.futimes(fd, atime, mtime, futimesErr => {
fs.close(fd, closeErr => {
if (callback) callback(futimesErr || closeErr)
})
})
})
}
function utimesMillisSync (path, atime, mtime) {
const fd = fs.openSync(path, 'r+')
fs.futimesSync(fd, atime, mtime)
return fs.closeSync(fd)
}
module.exports = {
utimesMillis,
utimesMillisSync
}
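
A minimal sketch of how this helper preserves timestamps (fs-extra's copy relies on it when `preserveTimestamps` is set); the require path and file names are illustrative:

```js
const fs = require('graceful-fs')
const { utimesMillis, utimesMillisSync } = require('./utimes') // illustrative path

// Carry the source file's atime/mtime over to the copied file.
const stats = fs.statSync('src.txt')
utimesMillis('dest.txt', stats.atime, stats.mtime, err => {
  if (err) console.error(err)
})

// Or synchronously:
utimesMillisSync('dest.txt', stats.atime, stats.mtime)
```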

View File

@@ -0,0 +1,67 @@
{
"name": "fs-extra",
"version": "10.1.0",
"description": "fs-extra contains methods that aren't included in the vanilla Node.js fs package. Such as recursive mkdir, copy, and remove.",
"engines": {
"node": ">=12"
},
"homepage": "https://github.com/jprichardson/node-fs-extra",
"repository": {
"type": "git",
"url": "https://github.com/jprichardson/node-fs-extra"
},
"keywords": [
"fs",
"file",
"file system",
"copy",
"directory",
"extra",
"mkdirp",
"mkdir",
"mkdirs",
"recursive",
"json",
"read",
"write",
"extra",
"delete",
"remove",
"touch",
"create",
"text",
"output",
"move",
"promise"
],
"author": "JP Richardson <jprichardson@gmail.com>",
"license": "MIT",
"dependencies": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
"universalify": "^2.0.0"
},
"devDependencies": {
"at-least-node": "^1.0.0",
"klaw": "^2.1.1",
"klaw-sync": "^3.0.2",
"minimist": "^1.1.1",
"mocha": "^5.0.5",
"nyc": "^15.0.0",
"proxyquire": "^2.0.1",
"read-dir-files": "^0.1.1",
"standard": "^16.0.3"
},
"main": "./lib/index.js",
"files": [
"lib/",
"!lib/**/__tests__/"
],
"scripts": {
"lint": "standard",
"test-find": "find ./lib/**/__tests__ -name *.test.js | xargs mocha",
"test": "npm run lint && npm run unit",
"unit": "nyc node test.js"
},
"sideEffects": false
}

View File

@@ -0,0 +1,15 @@
(The MIT License)
Copyright (c) 2012-2015, JP Richardson <jprichardson@gmail.com>
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files
(the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify,
merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@@ -0,0 +1,230 @@
Node.js - jsonfile
================
Easily read/write JSON files in Node.js. _Note: this module cannot be used in the browser._
[![npm Package](https://img.shields.io/npm/v/jsonfile.svg?style=flat-square)](https://www.npmjs.org/package/jsonfile)
[![linux build status](https://img.shields.io/github/actions/workflow/status/jprichardson/node-jsonfile/ci.yml?branch=master)](https://github.com/jprichardson/node-jsonfile/actions?query=branch%3Amaster)
[![windows Build status](https://img.shields.io/appveyor/ci/jprichardson/node-jsonfile/master.svg?label=windows%20build)](https://ci.appveyor.com/project/jprichardson/node-jsonfile/branch/master)
<a href="https://github.com/feross/standard"><img src="https://cdn.rawgit.com/feross/standard/master/sticker.svg" alt="Standard JavaScript" width="100"></a>
Why?
----
Writing `JSON.stringify()` and then `fs.writeFile()` and `JSON.parse()` with `fs.readFile()` enclosed in `try/catch` blocks became annoying.
Installation
------------
npm install --save jsonfile
API
---
* [`readFile(filename, [options], callback)`](#readfilefilename-options-callback)
* [`readFileSync(filename, [options])`](#readfilesyncfilename-options)
* [`writeFile(filename, obj, [options], callback)`](#writefilefilename-obj-options-callback)
* [`writeFileSync(filename, obj, [options])`](#writefilesyncfilename-obj-options)
----
### readFile(filename, [options], callback)
`options` (`object`, default `undefined`): Pass in any [`fs.readFile`](https://nodejs.org/api/fs.html#fs_fs_readfile_path_options_callback) options or set `reviver` for a [JSON reviver](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse).
- `throws` (`boolean`, default: `true`). If `JSON.parse` throws an error, pass this error to the callback.
If `false`, returns `null` for the object.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file, function (err, obj) {
if (err) console.error(err)
console.dir(obj)
})
```
You can also use this method with promises. The `readFile` method will return a promise if you do not pass a callback function.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file)
.then(obj => console.dir(obj))
.catch(error => console.error(error))
```
----
### readFileSync(filename, [options])
`options` (`object`, default `undefined`): Pass in any [`fs.readFileSync`](https://nodejs.org/api/fs.html#fs_fs_readfilesync_path_options) options or set `reviver` for a [JSON reviver](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse).
- `throws` (`boolean`, default: `true`). If an error is encountered reading or parsing the file, throw the error. If `false`, returns `null` for the object.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
console.dir(jsonfile.readFileSync(file))
```
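The `throws: false` behavior described above looks like this in practice (a small sketch; the path is illustrative):
```js
const jsonfile = require('jsonfile')

// With throws: false, a missing or malformed file yields null instead of an exception.
const obj = jsonfile.readFileSync('/tmp/might-not-exist.json', { throws: false })
if (obj === null) {
  console.log('file missing or not valid JSON')
}
```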
----
### writeFile(filename, obj, [options], callback)
`options`: Pass in any [`fs.writeFile`](https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback) options or set `replacer` for a [JSON replacer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify). Can also pass in `spaces`, or override `EOL` string or set `finalEOL` flag as `false` to not save the file with `EOL` at the end.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, function (err) {
if (err) console.error(err)
})
```
Or use with promises as follows:
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj)
.then(res => {
console.log('Write complete')
})
.catch(error => console.error(error))
```
**formatting with spaces:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2 }, function (err) {
if (err) console.error(err)
})
```
**overriding EOL:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2, EOL: '\r\n' }, function (err) {
if (err) console.error(err)
})
```
**disabling the EOL at the end of file:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2, finalEOL: false }, function (err) {
if (err) console.log(err)
})
```
**appending to an existing JSON file:**
You can use `fs.writeFile` option `{ flag: 'a' }` to achieve this.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { flag: 'a' }, function (err) {
if (err) console.error(err)
})
```
----
### writeFileSync(filename, obj, [options])
`options`: Pass in any [`fs.writeFileSync`](https://nodejs.org/api/fs.html#fs_fs_writefilesync_file_data_options) options or set `replacer` for a [JSON replacer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify). Can also pass in `spaces`, or override `EOL` string or set `finalEOL` flag as `false` to not save the file with `EOL` at the end.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj)
```
**formatting with spaces:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2 })
```
**overriding EOL:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2, EOL: '\r\n' })
```
**disabling the EOL at the end of file:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2, finalEOL: false })
```
**appending to an existing JSON file:**
You can use `fs.writeFileSync` option `{ flag: 'a' }` to achieve this.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { flag: 'a' })
```
License
-------
(MIT License)
Copyright 2012-2016, JP Richardson <jprichardson@gmail.com>

View File

@@ -0,0 +1,88 @@
let _fs
try {
_fs = require('graceful-fs')
} catch (_) {
_fs = require('fs')
}
const universalify = require('universalify')
const { stringify, stripBom } = require('./utils')
async function _readFile (file, options = {}) {
if (typeof options === 'string') {
options = { encoding: options }
}
const fs = options.fs || _fs
const shouldThrow = 'throws' in options ? options.throws : true
let data = await universalify.fromCallback(fs.readFile)(file, options)
data = stripBom(data)
let obj
try {
obj = JSON.parse(data, options ? options.reviver : null)
} catch (err) {
if (shouldThrow) {
err.message = `${file}: ${err.message}`
throw err
} else {
return null
}
}
return obj
}
const readFile = universalify.fromPromise(_readFile)
function readFileSync (file, options = {}) {
if (typeof options === 'string') {
options = { encoding: options }
}
const fs = options.fs || _fs
const shouldThrow = 'throws' in options ? options.throws : true
try {
let content = fs.readFileSync(file, options)
content = stripBom(content)
return JSON.parse(content, options.reviver)
} catch (err) {
if (shouldThrow) {
err.message = `${file}: ${err.message}`
throw err
} else {
return null
}
}
}
async function _writeFile (file, obj, options = {}) {
const fs = options.fs || _fs
const str = stringify(obj, options)
await universalify.fromCallback(fs.writeFile)(file, str, options)
}
const writeFile = universalify.fromPromise(_writeFile)
function writeFileSync (file, obj, options = {}) {
const fs = options.fs || _fs
const str = stringify(obj, options)
// not sure if fs.writeFileSync returns anything, but just in case
return fs.writeFileSync(file, str, options)
}
// NOTE: do not change this export format; required for ESM compat
// see https://github.com/jprichardson/node-jsonfile/pull/162 for details
module.exports = {
readFile,
readFileSync,
writeFile,
writeFileSync
}
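
Note the `options.fs || _fs` hook above: an alternative `fs` implementation can be injected per call. A hedged sketch using an in-memory filesystem (memfs is only an illustrative choice, and the require path is relative to this file):

```js
const { fs: memfs } = require('memfs')      // illustrative in-memory fs
const jsonfile = require('./index')         // illustrative path to this module

// Both calls operate entirely in memory because options.fs overrides graceful-fs.
jsonfile.writeFileSync('/data.json', { hello: 'world' }, { fs: memfs, spaces: 2 })
console.log(jsonfile.readFileSync('/data.json', { fs: memfs })) // -> { hello: 'world' }
```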

View File

@@ -0,0 +1,40 @@
{
"name": "jsonfile",
"version": "6.2.0",
"description": "Easily read/write JSON files.",
"repository": {
"type": "git",
"url": "git@github.com:jprichardson/node-jsonfile.git"
},
"keywords": [
"read",
"write",
"file",
"json",
"fs",
"fs-extra"
],
"author": "JP Richardson <jprichardson@gmail.com>",
"license": "MIT",
"dependencies": {
"universalify": "^2.0.0"
},
"optionalDependencies": {
"graceful-fs": "^4.1.6"
},
"devDependencies": {
"mocha": "^8.2.0",
"rimraf": "^2.4.0",
"standard": "^16.0.1"
},
"main": "index.js",
"files": [
"index.js",
"utils.js"
],
"scripts": {
"lint": "standard",
"test": "npm run lint && npm run unit",
"unit": "mocha"
}
}

View File

@@ -0,0 +1,14 @@
function stringify (obj, { EOL = '\n', finalEOL = true, replacer = null, spaces } = {}) {
const EOF = finalEOL ? EOL : ''
const str = JSON.stringify(obj, replacer, spaces)
return str.replace(/\n/g, EOL) + EOF
}
function stripBom (content) {
// we do this because JSON.parse would convert it to a utf8 string if encoding wasn't specified
if (Buffer.isBuffer(content)) content = content.toString('utf8')
return content.replace(/^\uFEFF/, '')
}
module.exports = { stringify, stripBom }
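
For reference, what `stringify` and `stripBom` produce for the options documented in the README above (a small sketch):

```js
const { stringify, stripBom } = require('./utils') // illustrative path

stringify({ a: 1 }, { spaces: 2 })
// -> '{\n  "a": 1\n}\n'      (default EOL '\n', trailing EOL appended)

stringify({ a: 1 }, { spaces: 2, EOL: '\r\n', finalEOL: false })
// -> '{\r\n  "a": 1\r\n}'    (every newline replaced, no trailing EOL)

stripBom('\uFEFF{"a":1}')
// -> '{"a":1}'
```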

View File

@@ -0,0 +1,20 @@
(The MIT License)
Copyright (c) 2017, Ryan Zimmerman <opensrc@ryanzim.com>
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the 'Software'), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@@ -0,0 +1,76 @@
# universalify
![GitHub Workflow Status (branch)](https://img.shields.io/github/actions/workflow/status/RyanZim/universalify/ci.yml?branch=master)
![Coveralls github branch](https://img.shields.io/coveralls/github/RyanZim/universalify/master.svg)
![npm](https://img.shields.io/npm/dm/universalify.svg)
![npm](https://img.shields.io/npm/l/universalify.svg)
Make a callback- or promise-based function support both promises and callbacks.
Uses the native promise implementation.
## Installation
```bash
npm install universalify
```
## API
### `universalify.fromCallback(fn)`
Takes a callback-based function to universalify, and returns the universalified function.
Function must take a callback as the last parameter that will be called with the signature `(error, result)`. `universalify` does not support calling the callback with three or more arguments, and does not ensure that the callback is only called once.
```js
function callbackFn (n, cb) {
setTimeout(() => cb(null, n), 15)
}
const fn = universalify.fromCallback(callbackFn)
// Works with Promises:
fn('Hello World!')
.then(result => console.log(result)) // -> Hello World!
.catch(error => console.error(error))
// Works with Callbacks:
fn('Hi!', (error, result) => {
if (error) return console.error(error)
console.log(result)
// -> Hi!
})
```
### `universalify.fromPromise(fn)`
Takes a promise-based function to universalify, and returns the universalified function.
Function must return a valid JS promise. `universalify` does not ensure that a valid promise is returned.
```js
function promiseFn (n) {
return new Promise(resolve => {
setTimeout(() => resolve(n), 15)
})
}
const fn = universalify.fromPromise(promiseFn)
// Works with Promises:
fn('Hello World!')
.then(result => console.log(result)) // -> Hello World!
.catch(error => console.error(error))
// Works with Callbacks:
fn('Hi!', (error, result) => {
if (error) return console.error(error)
console.log(result)
// -> Hi!
})
```
## License
MIT

View File

@@ -0,0 +1,24 @@
'use strict'
exports.fromCallback = function (fn) {
return Object.defineProperty(function (...args) {
if (typeof args[args.length - 1] === 'function') fn.apply(this, args)
else {
return new Promise((resolve, reject) => {
args.push((err, res) => (err != null) ? reject(err) : resolve(res))
fn.apply(this, args)
})
}
}, 'name', { value: fn.name })
}
exports.fromPromise = function (fn) {
return Object.defineProperty(function (...args) {
const cb = args[args.length - 1]
if (typeof cb !== 'function') return fn.apply(this, args)
else {
args.pop()
fn.apply(this, args).then(r => cb(null, r), cb)
}
}, 'name', { value: fn.name })
}

View File

@@ -0,0 +1,34 @@
{
"name": "universalify",
"version": "2.0.1",
"description": "Make a callback- or promise-based function support both promises and callbacks.",
"keywords": [
"callback",
"native",
"promise"
],
"homepage": "https://github.com/RyanZim/universalify#readme",
"bugs": "https://github.com/RyanZim/universalify/issues",
"license": "MIT",
"author": "Ryan Zimmerman <opensrc@ryanzim.com>",
"files": [
"index.js"
],
"repository": {
"type": "git",
"url": "git+https://github.com/RyanZim/universalify.git"
},
"scripts": {
"test": "standard && nyc --reporter text --reporter lcovonly tape test/*.js | colortape"
},
"devDependencies": {
"colortape": "^0.1.2",
"coveralls": "^3.0.1",
"nyc": "^15.0.0",
"standard": "^14.3.1",
"tape": "^5.0.1"
},
"engines": {
"node": ">= 10.0.0"
}
}

View File

@@ -0,0 +1,25 @@
import { Arch } from "builder-util";
import { PackagerOptions, Platform } from "app-builder-lib";
import { PublishOptions } from "electron-publish";
import * as yargs from "yargs";
export declare function createYargs(): yargs.Argv<unknown>;
export interface BuildOptions extends PackagerOptions, PublishOptions {
}
export interface CliOptions extends PackagerOptions, PublishOptions {
x64?: boolean;
ia32?: boolean;
armv7l?: boolean;
arm64?: boolean;
universal?: boolean;
dir?: boolean;
}
/** @private */
export declare function normalizeOptions(args: CliOptions): BuildOptions;
/** @private */
export declare function coerceTypes(host: any): any;
export declare function createTargets(platforms: Array<Platform>, type?: string | null, arch?: string | null): Map<Platform, Map<Arch, Array<string>>>;
export declare function build(rawOptions?: CliOptions): Promise<Array<string>>;
/**
* @private
*/
export declare function configureBuildCommand(yargs: yargs.Argv): yargs.Argv;
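
A hedged sketch of the programmatic API declared above; the target, arch, and publish values are illustrative:

```js
const { build, createTargets, Platform } = require('electron-builder')

// Build a single Linux AppImage for x64 without publishing (illustrative values).
build({
  targets: createTargets([Platform.LINUX], 'AppImage', 'x64'),
  publish: 'never',
})
  .then(artifacts => console.log('built:', artifacts))
  .catch(err => console.error(err))
```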

View File

@@ -0,0 +1,268 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.createYargs = createYargs;
exports.normalizeOptions = normalizeOptions;
exports.coerceTypes = coerceTypes;
exports.createTargets = createTargets;
exports.build = build;
exports.configureBuildCommand = configureBuildCommand;
const builder_util_1 = require("builder-util");
const chalk = require("chalk");
const app_builder_lib_1 = require("app-builder-lib");
const yargs = require("yargs");
function createYargs() {
return yargs.parserConfiguration({
"camel-case-expansion": false,
});
}
/** @private */
function normalizeOptions(args) {
if (args.targets != null) {
return args;
}
const targets = new Map();
function processTargets(platform, types) {
function commonArch(currentIfNotSpecified) {
const result = Array();
if (args.x64) {
result.push(builder_util_1.Arch.x64);
}
if (args.armv7l) {
result.push(builder_util_1.Arch.armv7l);
}
if (args.arm64) {
result.push(builder_util_1.Arch.arm64);
}
if (args.ia32) {
result.push(builder_util_1.Arch.ia32);
}
if (args.universal) {
result.push(builder_util_1.Arch.universal);
}
return result.length === 0 && currentIfNotSpecified ? [(0, builder_util_1.archFromString)(process.arch)] : result;
}
let archToType = targets.get(platform);
if (archToType == null) {
archToType = new Map();
targets.set(platform, archToType);
}
if (types.length === 0) {
const defaultTargetValue = args.dir ? [app_builder_lib_1.DIR_TARGET] : [];
for (const arch of commonArch(args.dir === true)) {
archToType.set(arch, defaultTargetValue);
}
return;
}
for (const type of types) {
const suffixPos = type.lastIndexOf(":");
if (suffixPos > 0) {
(0, builder_util_1.addValue)(archToType, (0, builder_util_1.archFromString)(type.substring(suffixPos + 1)), type.substring(0, suffixPos));
}
else {
for (const arch of commonArch(true)) {
(0, builder_util_1.addValue)(archToType, arch, type);
}
}
}
}
if (args.mac != null) {
processTargets(app_builder_lib_1.Platform.MAC, args.mac);
}
if (args.linux != null) {
processTargets(app_builder_lib_1.Platform.LINUX, args.linux);
}
if (args.win != null) {
processTargets(app_builder_lib_1.Platform.WINDOWS, args.win);
}
if (targets.size === 0) {
processTargets(app_builder_lib_1.Platform.current(), []);
}
const result = { ...args };
result.targets = targets;
delete result.dir;
delete result.mac;
delete result.linux;
delete result.win;
const r = result;
delete r.m;
delete r.o;
delete r.l;
delete r.w;
delete r.windows;
delete r.macos;
delete r.$0;
delete r._;
delete r.version;
delete r.help;
delete r.c;
delete r.p;
delete r.pd;
delete result.ia32;
delete result.x64;
delete result.armv7l;
delete result.arm64;
delete result.universal;
let config = result.config;
// config is array when combining dot-notation values with a config file value
// https://github.com/electron-userland/electron-builder/issues/2016
if (Array.isArray(config)) {
const newConfig = {};
for (const configItem of config) {
if (typeof configItem === "object") {
(0, builder_util_1.deepAssign)(newConfig, configItem);
}
else if (typeof configItem === "string") {
newConfig.extends = configItem;
}
}
config = newConfig;
result.config = newConfig;
}
// AJV cannot coerce "null" string to null if string is also allowed (because null string is a valid value)
if (config != null && typeof config !== "string") {
if (config.extraMetadata != null) {
coerceTypes(config.extraMetadata);
}
// ability to disable code sign using -c.mac.identity=null
if (config.mac != null) {
coerceValue(config.mac, "identity");
}
// fix Boolean type by coerceTypes
if (config.nsis != null) {
coerceTypes(config.nsis);
}
if (config.nsisWeb != null) {
coerceTypes(config.nsisWeb);
}
}
if ("project" in r && !("projectDir" in result)) {
result.projectDir = r.project;
}
delete r.project;
return result;
}
function coerceValue(host, key) {
const value = host[key];
if (value === "true") {
host[key] = true;
}
else if (value === "false") {
host[key] = false;
}
else if (value === "null") {
host[key] = null;
}
else if (key === "version" && typeof value === "number") {
host[key] = value.toString();
}
else if (value != null && typeof value === "object") {
coerceTypes(value);
}
}
/** @private */
function coerceTypes(host) {
for (const key of Object.getOwnPropertyNames(host)) {
coerceValue(host, key);
}
return host;
}
function createTargets(platforms, type, arch) {
const targets = new Map();
for (const platform of platforms) {
const archs = arch === "all" ? (platform === app_builder_lib_1.Platform.MAC ? [builder_util_1.Arch.x64, builder_util_1.Arch.arm64, builder_util_1.Arch.universal] : [builder_util_1.Arch.x64, builder_util_1.Arch.ia32]) : [(0, builder_util_1.archFromString)(arch == null ? process.arch : arch)];
const archToType = new Map();
targets.set(platform, archToType);
for (const arch of archs) {
archToType.set(arch, type == null ? [] : [type]);
}
}
return targets;
}
function build(rawOptions) {
const buildOptions = normalizeOptions(rawOptions || {});
return (0, app_builder_lib_1.build)(buildOptions, new app_builder_lib_1.Packager(buildOptions));
}
/**
* @private
*/
function configureBuildCommand(yargs) {
const publishGroup = "Publishing:";
const buildGroup = "Building:";
return yargs
.option("mac", {
group: buildGroup,
alias: ["m", "o", "macos"],
description: `Build for macOS, accepts target list (see ${chalk.underline("https://goo.gl/5uHuzj")}).`,
type: "array",
})
.option("linux", {
group: buildGroup,
alias: "l",
description: `Build for Linux, accepts target list (see ${chalk.underline("https://goo.gl/4vwQad")})`,
type: "array",
})
.option("win", {
group: buildGroup,
alias: ["w", "windows"],
description: `Build for Windows, accepts target list (see ${chalk.underline("https://goo.gl/jYsTEJ")})`,
type: "array",
})
.option("x64", {
group: buildGroup,
description: "Build for x64",
type: "boolean",
})
.option("ia32", {
group: buildGroup,
description: "Build for ia32",
type: "boolean",
})
.option("armv7l", {
group: buildGroup,
description: "Build for armv7l",
type: "boolean",
})
.option("arm64", {
group: buildGroup,
description: "Build for arm64",
type: "boolean",
})
.option("universal", {
group: buildGroup,
description: "Build for universal",
type: "boolean",
})
.option("dir", {
group: buildGroup,
description: "Build unpacked dir. Useful to test.",
type: "boolean",
})
.option("publish", {
group: publishGroup,
alias: "p",
description: `Publish artifacts, see ${chalk.underline("https://goo.gl/tSFycD")}`,
choices: ["onTag", "onTagOrDraft", "always", "never", undefined],
})
.option("prepackaged", {
alias: ["pd"],
group: buildGroup,
description: "The path to prepackaged app (to pack in a distributable format)",
})
.option("projectDir", {
alias: ["project"],
group: buildGroup,
description: "The path to project directory. Defaults to current working directory.",
})
.option("config", {
alias: ["c"],
group: buildGroup,
description: "The path to an electron-builder config. Defaults to `electron-builder.yml` (or `json`, or `json5`, or `js`, or `ts`), see " + chalk.underline("https://goo.gl/YFRJOM"),
})
.group(["help", "version"], "Other:")
.example("electron-builder -mwl", "build for macOS, Windows and Linux")
.example("electron-builder --linux deb tar.xz", "build deb and tar.xz for Linux")
.example("electron-builder --win --ia32", "build for Windows ia32")
.example("electron-builder -c.extraMetadata.foo=bar", "set package.json property `foo` to `bar`")
.example("electron-builder --config.nsis.unicode=false", "configure unicode options for NSIS");
}
//# sourceMappingURL=builder.js.map
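
A small sketch of what `coerceTypes` does to dot-notation CLI values, per the code above; the require path is an assumption for illustration:

```js
const { coerceTypes } = require('electron-builder/out/builder') // illustrative path

// String booleans/null are coerced, numeric versions become strings, objects recurse.
coerceTypes({ unicode: 'false', identity: 'null', version: 1.1, nested: { flag: 'true' } })
// -> { unicode: false, identity: null, version: '1.1', nested: { flag: true } }
```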

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,2 @@
#! /usr/bin/env node
export {};

View File

@@ -0,0 +1,70 @@
#! /usr/bin/env node
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const builder_util_1 = require("builder-util");
const chalk = require("chalk");
const fs_extra_1 = require("fs-extra");
const isCi = require("is-ci");
const path = require("path");
const load_1 = require("app-builder-lib/out/util/config/load");
const builder_1 = require("../builder");
const create_self_signed_cert_1 = require("./create-self-signed-cert");
const install_app_deps_1 = require("./install-app-deps");
const start_1 = require("./start");
const yarn_1 = require("app-builder-lib/out/util/yarn");
const electronVersion_1 = require("app-builder-lib/out/electron/electronVersion");
const publish_1 = require("../publish");
// tslint:disable:no-unused-expression
void (0, builder_1.createYargs)()
.command(["build", "*"], "Build", builder_1.configureBuildCommand, wrap(builder_1.build))
.command("install-app-deps", "Install app deps", install_app_deps_1.configureInstallAppDepsCommand, wrap(install_app_deps_1.installAppDeps))
.command("node-gyp-rebuild", "Rebuild own native code", install_app_deps_1.configureInstallAppDepsCommand /* yes, args the same as for install app deps */, wrap(rebuildAppNativeCode))
.command("publish", "Publish a list of artifacts", publish_1.configurePublishCommand, wrap(publish_1.publish))
.command("create-self-signed-cert", "Create self-signed code signing cert for Windows apps", yargs => yargs
.option("publisher", {
alias: ["p"],
type: "string",
requiresArg: true,
description: "The publisher name",
})
.demandOption("publisher"), wrap(argv => (0, create_self_signed_cert_1.createSelfSignedCert)(argv.publisher)))
.command("start", "Run application in a development mode using electron-webpack", yargs => yargs, wrap(() => (0, start_1.start)()))
.help()
.epilog(`See ${chalk.underline("https://electron.build")} for more documentation.`)
.strict()
.recommendCommands().argv;
function wrap(task) {
return (args) => {
checkIsOutdated().catch((e) => builder_util_1.log.warn({ error: e }, "cannot check updates"));
(0, load_1.loadEnv)(path.join(process.cwd(), "electron-builder.env"))
.then(() => task(args))
.catch(error => {
process.exitCode = 1;
// https://github.com/electron-userland/electron-builder/issues/2940
process.on("exit", () => (process.exitCode = 1));
if (error instanceof builder_util_1.InvalidConfigurationError) {
builder_util_1.log.error(null, error.message);
}
else if (!(error instanceof builder_util_1.ExecError) || !error.alreadyLogged) {
builder_util_1.log.error({ failedTask: task.name, stackTrace: error.stack }, error.message);
}
});
};
}
async function checkIsOutdated() {
if (isCi || process.env.NO_UPDATE_NOTIFIER != null) {
return;
}
const pkg = await (0, fs_extra_1.readJson)(path.join(__dirname, "..", "..", "package.json"));
if (pkg.version === "0.0.0-semantic-release") {
return;
}
const UpdateNotifier = require("simple-update-notifier");
await UpdateNotifier({ pkg });
}
async function rebuildAppNativeCode(args) {
const projectDir = process.cwd();
// this script must be used only for electron
return (0, yarn_1.nodeGypRebuild)(args.platform, args.arch, { version: await (0, electronVersion_1.getElectronVersion)(projectDir), useCustomDist: true });
}
//# sourceMappingURL=cli.js.map

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1 @@
export {};

View File

@@ -0,0 +1,40 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.createSelfSignedCert = createSelfSignedCert;
const filename_1 = require("app-builder-lib/out/util/filename");
const builder_util_1 = require("builder-util");
const chalk = require("chalk");
const windowsSignToolManager_1 = require("app-builder-lib/out/codeSign/windowsSignToolManager");
const promises_1 = require("fs/promises");
const path = require("path");
/** @internal */
async function createSelfSignedCert(publisher) {
const tmpDir = new builder_util_1.TmpDir("create-self-signed-cert");
const targetDir = process.cwd();
const tempPrefix = path.join(await tmpDir.getTempDir({ prefix: "self-signed-cert-creator" }), (0, filename_1.sanitizeFileName)(publisher));
const cer = `${tempPrefix}.cer`;
const pvk = `${tempPrefix}.pvk`;
builder_util_1.log.info(chalk.bold('When asked to enter a password ("Create Private Key Password"), please select "None".'));
try {
await (0, promises_1.mkdir)(path.dirname(tempPrefix), { recursive: true });
const vendorPath = path.join(await (0, windowsSignToolManager_1.getSignVendorPath)(), "windows-10", process.arch);
await (0, builder_util_1.exec)(path.join(vendorPath, "makecert.exe"), ["-r", "-h", "0", "-n", `CN=${quoteString(publisher)}`, "-eku", "1.3.6.1.5.5.7.3.3", "-pe", "-sv", pvk, cer]);
const pfx = path.join(targetDir, `${(0, filename_1.sanitizeFileName)(publisher)}.pfx`);
await (0, builder_util_1.unlinkIfExists)(pfx);
await (0, builder_util_1.exec)(path.join(vendorPath, "pvk2pfx.exe"), ["-pvk", pvk, "-spc", cer, "-pfx", pfx]);
        builder_util_1.log.info({ file: pfx }, `created. Please see https://electron.build/code-signing for how to use it to sign.`);
const certLocation = "Cert:\\LocalMachine\\TrustedPeople";
        builder_util_1.log.info({ file: pfx, certLocation }, `importing. The import will succeed only if run as root; otherwise import the file manually.`);
await (0, builder_util_1.spawn)("powershell.exe", ["-NoProfile", "-NonInteractive", "-Command", "Import-PfxCertificate", "-FilePath", `"${pfx}"`, "-CertStoreLocation", certLocation]);
}
finally {
await tmpDir.cleanup();
}
}
function quoteString(s) {
if (!s.includes(",") && !s.includes('"')) {
return s;
}
return `"${s.replace(/"/g, '\\"')}"`;
}
//# sourceMappingURL=create-self-signed-cert.js.map

View File

@@ -0,0 +1 @@
{"version":3,"file":"create-self-signed-cert.js","sourceRoot":"","sources":["../../src/cli/create-self-signed-cert.ts"],"names":[],"mappings":";;AAQA,oDAyBC;AAjCD,gEAAoE;AACpE,+CAAuE;AACvE,+BAA8B;AAC9B,gGAAuF;AACvF,0CAAmC;AACnC,6BAA4B;AAE5B,gBAAgB;AACT,KAAK,UAAU,oBAAoB,CAAC,SAAiB;IAC1D,MAAM,MAAM,GAAG,IAAI,qBAAM,CAAC,yBAAyB,CAAC,CAAA;IACpD,MAAM,SAAS,GAAG,OAAO,CAAC,GAAG,EAAE,CAAA;IAC/B,MAAM,UAAU,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,MAAM,CAAC,UAAU,CAAC,EAAE,MAAM,EAAE,0BAA0B,EAAE,CAAC,EAAE,IAAA,2BAAgB,EAAC,SAAS,CAAC,CAAC,CAAA;IAC1H,MAAM,GAAG,GAAG,GAAG,UAAU,MAAM,CAAA;IAC/B,MAAM,GAAG,GAAG,GAAG,UAAU,MAAM,CAAA;IAE/B,kBAAG,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,uFAAuF,CAAC,CAAC,CAAA;IAE7G,IAAI,CAAC;QACH,MAAM,IAAA,gBAAK,EAAC,IAAI,CAAC,OAAO,CAAC,UAAU,CAAC,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAA;QAC1D,MAAM,UAAU,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,IAAA,0CAAiB,GAAE,EAAE,YAAY,EAAE,OAAO,CAAC,IAAI,CAAC,CAAA;QACnF,MAAM,IAAA,mBAAI,EAAC,IAAI,CAAC,IAAI,CAAC,UAAU,EAAE,cAAc,CAAC,EAAE,CAAC,IAAI,EAAE,IAAI,EAAE,GAAG,EAAE,IAAI,EAAE,MAAM,WAAW,CAAC,SAAS,CAAC,EAAE,EAAE,MAAM,EAAE,mBAAmB,EAAE,KAAK,EAAE,KAAK,EAAE,GAAG,EAAE,GAAG,CAAC,CAAC,CAAA;QAE/J,MAAM,GAAG,GAAG,IAAI,CAAC,IAAI,CAAC,SAAS,EAAE,GAAG,IAAA,2BAAgB,EAAC,SAAS,CAAC,MAAM,CAAC,CAAA;QACtE,MAAM,IAAA,6BAAc,EAAC,GAAG,CAAC,CAAA;QACzB,MAAM,IAAA,mBAAI,EAAC,IAAI,CAAC,IAAI,CAAC,UAAU,EAAE,aAAa,CAAC,EAAE,CAAC,MAAM,EAAE,GAAG,EAAE,MAAM,EAAE,GAAG,EAAE,MAAM,EAAE,GAAG,CAAC,CAAC,CAAA;QACzF,kBAAG,CAAC,IAAI,CAAC,EAAE,IAAI,EAAE,GAAG,EAAE,EAAE,gFAAgF,CAAC,CAAA;QAEzG,MAAM,YAAY,GAAG,oCAAoC,CAAA;QACzD,kBAAG,CAAC,IAAI,CAAC,EAAE,IAAI,EAAE,GAAG,EAAE,YAAY,EAAE,EAAE,gGAAgG,CAAC,CAAA;QACvI,MAAM,IAAA,oBAAK,EAAC,gBAAgB,EAAE,CAAC,YAAY,EAAE,iBAAiB,EAAE,UAAU,EAAE,uBAAuB,EAAE,WAAW,EAAE,IAAI,GAAG,GAAG,EAAE,oBAAoB,EAAE,YAAY,CAAC,CAAC,CAAA;IACpK,CAAC;YAAS,CAAC;QACT,MAAM,MAAM,CAAC,OAAO,EAAE,CAAA;IACxB,CAAC;AACH,CAAC;AAED,SAAS,WAAW,CAAC,CAAS;IAC5B,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,EAAE,CAAC;QACzC,OAAO,CAAC,CAAA;IACV,CAAC;IAED,OAAO,IAAI,CAAC,CAAC,OAAO,CAAC,IAAI,EAAE,KAAK,CAAC,GAAG,CAAA;AACtC,CAAC","sourcesContent":["import { sanitizeFileName } from \"app-builder-lib/out/util/filename\"\nimport { exec, log, spawn, TmpDir, unlinkIfExists } from \"builder-util\"\nimport * as chalk from \"chalk\"\nimport { getSignVendorPath } from \"app-builder-lib/out/codeSign/windowsSignToolManager\"\nimport { mkdir } from \"fs/promises\"\nimport * as path from \"path\"\n\n/** @internal */\nexport async function createSelfSignedCert(publisher: string) {\n const tmpDir = new TmpDir(\"create-self-signed-cert\")\n const targetDir = process.cwd()\n const tempPrefix = path.join(await tmpDir.getTempDir({ prefix: \"self-signed-cert-creator\" }), sanitizeFileName(publisher))\n const cer = `${tempPrefix}.cer`\n const pvk = `${tempPrefix}.pvk`\n\n log.info(chalk.bold('When asked to enter a password (\"Create Private Key Password\"), please select \"None\".'))\n\n try {\n await mkdir(path.dirname(tempPrefix), { recursive: true })\n const vendorPath = path.join(await getSignVendorPath(), \"windows-10\", process.arch)\n await exec(path.join(vendorPath, \"makecert.exe\"), [\"-r\", \"-h\", \"0\", \"-n\", `CN=${quoteString(publisher)}`, \"-eku\", \"1.3.6.1.5.5.7.3.3\", \"-pe\", \"-sv\", pvk, cer])\n\n const pfx = path.join(targetDir, `${sanitizeFileName(publisher)}.pfx`)\n await unlinkIfExists(pfx)\n await exec(path.join(vendorPath, \"pvk2pfx.exe\"), [\"-pvk\", pvk, \"-spc\", cer, \"-pfx\", pfx])\n log.info({ file: pfx }, `created. 
Please see https://electron.build/code-signing how to use it to sign.`)\n\n const certLocation = \"Cert:\\\\LocalMachine\\\\TrustedPeople\"\n log.info({ file: pfx, certLocation }, `importing. Operation will be succeed only if runned from root. Otherwise import file manually.`)\n await spawn(\"powershell.exe\", [\"-NoProfile\", \"-NonInteractive\", \"-Command\", \"Import-PfxCertificate\", \"-FilePath\", `\"${pfx}\"`, \"-CertStoreLocation\", certLocation])\n } finally {\n await tmpDir.cleanup()\n }\n}\n\nfunction quoteString(s: string): string {\n if (!s.includes(\",\") && !s.includes('\"')) {\n return s\n }\n\n return `\"${s.replace(/\"/g, '\\\\\"')}\"`\n}\n"]}

View File

@@ -0,0 +1,2 @@
#! /usr/bin/env node
export {};

View File

@@ -0,0 +1,69 @@
#! /usr/bin/env node
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.configureInstallAppDepsCommand = configureInstallAppDepsCommand;
exports.installAppDeps = installAppDeps;
const electronVersion_1 = require("app-builder-lib/out/electron/electronVersion");
const config_1 = require("app-builder-lib/out/util/config/config");
const yarn_1 = require("app-builder-lib/out/util/yarn");
const version_1 = require("app-builder-lib/out/version");
const packageDependencies_1 = require("app-builder-lib/out/util/packageDependencies");
const builder_util_1 = require("builder-util");
const fs_extra_1 = require("fs-extra");
const lazy_val_1 = require("lazy-val");
const path = require("path");
const load_1 = require("app-builder-lib/out/util/config/load");
const yargs = require("yargs");
/** @internal */
function configureInstallAppDepsCommand(yargs) {
// https://github.com/yargs/yargs/issues/760
// demandOption is required to be set
return yargs
.parserConfiguration({
"camel-case-expansion": false,
})
.option("platform", {
choices: ["linux", "darwin", "win32"],
default: process.platform,
description: "The target platform",
})
.option("arch", {
choices: (0, builder_util_1.getArchCliNames)().concat("all"),
default: process.arch === "arm" ? "armv7l" : process.arch,
description: "The target arch",
});
}
/** @internal */
async function installAppDeps(args) {
try {
builder_util_1.log.info({ version: version_1.PACKAGE_VERSION }, "electron-builder");
}
catch (e) {
// error in dev mode without babel
if (!(e instanceof ReferenceError)) {
throw e;
}
}
const projectDir = process.cwd();
const packageMetadata = new lazy_val_1.Lazy(() => (0, load_1.orNullIfFileNotExist)((0, fs_extra_1.readJson)(path.join(projectDir, "package.json"))));
const config = await (0, config_1.getConfig)(projectDir, null, null, packageMetadata);
const [appDir, version] = await Promise.all([
(0, config_1.computeDefaultAppDirectory)(projectDir, (0, builder_util_1.use)(config.directories, it => it.app)),
(0, electronVersion_1.getElectronVersion)(projectDir, config),
]);
// if two package.json — force full install (user wants to install/update app deps in addition to dev)
await (0, yarn_1.installOrRebuild)(config, appDir, {
frameworkInfo: { version, useCustomDist: true },
platform: args.platform,
arch: args.arch,
productionDeps: (0, packageDependencies_1.createLazyProductionDeps)(appDir, null, false),
}, appDir !== projectDir);
}
function main() {
return installAppDeps(configureInstallAppDepsCommand(yargs).argv);
}
if (require.main === module) {
builder_util_1.log.warn("please use as subcommand: electron-builder install-app-deps");
main().catch(builder_util_1.printErrorAndExit);
}
//# sourceMappingURL=install-app-deps.js.map

View File

@@ -0,0 +1 @@
{"version":3,"file":"install-app-deps.js","sourceRoot":"","sources":["../../src/cli/install-app-deps.ts"],"names":[],"mappings":";;;AAeA,wEAiBC;AAGD,wCAiCC;AAlED,kFAAiF;AACjF,mEAA8F;AAC9F,wDAAgE;AAChE,yDAA6D;AAC7D,sFAAuF;AACvF,+CAA2E;AAC3E,uCAAmC;AACnC,uCAA+B;AAC/B,6BAA4B;AAC5B,+DAA2E;AAC3E,+BAA8B;AAE9B,gBAAgB;AAChB,SAAgB,8BAA8B,CAAC,KAAiB;IAC9D,4CAA4C;IAC5C,qCAAqC;IACrC,OAAO,KAAK;SACT,mBAAmB,CAAC;QACnB,sBAAsB,EAAE,KAAK;KAC9B,CAAC;SACD,MAAM,CAAC,UAAU,EAAE;QAClB,OAAO,EAAE,CAAC,OAAO,EAAE,QAAQ,EAAE,OAAO,CAAC;QACrC,OAAO,EAAE,OAAO,CAAC,QAAQ;QACzB,WAAW,EAAE,qBAAqB;KACnC,CAAC;SACD,MAAM,CAAC,MAAM,EAAE;QACd,OAAO,EAAE,IAAA,8BAAe,GAAE,CAAC,MAAM,CAAC,KAAK,CAAC;QACxC,OAAO,EAAE,OAAO,CAAC,IAAI,KAAK,KAAK,CAAC,CAAC,CAAC,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI;QACzD,WAAW,EAAE,iBAAiB;KAC/B,CAAC,CAAA;AACN,CAAC;AAED,gBAAgB;AACT,KAAK,UAAU,cAAc,CAAC,IAAS;IAC5C,IAAI,CAAC;QACH,kBAAG,CAAC,IAAI,CAAC,EAAE,OAAO,EAAE,yBAAe,EAAE,EAAE,kBAAkB,CAAC,CAAA;IAC5D,CAAC;IAAC,OAAO,CAAM,EAAE,CAAC;QAChB,kCAAkC;QAClC,IAAI,CAAC,CAAC,CAAC,YAAY,cAAc,CAAC,EAAE,CAAC;YACnC,MAAM,CAAC,CAAA;QACT,CAAC;IACH,CAAC;IAED,MAAM,UAAU,GAAG,OAAO,CAAC,GAAG,EAAE,CAAA;IAChC,MAAM,eAAe,GAAG,IAAI,eAAI,CAAC,GAAG,EAAE,CAAC,IAAA,2BAAoB,EAAC,IAAA,mBAAQ,EAAC,IAAI,CAAC,IAAI,CAAC,UAAU,EAAE,cAAc,CAAC,CAAC,CAAC,CAAC,CAAA;IAC7G,MAAM,MAAM,GAAG,MAAM,IAAA,kBAAS,EAAC,UAAU,EAAE,IAAI,EAAE,IAAI,EAAE,eAAe,CAAC,CAAA;IACvE,MAAM,CAAC,MAAM,EAAE,OAAO,CAAC,GAAG,MAAM,OAAO,CAAC,GAAG,CAAS;QAClD,IAAA,mCAA0B,EACxB,UAAU,EACV,IAAA,kBAAG,EAAC,MAAM,CAAC,WAAW,EAAE,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC,GAAG,CAAC,CACtC;QACD,IAAA,oCAAkB,EAAC,UAAU,EAAE,MAAM,CAAC;KACvC,CAAC,CAAA;IAEF,sGAAsG;IACtG,MAAM,IAAA,uBAAgB,EACpB,MAAM,EACN,MAAM,EACN;QACE,aAAa,EAAE,EAAE,OAAO,EAAE,aAAa,EAAE,IAAI,EAAE;QAC/C,QAAQ,EAAE,IAAI,CAAC,QAAQ;QACvB,IAAI,EAAE,IAAI,CAAC,IAAI;QACf,cAAc,EAAE,IAAA,8CAAwB,EAAC,MAAM,EAAE,IAAI,EAAE,KAAK,CAAC;KAC9D,EACD,MAAM,KAAK,UAAU,CACtB,CAAA;AACH,CAAC;AAED,SAAS,IAAI;IACX,OAAO,cAAc,CAAC,8BAA8B,CAAC,KAAK,CAAC,CAAC,IAAI,CAAC,CAAA;AACnE,CAAC;AAED,IAAI,OAAO,CAAC,IAAI,KAAK,MAAM,EAAE,CAAC;IAC5B,kBAAG,CAAC,IAAI,CAAC,6DAA6D,CAAC,CAAA;IACvE,IAAI,EAAE,CAAC,KAAK,CAAC,gCAAiB,CAAC,CAAA;AACjC,CAAC","sourcesContent":["#! /usr/bin/env node\n\nimport { getElectronVersion } from \"app-builder-lib/out/electron/electronVersion\"\nimport { computeDefaultAppDirectory, getConfig } from \"app-builder-lib/out/util/config/config\"\nimport { installOrRebuild } from \"app-builder-lib/out/util/yarn\"\nimport { PACKAGE_VERSION } from \"app-builder-lib/out/version\"\nimport { createLazyProductionDeps } from \"app-builder-lib/out/util/packageDependencies\"\nimport { getArchCliNames, log, use, printErrorAndExit } from \"builder-util\"\nimport { readJson } from \"fs-extra\"\nimport { Lazy } from \"lazy-val\"\nimport * as path from \"path\"\nimport { orNullIfFileNotExist } from \"app-builder-lib/out/util/config/load\"\nimport * as yargs from \"yargs\"\n\n/** @internal */\nexport function configureInstallAppDepsCommand(yargs: yargs.Argv): yargs.Argv {\n // https://github.com/yargs/yargs/issues/760\n // demandOption is required to be set\n return yargs\n .parserConfiguration({\n \"camel-case-expansion\": false,\n })\n .option(\"platform\", {\n choices: [\"linux\", \"darwin\", \"win32\"],\n default: process.platform,\n description: \"The target platform\",\n })\n .option(\"arch\", {\n choices: getArchCliNames().concat(\"all\"),\n default: process.arch === \"arm\" ? 
\"armv7l\" : process.arch,\n description: \"The target arch\",\n })\n}\n\n/** @internal */\nexport async function installAppDeps(args: any) {\n try {\n log.info({ version: PACKAGE_VERSION }, \"electron-builder\")\n } catch (e: any) {\n // error in dev mode without babel\n if (!(e instanceof ReferenceError)) {\n throw e\n }\n }\n\n const projectDir = process.cwd()\n const packageMetadata = new Lazy(() => orNullIfFileNotExist(readJson(path.join(projectDir, \"package.json\"))))\n const config = await getConfig(projectDir, null, null, packageMetadata)\n const [appDir, version] = await Promise.all<string>([\n computeDefaultAppDirectory(\n projectDir,\n use(config.directories, it => it.app)\n ),\n getElectronVersion(projectDir, config),\n ])\n\n // if two package.json — force full install (user wants to install/update app deps in addition to dev)\n await installOrRebuild(\n config,\n appDir,\n {\n frameworkInfo: { version, useCustomDist: true },\n platform: args.platform,\n arch: args.arch,\n productionDeps: createLazyProductionDeps(appDir, null, false),\n },\n appDir !== projectDir\n )\n}\n\nfunction main() {\n return installAppDeps(configureInstallAppDepsCommand(yargs).argv)\n}\n\nif (require.main === module) {\n log.warn(\"please use as subcommand: electron-builder install-app-deps\")\n main().catch(printErrorAndExit)\n}\n"]}

View File

@@ -0,0 +1 @@
export {};

View File

@@ -0,0 +1,9 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.start = start;
/** @internal */
function start() {
require("electron-webpack/dev-runner");
return Promise.resolve();
}
//# sourceMappingURL=start.js.map

View File

@@ -0,0 +1 @@
{"version":3,"file":"start.js","sourceRoot":"","sources":["../../src/cli/start.ts"],"names":[],"mappings":";;AACA,sBAGC;AAJD,gBAAgB;AAChB,SAAgB,KAAK;IACnB,OAAO,CAAC,6BAA6B,CAAC,CAAA;IACtC,OAAO,OAAO,CAAC,OAAO,EAAE,CAAA;AAC1B,CAAC","sourcesContent":["/** @internal */\nexport function start() {\n require(\"electron-webpack/dev-runner\")\n return Promise.resolve()\n}\n"]}

View File

@@ -0,0 +1,6 @@
export { getArchSuffix, Arch, archFromString, log } from "builder-util";
export { build, CliOptions, createTargets } from "./builder";
export { publish, publishArtifactsWithOptions } from "./publish";
export { TargetConfiguration, Platform, Target, DIR_TARGET, BeforeBuildContext, SourceRepositoryInfo, TargetSpecificOptions, TargetConfigType, DEFAULT_TARGET, CompressionLevel, MacConfiguration, DmgOptions, MasConfiguration, MacOsTargetName, PkgOptions, DmgContent, DmgWindow, PlatformSpecificBuildOptions, AsarOptions, FileSet, LinuxConfiguration, DebOptions, CommonLinuxOptions, LinuxTargetSpecificOptions, AppImageOptions, Configuration, AfterPackContext, MetadataDirectories, Protocol, ReleaseInfo, ElectronBrandingOptions, ElectronDownloadOptions, SnapOptions, CommonWindowsInstallerConfiguration, FileAssociation, MsiOptions, AppXOptions, WindowsConfiguration, Packager, BuildResult, PackagerOptions, ArtifactCreated, ArtifactBuildStarted, NsisOptions, NsisWebOptions, PortableOptions, CommonNsisOptions, SquirrelWindowsOptions, WindowsSignOptions, CustomWindowsSignTaskConfiguration, WindowsSignTaskConfiguration, CustomWindowsSign, FileCodeSigningInfo, CertificateFromStoreInfo, Metadata, AuthorMetadata, RepositoryInfo, AppInfo, UploadTask, PublishManager, PublishOptions, ProgressInfo, MacPackager, WinPackager, LinuxPackager, } from "app-builder-lib";
export { buildForge, ForgeOptions } from "app-builder-lib";
export { CancellationToken } from "builder-util-runtime";

View File

@@ -0,0 +1,30 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.CancellationToken = exports.buildForge = exports.LinuxPackager = exports.WinPackager = exports.MacPackager = exports.PublishManager = exports.AppInfo = exports.Packager = exports.DEFAULT_TARGET = exports.DIR_TARGET = exports.Target = exports.Platform = exports.publishArtifactsWithOptions = exports.publish = exports.createTargets = exports.build = exports.log = exports.archFromString = exports.Arch = exports.getArchSuffix = void 0;
var builder_util_1 = require("builder-util");
Object.defineProperty(exports, "getArchSuffix", { enumerable: true, get: function () { return builder_util_1.getArchSuffix; } });
Object.defineProperty(exports, "Arch", { enumerable: true, get: function () { return builder_util_1.Arch; } });
Object.defineProperty(exports, "archFromString", { enumerable: true, get: function () { return builder_util_1.archFromString; } });
Object.defineProperty(exports, "log", { enumerable: true, get: function () { return builder_util_1.log; } });
var builder_1 = require("./builder");
Object.defineProperty(exports, "build", { enumerable: true, get: function () { return builder_1.build; } });
Object.defineProperty(exports, "createTargets", { enumerable: true, get: function () { return builder_1.createTargets; } });
var publish_1 = require("./publish");
Object.defineProperty(exports, "publish", { enumerable: true, get: function () { return publish_1.publish; } });
Object.defineProperty(exports, "publishArtifactsWithOptions", { enumerable: true, get: function () { return publish_1.publishArtifactsWithOptions; } });
var app_builder_lib_1 = require("app-builder-lib");
Object.defineProperty(exports, "Platform", { enumerable: true, get: function () { return app_builder_lib_1.Platform; } });
Object.defineProperty(exports, "Target", { enumerable: true, get: function () { return app_builder_lib_1.Target; } });
Object.defineProperty(exports, "DIR_TARGET", { enumerable: true, get: function () { return app_builder_lib_1.DIR_TARGET; } });
Object.defineProperty(exports, "DEFAULT_TARGET", { enumerable: true, get: function () { return app_builder_lib_1.DEFAULT_TARGET; } });
Object.defineProperty(exports, "Packager", { enumerable: true, get: function () { return app_builder_lib_1.Packager; } });
Object.defineProperty(exports, "AppInfo", { enumerable: true, get: function () { return app_builder_lib_1.AppInfo; } });
Object.defineProperty(exports, "PublishManager", { enumerable: true, get: function () { return app_builder_lib_1.PublishManager; } });
Object.defineProperty(exports, "MacPackager", { enumerable: true, get: function () { return app_builder_lib_1.MacPackager; } });
Object.defineProperty(exports, "WinPackager", { enumerable: true, get: function () { return app_builder_lib_1.WinPackager; } });
Object.defineProperty(exports, "LinuxPackager", { enumerable: true, get: function () { return app_builder_lib_1.LinuxPackager; } });
var app_builder_lib_2 = require("app-builder-lib");
Object.defineProperty(exports, "buildForge", { enumerable: true, get: function () { return app_builder_lib_2.buildForge; } });
var builder_util_runtime_1 = require("builder-util-runtime");
Object.defineProperty(exports, "CancellationToken", { enumerable: true, get: function () { return builder_util_runtime_1.CancellationToken; } });
//# sourceMappingURL=index.js.map

View File

@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";;;AAAA,6CAAuE;AAA9D,6GAAA,aAAa,OAAA;AAAE,oGAAA,IAAI,OAAA;AAAE,8GAAA,cAAc,OAAA;AAAE,mGAAA,GAAG,OAAA;AACjD,qCAA4D;AAAnD,gGAAA,KAAK,OAAA;AAAc,wGAAA,aAAa,OAAA;AACzC,qCAAgE;AAAvD,kGAAA,OAAO,OAAA;AAAE,sHAAA,2BAA2B,OAAA;AAC7C,mDAkEwB;AAhEtB,2GAAA,QAAQ,OAAA;AACR,yGAAA,MAAM,OAAA;AACN,6GAAA,UAAU,OAAA;AAKV,iHAAA,cAAc,OAAA;AA8Bd,2GAAA,QAAQ,OAAA;AAmBR,0GAAA,OAAO,OAAA;AAEP,iHAAA,cAAc,OAAA;AAGd,8GAAA,WAAW,OAAA;AACX,8GAAA,WAAW,OAAA;AACX,gHAAA,aAAa,OAAA;AAEf,mDAA0D;AAAjD,6GAAA,UAAU,OAAA;AACnB,6DAAwD;AAA/C,yHAAA,iBAAiB,OAAA","sourcesContent":["export { getArchSuffix, Arch, archFromString, log } from \"builder-util\"\nexport { build, CliOptions, createTargets } from \"./builder\"\nexport { publish, publishArtifactsWithOptions } from \"./publish\"\nexport {\n TargetConfiguration,\n Platform,\n Target,\n DIR_TARGET,\n BeforeBuildContext,\n SourceRepositoryInfo,\n TargetSpecificOptions,\n TargetConfigType,\n DEFAULT_TARGET,\n CompressionLevel,\n MacConfiguration,\n DmgOptions,\n MasConfiguration,\n MacOsTargetName,\n PkgOptions,\n DmgContent,\n DmgWindow,\n PlatformSpecificBuildOptions,\n AsarOptions,\n FileSet,\n LinuxConfiguration,\n DebOptions,\n CommonLinuxOptions,\n LinuxTargetSpecificOptions,\n AppImageOptions,\n Configuration,\n AfterPackContext,\n MetadataDirectories,\n Protocol,\n ReleaseInfo,\n ElectronBrandingOptions,\n ElectronDownloadOptions,\n SnapOptions,\n CommonWindowsInstallerConfiguration,\n FileAssociation,\n MsiOptions,\n AppXOptions,\n WindowsConfiguration,\n Packager,\n BuildResult,\n PackagerOptions,\n ArtifactCreated,\n ArtifactBuildStarted,\n NsisOptions,\n NsisWebOptions,\n PortableOptions,\n CommonNsisOptions,\n SquirrelWindowsOptions,\n WindowsSignOptions,\n CustomWindowsSignTaskConfiguration,\n WindowsSignTaskConfiguration,\n CustomWindowsSign,\n FileCodeSigningInfo,\n CertificateFromStoreInfo,\n Metadata,\n AuthorMetadata,\n RepositoryInfo,\n AppInfo,\n UploadTask,\n PublishManager,\n PublishOptions,\n ProgressInfo,\n MacPackager,\n WinPackager,\n LinuxPackager,\n} from \"app-builder-lib\"\nexport { buildForge, ForgeOptions } from \"app-builder-lib\"\nexport { CancellationToken } from \"builder-util-runtime\"\n"]}

View File

@@ -0,0 +1,12 @@
#! /usr/bin/env node
import { UploadTask } from "app-builder-lib";
import { Publish } from "app-builder-lib/out/core";
export declare function publish(args: {
files: string[];
version: string | undefined;
config: string | undefined;
}): Promise<UploadTask[] | null>;
export declare function publishArtifactsWithOptions(uploadOptions: {
file: string;
arch: string | null;
}[], buildVersion?: string, configurationFilePath?: string, publishConfiguration?: Publish): Promise<UploadTask[] | null>;
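
A hedged sketch of the publish entry point declared above; file names and the version are illustrative:

```js
const { publishArtifactsWithOptions } = require('electron-builder')

publishArtifactsWithOptions(
  [
    { file: 'dist/MyApp-1.2.3.AppImage', arch: 'x64' },
    { file: 'dist/latest-linux.yml', arch: null },
  ],
  '1.2.3' // buildVersion; configurationFilePath and publishConfiguration are optional
).then(tasks => {
  // Resolves to the upload tasks, or null if publishing failed (see the .d.ts above).
  if (tasks == null) console.error('publish failed, see log output')
})
```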

View File

@@ -0,0 +1,102 @@
#! /usr/bin/env node
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.configurePublishCommand = configurePublishCommand;
exports.publish = publish;
exports.publishArtifactsWithOptions = publishArtifactsWithOptions;
const app_builder_lib_1 = require("app-builder-lib");
const platformPackager_1 = require("app-builder-lib/out/platformPackager");
const config_1 = require("app-builder-lib/out/util/config/config");
const builder_util_1 = require("builder-util");
const builder_util_2 = require("builder-util");
const chalk = require("chalk");
const path = require("path");
const yargs = require("yargs");
const builder_1 = require("./builder");
/** @internal */
function configurePublishCommand(yargs) {
// https://github.com/yargs/yargs/issues/760
// demandOption is required to be set
return yargs
.parserConfiguration({
"camel-case-expansion": false,
})
.option("files", {
alias: "f",
string: true,
type: "array",
requiresArg: true,
description: "The file(s) to upload to your publisher",
})
.option("version", {
alias: ["v"],
type: "string",
description: "The app/build version used when searching for an upload release (used by some Publishers)",
})
.option("config", {
alias: ["c"],
type: "string",
description: "The path to an electron-builder config. Defaults to `electron-builder.yml` (or `json`, or `json5`, or `js`, or `ts`), see " + chalk.underline("https://goo.gl/YFRJOM"),
})
.demandOption("files");
}
async function publish(args) {
const uploadTasks = args.files.map(f => {
return {
file: path.resolve(f),
arch: null,
};
});
return publishArtifactsWithOptions(uploadTasks, args.version, args.config);
}
async function publishArtifactsWithOptions(uploadOptions, buildVersion, configurationFilePath, publishConfiguration) {
const projectDir = process.cwd();
const config = await (0, config_1.getConfig)(projectDir, configurationFilePath || null, { publish: publishConfiguration, detectUpdateChannel: false });
const buildOptions = (0, builder_1.normalizeOptions)({ config });
(0, app_builder_lib_1.checkBuildRequestOptions)(buildOptions);
const uniqueUploads = Array.from(new Set(uploadOptions));
const tasks = uniqueUploads.map(({ file, arch }) => {
const filename = path.basename(file);
return { file, arch: arch ? (0, builder_util_1.archFromString)(arch) : null, safeArtifactName: (0, platformPackager_1.computeSafeArtifactNameIfNeeded)(filename, () => filename) };
});
return publishPackageWithTasks(buildOptions, tasks, buildVersion);
}
async function publishPackageWithTasks(options, uploadTasks, buildVersion, cancellationToken = new app_builder_lib_1.CancellationToken(), packager = new app_builder_lib_1.Packager(options, cancellationToken)) {
await packager.validateConfig();
const appInfo = new app_builder_lib_1.AppInfo(packager, buildVersion);
const publishManager = new app_builder_lib_1.PublishManager(packager, options, cancellationToken);
const sigIntHandler = () => {
builder_util_1.log.warn("cancelled by SIGINT");
packager.cancellationToken.cancel();
publishManager.cancelTasks();
};
process.once("SIGINT", sigIntHandler);
try {
const publishConfigurations = await publishManager.getGlobalPublishConfigurations();
if (publishConfigurations == null || publishConfigurations.length === 0) {
throw new builder_util_1.InvalidConfigurationError("unable to find any publish configuration");
}
for (const newArtifact of uploadTasks) {
for (const publishConfiguration of publishConfigurations) {
publishManager.scheduleUpload(publishConfiguration, newArtifact, appInfo);
}
}
await publishManager.awaitTasks();
return uploadTasks;
}
catch (error) {
packager.cancellationToken.cancel();
publishManager.cancelTasks();
process.removeListener("SIGINT", sigIntHandler);
builder_util_1.log.error({ message: (error.stack || error.message || error).toString() }, "error publishing");
}
return null;
}
function main() {
return publish(configurePublishCommand(yargs).argv);
}
if (require.main === module) {
builder_util_1.log.warn("please use as subcommand: electron-builder publish");
main().catch(builder_util_2.printErrorAndExit);
}
//# sourceMappingURL=publish.js.map

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,73 @@
{
"name": "electron-builder",
"description": "A complete solution to package and build a ready for distribution Electron app for MacOS, Windows and Linux with “auto update” support out of the box",
"version": "25.1.8",
"main": "out/index.js",
"files": [
"out"
],
"bin": {
"electron-builder": "./cli.js",
"install-app-deps": "./install-app-deps.js"
},
"repository": {
"type": "git",
"url": "git+https://github.com/electron-userland/electron-builder.git",
"directory": "packages/electron-builder"
},
"engines": {
"node": ">=14.0.0"
},
"keywords": [
"electron",
"builder",
"build",
"installer",
"install",
"packager",
"pack",
"nsis",
"app",
"dmg",
"pkg",
"msi",
"exe",
"setup",
"Windows",
"OS X",
"MacOS",
"Mac",
"appx",
"snap",
"flatpak",
"portable"
],
"author": "Vladimir Krivosheev",
"contributors": [
"Stefan Judis"
],
"license": "MIT",
"bugs": "https://github.com/electron-userland/electron-builder/issues",
"homepage": "https://github.com/electron-userland/electron-builder",
"dependencies": {
"chalk": "^4.1.2",
"fs-extra": "^10.1.0",
"is-ci": "^3.0.0",
"lazy-val": "^1.0.5",
"simple-update-notifier": "2.0.0",
"yargs": "^17.6.2",
"app-builder-lib": "25.1.8",
"builder-util": "25.1.7",
"dmg-builder": "25.1.8",
"builder-util-runtime": "9.2.10"
},
"devDependencies": {
"@types/fs-extra": "9.0.13",
"@types/is-ci": "3.0.0",
"@types/yargs": "^17.0.16"
},
"typings": "./out/index.d.ts",
"publishConfig": {
"tag": "next"
}
}