main functions fixes
185  desktop-operator/README.md  Normal file
@@ -0,0 +1,185 @@
# GodEye Operator Console

A desktop application for operators of the GodEye video surveillance system. It lets operators connect to Android devices, receive the camera video stream, and control capture.
## 🚀 New features (v2.0)

### ✨ Improved user interface

- **Auto-save settings** - all settings are saved automatically
- **Auto-connect** - optionally connect to the server automatically on startup
- **Ping monitoring** - real-time display of connection latency
- **Status indicators** - improved connection-state indicators
### 📸 Extended video features

- **Screenshots** - save frames from the video stream (`Ctrl+S`)
- **Fullscreen mode** - view the video in full screen (`Ctrl+F`)
- **Video zoom** - scale the image with drag-to-pan support
- **Picture-in-Picture** - PiP mode for multitasking
### ⚙️ Improved settings

- **Persistent configuration** - window size and filter settings are preserved
- **Hotkeys** - quick access to core functions
- **Improved recording** - recordings are auto-saved with metadata
## 🛠️ Technical specifications

- **Electron**: v38.2.0
- **Socket.IO**: v4.8.1 - communication with the backend server
- **UUID**: v11.0.3 - generation of unique identifiers
- **WebRTC** - video stream delivery
- **Node.js** - main runtime
## 📱 Step by step: connecting to a smartphone camera

1. **Start the backend server**
   - Open a terminal and go to the `backend` folder
   - Run: `node src/server.js`
   - Make sure the server is listening on port 3001

2. **Start the Android client**
   - Install the GodEye Android Client app on the smartphone
   - Launch the app and make sure it is connected to the server

3. **Start the operator desktop application**
   - Go to the `desktop-operator` folder
   - Run `npm install` (first launch only)
   - Start the app: `npm run dev`

4. **Connect to the server**
   - Enter the server URL (default: `http://localhost:3001`)
   - Click **"Connect"**
   - Wait for the list of available devices to appear
   - Click **"Refresh"** if you need to update the list

5. **Select a device**
   - Pick the smartphone you need from the device list

6. **Request camera access**
   - Choose the camera type: Main, Front, Wide-angle, Telephoto
   - Click the corresponding button
   - Wait for confirmation and for the video stream to appear

7. **Control the video stream**
   - Use the controls for image processing, recording, screenshots, and other functions
   - All actions are shown in the event log

8. **End the session**
   - Click "Disconnect" to finish

**Note:**
- For stable operation, make sure the smartphone and the PC are on the same network, or that the server is reachable from the internet.
- If problems occur, use the event log for diagnostics.
### Installing dependencies
```bash
npm install
```

### Development mode
```bash
npm run dev
```

### Building the application
```bash
npm run build
```

### Creating a distributable
```bash
npm run dist
```
## ⌨️ Hotkeys

- `Ctrl+S` - Take a screenshot
- `Ctrl+F` - Toggle fullscreen mode
- `Ctrl+R` - Start/stop recording
- `Escape` - Exit fullscreen mode
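The shortcut table above can be sketched as a simple event-to-action mapping. This is an illustrative sketch, not the app's actual code: the names `SHORTCUTS` and `shortcutToAction` are assumptions, and a real Electron app could instead register these combinations via `globalShortcut` in the main process.

```javascript
// Hypothetical mapping of the documented shortcuts to action names.
const SHORTCUTS = {
  "Ctrl+S": "screenshot",
  "Ctrl+F": "toggle-fullscreen",
  "Ctrl+R": "toggle-recording",
  "Escape": "exit-fullscreen",
};

// Build a "Ctrl+X" combo string from a DOM KeyboardEvent-like object
// and look up the action; returns null for unmapped keys.
function shortcutToAction(event) {
  const combo =
    (event.ctrlKey ? "Ctrl+" : "") +
    (event.key.length === 1 ? event.key.toUpperCase() : event.key);
  return SHORTCUTS[combo] || null;
}

console.log(shortcutToAction({ ctrlKey: true, key: "s" }));      // "screenshot"
console.log(shortcutToAction({ ctrlKey: false, key: "Escape" })); // "exit-fullscreen"
```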
## 📁 Project structure

```
desktop-operator/
├── src/
│   ├── main.js          # Electron main process
│   ├── config.js        # Configuration manager
│   └── renderer/        # Application UI
│       ├── index.html   # HTML structure
│       ├── app.js       # Application logic
│       └── styles.css   # UI styles
├── assets/              # Resources (icons, images)
├── package.json         # Project configuration
└── README.md            # Documentation
```
## 🔧 Configuration

The application automatically creates a configuration file at:
- **Windows**: `%APPDATA%/godeye-operator/config.json`
- **macOS**: `~/Library/Application Support/godeye-operator/config.json`
- **Linux**: `~/.config/godeye-operator/config.json`
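The per-OS locations above follow the standard user-data conventions. A minimal sketch of how such a path could be resolved is shown below; the function name `configPath` is illustrative, and in a real Electron app this is what `app.getPath("userData")` already provides.

```javascript
// Hypothetical sketch: resolve the config file location for a given
// platform string ("win32" | "darwin" | anything else -> linux-style).
// `env` supplies HOME / APPDATA so the function is easy to test.
function configPath(platform, env) {
  const app = "godeye-operator";
  if (platform === "win32") {
    return `${env.APPDATA}/${app}/config.json`;
  }
  if (platform === "darwin") {
    return `${env.HOME}/Library/Application Support/${app}/config.json`;
  }
  // Linux and other unixes follow the XDG default.
  return `${env.HOME}/.config/${app}/config.json`;
}

console.log(configPath("linux", { HOME: "/home/op" }));
// "/home/op/.config/godeye-operator/config.json"
```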
### Configuration parameters:
```json
{
  "server": {
    "url": "http://localhost:3001",
    "autoConnect": false,
    "reconnectAttempts": 3,
    "reconnectInterval": 5000
  },
  "ui": {
    "theme": "dark",
    "language": "ru",
    "windowBounds": { "width": 1400, "height": 900 },
    "videoFilters": {
      "brightness": 0,
      "contrast": 0,
      "saturation": 0,
      "monochrome": false,
      "edgeDetection": false
    }
  },
  "recording": {
    "quality": "high",
    "format": "webm",
    "maxDuration": 3600000,
    "autoSave": true
  }
}
```
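Since the config file is auto-created and then evolves across versions, a config manager typically merges whatever was saved on disk with the defaults, so missing keys fall back safely. The sketch below is an assumption about how `config.js` might do this (the names `DEFAULTS` and `mergeConfig` are illustrative, and only a subset of the parameters above is used):

```javascript
// Illustrative defaults, a subset of the parameters documented above.
const DEFAULTS = {
  server: { url: "http://localhost:3001", autoConnect: false,
            reconnectAttempts: 3, reconnectInterval: 5000 },
  ui: { theme: "dark", language: "ru" },
  recording: { quality: "high", format: "webm", autoSave: true },
};

// Recursively merge a (possibly partial) saved config over the defaults.
// Keys unknown to the defaults are ignored; nested objects merge per key.
function mergeConfig(defaults, saved) {
  const out = {};
  for (const key of Object.keys(defaults)) {
    const d = defaults[key];
    const s = saved[key];
    out[key] = (d !== null && typeof d === "object" && !Array.isArray(d))
      ? mergeConfig(d, s || {})
      : (s !== undefined ? s : d);
  }
  return out;
}

const cfg = mergeConfig(DEFAULTS, { server: { autoConnect: true } });
// cfg.server.autoConnect is true; cfg.server.url keeps the default value
```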
## 🌐 Backend integration

The application talks to the backend server over WebSocket connections:

- **Operator registration** - `register:operator`
- **Device list** - `devices:list`
- **Camera request** - `camera:request`
- **WebRTC signaling** - `webrtc:offer/answer/ice-candidate`
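The events above suggest the operator-side handshake sketched below. Only the event names come from this README; the payload shapes and function names (`registerOperator`, `requestCamera`) are assumptions, written against a generic Socket.IO-style `emit`/`on` interface so the flow can be exercised without a server.

```javascript
// Register this client as an operator and subscribe to device updates.
function registerOperator(socket, onDevices) {
  socket.on("devices:list", onDevices);            // server pushes available devices
  socket.emit("register:operator", { role: "operator" });
}

// Ask a specific device to start streaming one of its cameras.
function requestCamera(socket, deviceId, cameraType) {
  socket.emit("camera:request", { deviceId, camera: cameraType }); // e.g. "main"
}

// Minimal fake socket, just enough to exercise the flow offline.
function fakeSocket() {
  const handlers = {};
  const sent = [];
  return {
    on: (ev, fn) => { handlers[ev] = fn; },
    emit: (ev, data) => { sent.push([ev, data]); },
    receive: (ev, data) => { if (handlers[ev]) handlers[ev](data); },
    sent,
  };
}

const s = fakeSocket();
registerOperator(s, (devices) => console.log("devices:", devices));
s.receive("devices:list", [{ id: "phone-1" }]);
```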
## 📋 System requirements

- **OS**: Windows 10+, macOS 10.15+, Ubuntu 18.04+
- **RAM**: 4 GB (8 GB recommended)
- **Disk space**: 200 MB
- **Network**: a stable internet connection for WebRTC
## 🐛 Debugging

To run in debug mode:
```bash
npm run dev
```

This opens DevTools for diagnosing problems.
## 📝 License

MIT License - see the LICENSE file for details.

---

**GodEye Team** | Next-generation video surveillance system
1  desktop-operator/node_modules/.bin/asar  generated vendored  Symbolic link
@@ -0,0 +1 @@
../@electron/asar/bin/asar.js

1  desktop-operator/node_modules/.bin/color-support  generated vendored  Symbolic link
@@ -0,0 +1 @@
../color-support/bin.js

1  desktop-operator/node_modules/.bin/crc32  generated vendored  Symbolic link
@@ -0,0 +1 @@
../crc-32/bin/crc32.njs

1  desktop-operator/node_modules/.bin/ejs  generated vendored  Symbolic link
@@ -0,0 +1 @@
../ejs/bin/cli.js

1  desktop-operator/node_modules/.bin/electron  generated vendored  Symbolic link
@@ -0,0 +1 @@
../electron/cli.js

1  desktop-operator/node_modules/.bin/electron-builder  generated vendored  Symbolic link
@@ -0,0 +1 @@
../electron-builder/cli.js

1  desktop-operator/node_modules/.bin/electron-osx-flat  generated vendored  Symbolic link
@@ -0,0 +1 @@
../@electron/osx-sign/bin/electron-osx-flat.js

1  desktop-operator/node_modules/.bin/electron-osx-sign  generated vendored  Symbolic link
@@ -0,0 +1 @@
../@electron/osx-sign/bin/electron-osx-sign.js

1  desktop-operator/node_modules/.bin/electron-rebuild  generated vendored  Symbolic link
@@ -0,0 +1 @@
../@electron/rebuild/lib/cli.js

1  desktop-operator/node_modules/.bin/extract-zip  generated vendored  Symbolic link
@@ -0,0 +1 @@
../extract-zip/cli.js

1  desktop-operator/node_modules/.bin/install-app-deps  generated vendored  Symbolic link
@@ -0,0 +1 @@
../electron-builder/install-app-deps.js

1  desktop-operator/node_modules/.bin/is-ci  generated vendored  Symbolic link
@@ -0,0 +1 @@
../is-ci/bin.js

1  desktop-operator/node_modules/.bin/jake  generated vendored  Symbolic link
@@ -0,0 +1 @@
../jake/bin/cli.js

1  desktop-operator/node_modules/.bin/js-yaml  generated vendored  Symbolic link
@@ -0,0 +1 @@
../js-yaml/bin/js-yaml.js

1  desktop-operator/node_modules/.bin/json5  generated vendored  Symbolic link
@@ -0,0 +1 @@
../json5/lib/cli.js

1  desktop-operator/node_modules/.bin/mime  generated vendored  Symbolic link
@@ -0,0 +1 @@
../mime/cli.js

1  desktop-operator/node_modules/.bin/mkdirp  generated vendored  Symbolic link
@@ -0,0 +1 @@
../mkdirp/bin/cmd.js

1  desktop-operator/node_modules/.bin/node-gyp  generated vendored  Symbolic link
@@ -0,0 +1 @@
../node-gyp/bin/node-gyp.js

1  desktop-operator/node_modules/.bin/node-which  generated vendored  Symbolic link
@@ -0,0 +1 @@
../which/bin/node-which

1  desktop-operator/node_modules/.bin/nopt  generated vendored  Symbolic link
@@ -0,0 +1 @@
../nopt/bin/nopt.js

1  desktop-operator/node_modules/.bin/read-binary-file-arch  generated vendored  Symbolic link
@@ -0,0 +1 @@
../read-binary-file-arch/cli.js

1  desktop-operator/node_modules/.bin/rimraf  generated vendored  Symbolic link
@@ -0,0 +1 @@
../rimraf/bin.js

1  desktop-operator/node_modules/.bin/semver  generated vendored  Symbolic link
@@ -0,0 +1 @@
../semver/bin/semver.js

1  desktop-operator/node_modules/.bin/tsc  generated vendored  Symbolic link
@@ -0,0 +1 @@
../typescript/bin/tsc

1  desktop-operator/node_modules/.bin/tsserver  generated vendored  Symbolic link
@@ -0,0 +1 @@
../typescript/bin/tsserver

1  desktop-operator/node_modules/.bin/uuid  generated vendored  Symbolic link
@@ -0,0 +1 @@
../uuid/dist/esm/bin/uuid
4768  desktop-operator/node_modules/.package-lock.json  generated vendored  Normal file
File diff suppressed because it is too large
9  desktop-operator/node_modules/7zip-bin/7x.sh  generated vendored  Normal file
@@ -0,0 +1,9 @@
#!/usr/bin/env bash

sz_program=${SZA_PATH:-7za}
sz_type=${SZA_ARCHIVE_TYPE:-xz}

case $1 in
  -d) "$sz_program" e -si -so -t${sz_type} ;;
  *) "$sz_program" a f -si -so -t${sz_type} -mx${SZA_COMPRESSION_LEVEL:-9} ;;
esac 2> /dev/null
22  desktop-operator/node_modules/7zip-bin/LICENSE.txt  generated vendored  Normal file
@@ -0,0 +1,22 @@
The MIT License (MIT)

Copyright (c) 2016 Vladimir Krivosheev

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
1  desktop-operator/node_modules/7zip-bin/README.md  generated vendored  Normal file
@@ -0,0 +1 @@
7-Zip precompiled binaries.
2  desktop-operator/node_modules/7zip-bin/index.d.ts  generated vendored  Normal file
@@ -0,0 +1,2 @@
export const path7za: string
export const path7x: string
22  desktop-operator/node_modules/7zip-bin/index.js  generated vendored  Normal file
@@ -0,0 +1,22 @@
"use strict"

const path = require("path")

function getPath() {
  if (process.env.USE_SYSTEM_7ZA === "true") {
    return "7za"
  }

  if (process.platform === "darwin") {
    return path.join(__dirname, "mac", process.arch, "7za")
  }
  else if (process.platform === "win32") {
    return path.join(__dirname, "win", process.arch, "7za.exe")
  }
  else {
    return path.join(__dirname, "linux", process.arch, "7za")
  }
}

exports.path7za = getPath()
exports.path7x = path.join(__dirname, "7x.sh")
BIN  desktop-operator/node_modules/7zip-bin/linux/arm/7za  generated vendored  Normal file
Binary file not shown.

BIN  desktop-operator/node_modules/7zip-bin/linux/arm64/7za  generated vendored  Normal file
Binary file not shown.

BIN  desktop-operator/node_modules/7zip-bin/linux/ia32/7za  generated vendored  Normal file
Binary file not shown.

BIN  desktop-operator/node_modules/7zip-bin/linux/x64/7za  generated vendored  Normal file
Binary file not shown.
9  desktop-operator/node_modules/7zip-bin/linux/x64/build.sh  generated vendored  Normal file
@@ -0,0 +1,9 @@
#!/usr/bin/env bash
set -e

BASEDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"

rm -rf /tmp/7z-linux
mkdir /tmp/7z-linux
cp "$BASEDIR/do-build.sh" /tmp/7z-linux/do-build.sh
docker run --rm -v /tmp/7z-linux:/project buildpack-deps:xenial /project/do-build.sh
20  desktop-operator/node_modules/7zip-bin/linux/x64/do-build.sh  generated vendored  Normal file
@@ -0,0 +1,20 @@
#!/usr/bin/env bash
set -e

apt-get update -qq
apt-get upgrade -qq

echo "deb http://apt.llvm.org/xenial/ llvm-toolchain-xenial-5.0 main" > /etc/apt/sources.list.d/llvm.list
curl -L http://apt.llvm.org/llvm-snapshot.gpg.key | apt-key add -
apt-get update -qq
apt-get install -qq bzip2 yasm clang-5.0 lldb-5.0 lld-5.0

ln -s /usr/bin/clang-5.0 /usr/bin/clang
ln -s /usr/bin/clang++-5.0 /usr/bin/clang++

mkdir -p /tmp/7z
cd /tmp/7z
curl -L http://downloads.sourceforge.net/project/p7zip/p7zip/16.02/p7zip_16.02_src_all.tar.bz2 | tar -xj -C . --strip-components 1
cp makefile.linux_clang_amd64_asm makefile.machine
make -j4
mv bin/7za /project/7za
BIN  desktop-operator/node_modules/7zip-bin/mac/arm64/7za  generated vendored  Normal file
Binary file not shown.

BIN  desktop-operator/node_modules/7zip-bin/mac/x64/7za  generated vendored  Normal file
Binary file not shown.
20  desktop-operator/node_modules/7zip-bin/package.json  generated vendored  Normal file
@@ -0,0 +1,20 @@
{
  "name": "7zip-bin",
  "description": "7-Zip precompiled binaries",
  "version": "5.2.0",
  "files": [
    "*.js",
    "7x.sh",
    "index.d.ts",
    "linux",
    "mac",
    "win"
  ],
  "license": "MIT",
  "repository": "develar/7zip-bin",
  "keywords": [
    "7zip",
    "7z",
    "7za"
  ]
}
BIN  desktop-operator/node_modules/7zip-bin/win/arm64/7za.exe  generated vendored  Normal file
Binary file not shown.

BIN  desktop-operator/node_modules/7zip-bin/win/ia32/7za.exe  generated vendored  Normal file
Binary file not shown.

BIN  desktop-operator/node_modules/7zip-bin/win/x64/7za.exe  generated vendored  Normal file
Binary file not shown.
259  desktop-operator/node_modules/@develar/schema-utils/CHANGELOG.md  generated vendored  Normal file
@@ -0,0 +1,259 @@
# Changelog

All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.

### [2.6.5](https://github.com/webpack/schema-utils/compare/v2.6.4...v2.6.5) (2020-03-11)

### Bug Fixes

* typo ([#88](https://github.com/webpack/schema-utils/issues/88)) ([52fc92b](https://github.com/webpack/schema-utils/commit/52fc92b531b7503b7bbe1bf9b21bfa26dffbc8f5))

### [2.6.4](https://github.com/webpack/schema-utils/compare/v2.6.3...v2.6.4) (2020-01-17)

### Bug Fixes

* change `initialised` to `initialized` ([#87](https://github.com/webpack/schema-utils/issues/87)) ([70f12d3](https://github.com/webpack/schema-utils/commit/70f12d33a8eaa27249bc9c1a27f886724cf91ea7))

### [2.6.3](https://github.com/webpack/schema-utils/compare/v2.6.2...v2.6.3) (2020-01-17)

### Bug Fixes

* prefer the `baseDataPath` option from arguments ([#86](https://github.com/webpack/schema-utils/issues/86)) ([e236859](https://github.com/webpack/schema-utils/commit/e236859e85b28e35e1294f86fc1ff596a5031cea))

### [2.6.2](https://github.com/webpack/schema-utils/compare/v2.6.1...v2.6.2) (2020-01-14)

### Bug Fixes

* better handle Windows absolute paths ([#85](https://github.com/webpack/schema-utils/issues/85)) ([1fa2930](https://github.com/webpack/schema-utils/commit/1fa2930a161e907b9fc53a7233d605910afdb883))

### [2.6.1](https://github.com/webpack/schema-utils/compare/v2.6.0...v2.6.1) (2019-11-28)

### Bug Fixes

* typescript declarations ([#84](https://github.com/webpack/schema-utils/issues/84)) ([89d55a9](https://github.com/webpack/schema-utils/commit/89d55a9a8edfa6a8ac8b112f226bb3154e260319))

## [2.6.0](https://github.com/webpack/schema-utils/compare/v2.5.0...v2.6.0) (2019-11-27)

### Features

* support configuration via title ([#81](https://github.com/webpack/schema-utils/issues/81)) ([afddc10](https://github.com/webpack/schema-utils/commit/afddc109f6891cd37a9f1835d50862d119a072bf))

### Bug Fixes

* typescript definitions ([#70](https://github.com/webpack/schema-utils/issues/70)) ([f38158d](https://github.com/webpack/schema-utils/commit/f38158d6d040e2c701622778ae8122fb26a4f990))

## [2.5.0](https://github.com/webpack/schema-utils/compare/v2.4.1...v2.5.0) (2019-10-15)

### Bug Fixes

* rework format for maxLength, minLength ([#67](https://github.com/webpack/schema-utils/issues/67)) ([0d12259](https://github.com/webpack/schema-utils/commit/0d12259))
* support all cases with one number in range ([#64](https://github.com/webpack/schema-utils/issues/64)) ([7fc8069](https://github.com/webpack/schema-utils/commit/7fc8069))
* typescript definition and export naming ([#69](https://github.com/webpack/schema-utils/issues/69)) ([a435b79](https://github.com/webpack/schema-utils/commit/a435b79))

### Features

* "smart" numbers range ([62fb107](https://github.com/webpack/schema-utils/commit/62fb107))

### [2.4.1](https://github.com/webpack/schema-utils/compare/v2.4.0...v2.4.1) (2019-09-27)

### Bug Fixes

* publish definitions ([#58](https://github.com/webpack/schema-utils/issues/58)) ([1885faa](https://github.com/webpack/schema-utils/commit/1885faa))

## [2.4.0](https://github.com/webpack/schema-utils/compare/v2.3.0...v2.4.0) (2019-09-26)

### Features

* better errors when the `type` keyword doesn't exist ([0988be2](https://github.com/webpack/schema-utils/commit/0988be2))
* support $data reference ([#56](https://github.com/webpack/schema-utils/issues/56)) ([d2f11d6](https://github.com/webpack/schema-utils/commit/d2f11d6))
* types definitions ([#52](https://github.com/webpack/schema-utils/issues/52)) ([facb431](https://github.com/webpack/schema-utils/commit/facb431))

## [2.3.0](https://github.com/webpack/schema-utils/compare/v2.2.0...v2.3.0) (2019-09-26)

### Features

* support `not` keyword ([#53](https://github.com/webpack/schema-utils/issues/53)) ([765f458](https://github.com/webpack/schema-utils/commit/765f458))

## [2.2.0](https://github.com/webpack/schema-utils/compare/v2.1.0...v2.2.0) (2019-09-02)

### Features

* better error output for `oneOf` and `anyOf` ([#48](https://github.com/webpack/schema-utils/issues/48)) ([#50](https://github.com/webpack/schema-utils/issues/50)) ([332242f](https://github.com/webpack/schema-utils/commit/332242f))

## [2.1.0](https://github.com/webpack-contrib/schema-utils/compare/v2.0.1...v2.1.0) (2019-08-07)

### Bug Fixes

* throw error on sparse arrays ([#47](https://github.com/webpack-contrib/schema-utils/issues/47)) ([b85ac38](https://github.com/webpack-contrib/schema-utils/commit/b85ac38))

### Features

* export `ValidateError` ([#46](https://github.com/webpack-contrib/schema-utils/issues/46)) ([ff781d7](https://github.com/webpack-contrib/schema-utils/commit/ff781d7))

### [2.0.1](https://github.com/webpack-contrib/schema-utils/compare/v2.0.0...v2.0.1) (2019-07-18)

### Bug Fixes

* error message for empty object ([#44](https://github.com/webpack-contrib/schema-utils/issues/44)) ([0b4b4a2](https://github.com/webpack-contrib/schema-utils/commit/0b4b4a2))

### [2.0.0](https://github.com/webpack-contrib/schema-utils/compare/v1.0.0...v2.0.0) (2019-07-17)

### BREAKING CHANGES

* drop support for Node.js < 8.9.0
* drop support `errorMessage`, please use `description` for links.
* api was changed, please look documentation.
* error messages was fully rewritten.

<a name="1.0.0"></a>
# [1.0.0](https://github.com/webpack-contrib/schema-utils/compare/v0.4.7...v1.0.0) (2018-08-07)

### Features

* **src:** add support for custom error messages ([#33](https://github.com/webpack-contrib/schema-utils/issues/33)) ([1cbe4ef](https://github.com/webpack-contrib/schema-utils/commit/1cbe4ef))

<a name="0.4.7"></a>
## [0.4.7](https://github.com/webpack-contrib/schema-utils/compare/v0.4.6...v0.4.7) (2018-08-07)

### Bug Fixes

* **src:** `node >= v4.0.0` support ([#32](https://github.com/webpack-contrib/schema-utils/issues/32)) ([cb13dd4](https://github.com/webpack-contrib/schema-utils/commit/cb13dd4))

<a name="0.4.6"></a>
## [0.4.6](https://github.com/webpack-contrib/schema-utils/compare/v0.4.5...v0.4.6) (2018-08-06)

### Bug Fixes

* **package:** remove lockfile ([#28](https://github.com/webpack-contrib/schema-utils/issues/28)) ([69f1a81](https://github.com/webpack-contrib/schema-utils/commit/69f1a81))
* **package:** remove unnecessary `webpack` dependency ([#26](https://github.com/webpack-contrib/schema-utils/issues/26)) ([532eaa5](https://github.com/webpack-contrib/schema-utils/commit/532eaa5))

<a name="0.4.5"></a>
## [0.4.5](https://github.com/webpack-contrib/schema-utils/compare/v0.4.4...v0.4.5) (2018-02-13)

### Bug Fixes

* **CHANGELOG:** update broken links ([4483b9f](https://github.com/webpack-contrib/schema-utils/commit/4483b9f))
* **package:** update broken links ([f2494ba](https://github.com/webpack-contrib/schema-utils/commit/f2494ba))

<a name="0.4.4"></a>
## [0.4.4](https://github.com/webpack-contrib/schema-utils/compare/v0.4.3...v0.4.4) (2018-02-13)

### Bug Fixes

* **package:** update `dependencies` ([#22](https://github.com/webpack-contrib/schema-utils/issues/22)) ([3aecac6](https://github.com/webpack-contrib/schema-utils/commit/3aecac6))

<a name="0.4.3"></a>
## [0.4.3](https://github.com/webpack-contrib/schema-utils/compare/v0.4.2...v0.4.3) (2017-12-14)

### Bug Fixes

* **validateOptions:** throw `err` instead of `process.exit(1)` ([#17](https://github.com/webpack-contrib/schema-utils/issues/17)) ([c595eda](https://github.com/webpack-contrib/schema-utils/commit/c595eda))
* **ValidationError:** never return `this` in the ctor ([#16](https://github.com/webpack-contrib/schema-utils/issues/16)) ([c723791](https://github.com/webpack-contrib/schema-utils/commit/c723791))

<a name="0.4.2"></a>
## [0.4.2](https://github.com/webpack-contrib/schema-utils/compare/v0.4.1...v0.4.2) (2017-11-09)

### Bug Fixes

* **validateOptions:** catch `ValidationError` and handle it internally ([#15](https://github.com/webpack-contrib/schema-utils/issues/15)) ([9c5ef5e](https://github.com/webpack-contrib/schema-utils/commit/9c5ef5e))

<a name="0.4.1"></a>
## [0.4.1](https://github.com/webpack-contrib/schema-utils/compare/v0.4.0...v0.4.1) (2017-11-03)

### Bug Fixes

* **ValidationError:** use `Error.captureStackTrace` for `err.stack` handling ([#14](https://github.com/webpack-contrib/schema-utils/issues/14)) ([a6fb974](https://github.com/webpack-contrib/schema-utils/commit/a6fb974))

<a name="0.4.0"></a>
# [0.4.0](https://github.com/webpack-contrib/schema-utils/compare/v0.3.0...v0.4.0) (2017-10-28)

### Features

* add support for `typeof`, `instanceof` (`{Function\|RegExp}`) ([#10](https://github.com/webpack-contrib/schema-utils/issues/10)) ([9f01816](https://github.com/webpack-contrib/schema-utils/commit/9f01816))

<a name="0.3.0"></a>
# [0.3.0](https://github.com/webpack-contrib/schema-utils/compare/v0.2.1...v0.3.0) (2017-04-29)

### Features

* add ValidationError ([#8](https://github.com/webpack-contrib/schema-utils/issues/8)) ([d48f0fb](https://github.com/webpack-contrib/schema-utils/commit/d48f0fb))

<a name="0.2.1"></a>
## [0.2.1](https://github.com/webpack-contrib/schema-utils/compare/v0.2.0...v0.2.1) (2017-03-13)

### Bug Fixes

* Include .babelrc to `files` ([28f0363](https://github.com/webpack-contrib/schema-utils/commit/28f0363))
* Include source to `files` ([43b0f2f](https://github.com/webpack-contrib/schema-utils/commit/43b0f2f))

<a name="0.2.0"></a>
# [0.2.0](https://github.com/webpack-contrib/schema-utils/compare/v0.1.0...v0.2.0) (2017-03-12)

<a name="0.1.0"></a>
# 0.1.0 (2017-03-07)

### Features

* **validations:** add validateOptions module ([ae9b47b](https://github.com/webpack-contrib/schema-utils/commit/ae9b47b))

# Change Log

All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
20  desktop-operator/node_modules/@develar/schema-utils/LICENSE  generated vendored  Normal file
@@ -0,0 +1,20 @@
Copyright JS Foundation and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
276
desktop-operator/node_modules/@develar/schema-utils/README.md
generated
vendored
Normal file
@@ -0,0 +1,276 @@
<div align="center">
  <a href="http://json-schema.org">
    <img width="160" height="160"
      src="https://raw.githubusercontent.com/webpack-contrib/schema-utils/master/.github/assets/logo.png">
  </a>
  <a href="https://github.com/webpack/webpack">
    <img width="200" height="200"
      src="https://webpack.js.org/assets/icon-square-big.svg">
  </a>
</div>

[![npm][npm]][npm-url]
[![node][node]][node-url]
[![deps][deps]][deps-url]
[![tests][tests]][tests-url]
[![coverage][cover]][cover-url]
[![chat][chat]][chat-url]
[![size][size]][size-url]

# schema-utils

Package for validating options in loaders and plugins.

## Getting Started

To begin, you'll need to install `schema-utils`:

```console
npm install schema-utils
```

## API

**schema.json**

```json
{
  "type": "object",
  "properties": {
    "option": {
      "type": ["boolean"]
    }
  },
  "additionalProperties": false
}
```

```js
import schema from './path/to/schema.json';
import validate from 'schema-utils';

const options = { option: true };
const configuration = { name: 'Loader Name/Plugin Name/Name' };

validate(schema, options, configuration);
```

### `schema`

Type: `String`

JSON schema.

A simple example of a schema:

```json
{
  "type": "object",
  "properties": {
    "name": {
      "description": "This is a description of the option.",
      "type": "string"
    }
  },
  "additionalProperties": false
}
```

### `options`

Type: `Object`

Object with options.

```js
validate(
  schema,
  {
    name: 123,
  },
  { name: 'MyPlugin' }
);
```

### `configuration`

Allows you to configure the validator.

There is an alternative way to configure the `name` and `baseDataPath` options: via the `title` property in the schema.
For example:

```json
{
  "title": "My Loader options",
  "type": "object",
  "properties": {
    "name": {
      "description": "This is a description of the option.",
      "type": "string"
    }
  },
  "additionalProperties": false
}
```

The last word is used for the `baseDataPath` option; the remaining words are used for the `name` option.
Based on the example above, the `name` option equals `My Loader` and the `baseDataPath` option equals `options`.
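The word-splitting rule described above can be sketched as a standalone helper (`splitTitle` is a hypothetical illustration, not part of the library's API):

```javascript
// Hypothetical sketch of the title rule: the last word of `title`
// becomes `baseDataPath`, the remaining words become `name`.
function splitTitle(title) {
  const words = title.trim().split(/\s+/);
  return {
    name: words.slice(0, -1).join(' '),
    baseDataPath: words[words.length - 1],
  };
}

console.log(splitTitle('My Loader options'));
// { name: 'My Loader', baseDataPath: 'options' }
```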
#### `name`

Type: `String`
Default: `"Object"`

Allows setting the name shown in validation errors.

```js
validate(schema, options, { name: 'MyPlugin' });
```

```shell
Invalid configuration object. MyPlugin has been initialised using a configuration object that does not match the API schema.
- configuration.optionName should be a integer.
```

#### `baseDataPath`

Type: `String`
Default: `"configuration"`

Allows setting the base data path shown in validation errors.

```js
validate(schema, options, { name: 'MyPlugin', baseDataPath: 'options' });
```

```shell
Invalid options object. MyPlugin has been initialised using an options object that does not match the API schema.
- options.optionName should be a integer.
```

#### `postFormatter`

Type: `Function`
Default: `undefined`

Allows reformatting errors.

```js
validate(schema, options, {
  name: 'MyPlugin',
  postFormatter: (formattedError, error) => {
    if (error.keyword === 'type') {
      return `${formattedError}\nAdditional Information.`;
    }

    return formattedError;
  },
});
```

```shell
Invalid options object. MyPlugin has been initialized using an options object that does not match the API schema.
- options.optionName should be a integer.
Additional Information.
```

## Examples

**schema.json**

```json
{
  "type": "object",
  "properties": {
    "name": {
      "type": "string"
    },
    "test": {
      "anyOf": [
        { "type": "array" },
        { "type": "string" },
        { "instanceof": "RegExp" }
      ]
    },
    "transform": {
      "instanceof": "Function"
    },
    "sourceMap": {
      "type": "boolean"
    }
  },
  "additionalProperties": false
}
```

### `Loader`

```js
import { getOptions } from 'loader-utils';
import validateOptions from 'schema-utils';

import schema from 'path/to/schema.json';

function loader(src, map) {
  const options = getOptions(this) || {};

  validateOptions(schema, options, {
    name: 'Loader Name',
    baseDataPath: 'options',
  });

  // Code...
}

export default loader;
```

### `Plugin`

```js
import validateOptions from 'schema-utils';

import schema from 'path/to/schema.json';

class Plugin {
  constructor(options) {
    validateOptions(schema, options, {
      name: 'Plugin Name',
      baseDataPath: 'options',
    });

    this.options = options;
  }

  apply(compiler) {
    // Code...
  }
}

export default Plugin;
```

## Contributing

Please take a moment to read our contributing guidelines if you haven't yet done so.

[CONTRIBUTING](./.github/CONTRIBUTING.md)

## License

[MIT](./LICENSE)

[npm]: https://img.shields.io/npm/v/schema-utils.svg
[npm-url]: https://npmjs.com/package/schema-utils
[node]: https://img.shields.io/node/v/schema-utils.svg
[node-url]: https://nodejs.org
[deps]: https://david-dm.org/webpack/schema-utils.svg
[deps-url]: https://david-dm.org/webpack/schema-utils
[tests]: https://dev.azure.com/webpack/schema-utils/_apis/build/status/webpack.schema-utils?branchName=master
[tests-url]: https://dev.azure.com/webpack/schema-utils/_build/latest?definitionId=9&branchName=master
[cover]: https://codecov.io/gh/webpack/schema-utils/branch/master/graph/badge.svg
[cover-url]: https://codecov.io/gh/webpack/schema-utils
[chat]: https://badges.gitter.im/webpack/webpack.svg
[chat-url]: https://gitter.im/webpack/webpack
[size]: https://packagephobia.now.sh/badge?p=schema-utils
[size-url]: https://packagephobia.now.sh/result?p=schema-utils
115
desktop-operator/node_modules/@develar/schema-utils/declarations/ValidationError.d.ts
generated
vendored
Normal file
@@ -0,0 +1,115 @@
export default ValidationError;
export type JSONSchema6 = import('json-schema').JSONSchema6;
export type JSONSchema7 = import('json-schema').JSONSchema7;
export type Schema =
  | import('json-schema').JSONSchema4
  | import('json-schema').JSONSchema6
  | import('json-schema').JSONSchema7;
export type ValidationErrorConfiguration = {
  name?: string | undefined;
  baseDataPath?: string | undefined;
  postFormatter?: import('./validate').PostFormatter | undefined;
};
export type PostFormatter = (
  formattedError: string,
  error: import('ajv').ErrorObject & {
    children?: import('ajv').ErrorObject[] | undefined;
  }
) => string;
export type SchemaUtilErrorObject = import('ajv').ErrorObject & {
  children?: import('ajv').ErrorObject[] | undefined;
};
export type SPECIFICITY = number;
declare class ValidationError extends Error {
  /**
   * @param {Array<SchemaUtilErrorObject>} errors
   * @param {Schema} schema
   * @param {ValidationErrorConfiguration} configuration
   */
  constructor(
    errors: (import('ajv').ErrorObject & {
      children?: import('ajv').ErrorObject[] | undefined;
    })[],
    schema:
      | import('json-schema').JSONSchema4
      | import('json-schema').JSONSchema6
      | import('json-schema').JSONSchema7,
    configuration?: import('./validate').ValidationErrorConfiguration
  );
  /** @type {Array<SchemaUtilErrorObject>} */
  errors: Array<SchemaUtilErrorObject>;
  /** @type {Schema} */
  schema: Schema;
  /** @type {string} */
  headerName: string;
  /** @type {string} */
  baseDataPath: string;
  /** @type {PostFormatter | null} */
  postFormatter: PostFormatter | null;
  /**
   * @param {string} path
   * @returns {Schema}
   */
  getSchemaPart(
    path: string
  ):
    | import('json-schema').JSONSchema4
    | import('json-schema').JSONSchema6
    | import('json-schema').JSONSchema7;
  /**
   * @param {Schema} schema
   * @param {Array<Object>} prevSchemas
   * @returns {string}
   */
  formatSchema(
    schema:
      | import('json-schema').JSONSchema4
      | import('json-schema').JSONSchema6
      | import('json-schema').JSONSchema7,
    prevSchemas?: Object[]
  ): string;
  /**
   * @param {Schema=} schemaPart
   * @param {(boolean | Array<string>)=} additionalPath
   * @param {boolean=} needDot
   * @returns {string}
   */
  getSchemaPartText(
    schemaPart?:
      | import('json-schema').JSONSchema4
      | import('json-schema').JSONSchema6
      | import('json-schema').JSONSchema7
      | undefined,
    additionalPath?: boolean | string[] | undefined,
    needDot?: boolean | undefined
  ): string;
  /**
   * @param {Schema=} schemaPart
   * @returns {string}
   */
  getSchemaPartDescription(
    schemaPart?:
      | import('json-schema').JSONSchema4
      | import('json-schema').JSONSchema6
      | import('json-schema').JSONSchema7
      | undefined
  ): string;
  /**
   * @param {SchemaUtilErrorObject} error
   * @returns {string}
   */
  formatValidationError(
    error: import('ajv').ErrorObject & {
      children?: import('ajv').ErrorObject[] | undefined;
    }
  ): string;
  /**
   * @param {Array<SchemaUtilErrorObject>} errors
   * @returns {string}
   */
  formatValidationErrors(
    errors: (import('ajv').ErrorObject & {
      children?: import('ajv').ErrorObject[] | undefined;
    })[]
  ): string;
}
2
desktop-operator/node_modules/@develar/schema-utils/declarations/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,2 @@
declare const _exports: typeof import('./validate').default;
export = _exports;
13
desktop-operator/node_modules/@develar/schema-utils/declarations/keywords/absolutePath.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
export default addAbsolutePathKeyword;
export type Ajv = import('ajv').Ajv;
export type SchemaUtilErrorObject = import('ajv').ErrorObject & {
  children?: import('ajv').ErrorObject[] | undefined;
};
/**
 *
 * @param {Ajv} ajv
 * @returns {Ajv}
 */
declare function addAbsolutePathKeyword(
  ajv: import('ajv').Ajv
): import('ajv').Ajv;
43
desktop-operator/node_modules/@develar/schema-utils/declarations/util/Range.d.ts
generated
vendored
Normal file
@@ -0,0 +1,43 @@
export = Range;
/**
 * @typedef {[number, boolean]} RangeValue
 */
/**
 * @callback RangeValueCallback
 * @param {RangeValue} rangeValue
 * @returns {boolean}
 */
declare class Range {
  /** @type {Array<RangeValue>} */
  _left: Array<RangeValue>;
  /** @type {Array<RangeValue>} */
  _right: Array<RangeValue>;
  /**
   * @param {number} value
   * @param {boolean=} exclusive
   */
  left(value: number, exclusive?: boolean | undefined): void;
  /**
   * @param {number} value
   * @param {boolean=} exclusive
   */
  right(value: number, exclusive?: boolean | undefined): void;
  /**
   * @param {boolean} logic is not logic applied
   * @return {string} "smart" range string representation
   */
  format(logic?: boolean): string;
}
declare namespace Range {
  export {
    getOperator,
    formatRight,
    formatLeft,
    formatRange,
    getRangeValue,
    RangeValue,
    RangeValueCallback,
  };
}
type RangeValue = [number, boolean];
type RangeValueCallback = (rangeValue: [number, boolean]) => boolean;
43
desktop-operator/node_modules/@develar/schema-utils/declarations/validate.d.ts
generated
vendored
Normal file
@@ -0,0 +1,43 @@
export default validate;
export type JSONSchema4 = import('json-schema').JSONSchema4;
export type JSONSchema6 = import('json-schema').JSONSchema6;
export type JSONSchema7 = import('json-schema').JSONSchema7;
export type ErrorObject = Ajv.ErrorObject;
export type Schema =
  | import('json-schema').JSONSchema4
  | import('json-schema').JSONSchema6
  | import('json-schema').JSONSchema7;
export type SchemaUtilErrorObject = Ajv.ErrorObject & {
  children?: Ajv.ErrorObject[] | undefined;
};
export type PostFormatter = (
  formattedError: string,
  error: Ajv.ErrorObject & {
    children?: Ajv.ErrorObject[] | undefined;
  }
) => string;
export type ValidationErrorConfiguration = {
  name?: string | undefined;
  baseDataPath?: string | undefined;
  postFormatter?: PostFormatter | undefined;
};
/**
 * @param {Schema} schema
 * @param {Array<object> | object} options
 * @param {ValidationErrorConfiguration=} configuration
 * @returns {void}
 */
declare function validate(
  schema:
    | import('json-schema').JSONSchema4
    | import('json-schema').JSONSchema6
    | import('json-schema').JSONSchema7,
  options: any,
  configuration?: ValidationErrorConfiguration | undefined
): void;
declare namespace validate {
  export { ValidationError };
  export { ValidationError as ValidateError };
}
import Ajv from 'ajv';
import ValidationError from './ValidationError';
1286
desktop-operator/node_modules/@develar/schema-utils/dist/ValidationError.js
generated
vendored
Normal file
File diff suppressed because it is too large
5
desktop-operator/node_modules/@develar/schema-utils/dist/index.js
generated
vendored
Normal file
@@ -0,0 +1,5 @@
"use strict";

const validate = require('./validate');

module.exports = validate.default;
91
desktop-operator/node_modules/@develar/schema-utils/dist/keywords/absolutePath.js
generated
vendored
Normal file
@@ -0,0 +1,91 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = void 0;

/** @typedef {import("ajv").Ajv} Ajv */

/** @typedef {import("../validate").SchemaUtilErrorObject} SchemaUtilErrorObject */

/**
 *
 * @param {string} data
 * @param {object} schema
 * @param {string} message
 * @returns {object} // Todo `returns` should be `SchemaUtilErrorObject`
 */
function errorMessage(message, schema, data) {
  return {
    keyword: 'absolutePath',
    params: {
      absolutePath: data
    },
    message,
    parentSchema: schema
  };
}
/**
 * @param {boolean} shouldBeAbsolute
 * @param {object} schema
 * @param {string} data
 * @returns {object}
 */


function getErrorFor(shouldBeAbsolute, schema, data) {
  const message = shouldBeAbsolute ? `The provided value ${JSON.stringify(data)} is not an absolute path!` : `A relative path is expected. However, the provided value ${JSON.stringify(data)} is an absolute path!`;
  return errorMessage(message, schema, data);
}
/**
 *
 * @param {Ajv} ajv
 * @returns {Ajv}
 */


function addAbsolutePathKeyword(ajv) {
  ajv.addKeyword('absolutePath', {
    errors: true,
    type: 'string',

    compile(schema, parentSchema) {
      /**
       * @param {any} data
       * @returns {boolean}
       */
      function callback(data) {
        let passes = true;
        const isExclamationMarkPresent = data.includes('!');

        if (isExclamationMarkPresent) {
          callback.errors = [errorMessage(`The provided value ${JSON.stringify(data)} contains exclamation mark (!) which is not allowed because it's reserved for loader syntax.`, parentSchema, data)];
          passes = false;
        } // ?:[A-Za-z]:\\ - Windows absolute path
        // \\\\ - Windows network absolute path
        // \/ - Unix-like OS absolute path


        const isCorrectAbsolutePath = schema === /^(?:[A-Za-z]:(\\|\/)|\\\\|\/)/.test(data);

        if (!isCorrectAbsolutePath) {
          callback.errors = [getErrorFor(schema, parentSchema, data)];
          passes = false;
        }

        return passes;
      }
      /** @type {null | Array<SchemaUtilErrorObject>}*/


      callback.errors = [];
      return callback;
    }

  });
  return ajv;
}

var _default = addAbsolutePathKeyword;
exports.default = _default;
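The comment inside `callback` documents the three path shapes the pattern accepts; as a quick standalone sanity check (not part of the package), the regex can be exercised directly:

```javascript
// Standalone check of the absolute-path pattern used by the keyword above.
const ABSOLUTE_PATH = /^(?:[A-Za-z]:(\\|\/)|\\\\|\/)/;

console.log(ABSOLUTE_PATH.test('C:\\project\\src'));  // Windows drive path
console.log(ABSOLUTE_PATH.test('\\\\server\\share')); // Windows network path
console.log(ABSOLUTE_PATH.test('/usr/local/bin'));    // Unix-like path
console.log(ABSOLUTE_PATH.test('src/index.js'));      // relative path
```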
163
desktop-operator/node_modules/@develar/schema-utils/dist/util/Range.js
generated
vendored
Normal file
@@ -0,0 +1,163 @@
"use strict";

/**
 * @typedef {[number, boolean]} RangeValue
 */

/**
 * @callback RangeValueCallback
 * @param {RangeValue} rangeValue
 * @returns {boolean}
 */
class Range {
  /**
   * @param {"left" | "right"} side
   * @param {boolean} exclusive
   * @returns {">" | ">=" | "<" | "<="}
   */
  static getOperator(side, exclusive) {
    if (side === 'left') {
      return exclusive ? '>' : '>=';
    }

    return exclusive ? '<' : '<=';
  }
  /**
   * @param {number} value
   * @param {boolean} logic is not logic applied
   * @param {boolean} exclusive is range exclusive
   * @returns {string}
   */


  static formatRight(value, logic, exclusive) {
    if (logic === false) {
      return Range.formatLeft(value, !logic, !exclusive);
    }

    return `should be ${Range.getOperator('right', exclusive)} ${value}`;
  }
  /**
   * @param {number} value
   * @param {boolean} logic is not logic applied
   * @param {boolean} exclusive is range exclusive
   * @returns {string}
   */


  static formatLeft(value, logic, exclusive) {
    if (logic === false) {
      return Range.formatRight(value, !logic, !exclusive);
    }

    return `should be ${Range.getOperator('left', exclusive)} ${value}`;
  }
  /**
   * @param {number} start left side value
   * @param {number} end right side value
   * @param {boolean} startExclusive is range exclusive from left side
   * @param {boolean} endExclusive is range exclusive from right side
   * @param {boolean} logic is not logic applied
   * @returns {string}
   */


  static formatRange(start, end, startExclusive, endExclusive, logic) {
    let result = 'should be';
    result += ` ${Range.getOperator(logic ? 'left' : 'right', logic ? startExclusive : !startExclusive)} ${start} `;
    result += logic ? 'and' : 'or';
    result += ` ${Range.getOperator(logic ? 'right' : 'left', logic ? endExclusive : !endExclusive)} ${end}`;
    return result;
  }
  /**
   * @param {Array<RangeValue>} values
   * @param {boolean} logic is not logic applied
   * @return {RangeValue} computed value and it's exclusive flag
   */


  static getRangeValue(values, logic) {
    let minMax = logic ? Infinity : -Infinity;
    let j = -1;
    const predicate = logic ?
    /** @type {RangeValueCallback} */
    ([value]) => value <= minMax :
    /** @type {RangeValueCallback} */
    ([value]) => value >= minMax;

    for (let i = 0; i < values.length; i++) {
      if (predicate(values[i])) {
        [minMax] = values[i];
        j = i;
      }
    }

    if (j > -1) {
      return values[j];
    }

    return [Infinity, true];
  }

  constructor() {
    /** @type {Array<RangeValue>} */
    this._left = [];
    /** @type {Array<RangeValue>} */

    this._right = [];
  }
  /**
   * @param {number} value
   * @param {boolean=} exclusive
   */


  left(value, exclusive = false) {
    this._left.push([value, exclusive]);
  }
  /**
   * @param {number} value
   * @param {boolean=} exclusive
   */


  right(value, exclusive = false) {
    this._right.push([value, exclusive]);
  }
  /**
   * @param {boolean} logic is not logic applied
   * @return {string} "smart" range string representation
   */


  format(logic = true) {
    const [start, leftExclusive] = Range.getRangeValue(this._left, logic);
    const [end, rightExclusive] = Range.getRangeValue(this._right, !logic);

    if (!Number.isFinite(start) && !Number.isFinite(end)) {
      return '';
    }

    const realStart = leftExclusive ? start + 1 : start;
    const realEnd = rightExclusive ? end - 1 : end; // e.g. 5 < x < 7, 5 < x <= 6, 6 <= x <= 6

    if (realStart === realEnd) {
      return `should be ${logic ? '' : '!'}= ${realStart}`;
    } // e.g. 4 < x < ∞


    if (Number.isFinite(start) && !Number.isFinite(end)) {
      return Range.formatLeft(start, logic, leftExclusive);
    } // e.g. ∞ < x < 4


    if (!Number.isFinite(start) && Number.isFinite(end)) {
      return Range.formatRight(end, logic, rightExclusive);
    }

    return Range.formatRange(start, end, leftExclusive, rightExclusive, logic);
  }

}

module.exports = Range;
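Every message `Range` builds is composed from the per-side operator table in `getOperator`; a standalone mirror of that table (re-declared here rather than importing the vendored file) makes the mapping explicit:

```javascript
// Mirrors Range.getOperator from the file above: left bounds produce
// ">"/">=", right bounds produce "<"/"<=", exclusivity drops the "=".
function getOperator(side, exclusive) {
  if (side === 'left') {
    return exclusive ? '>' : '>=';
  }
  return exclusive ? '<' : '<=';
}

console.log(getOperator('left', false));  // ">="
console.log(getOperator('left', true));   // ">"
console.log(getOperator('right', false)); // "<="
console.log(getOperator('right', true));  // "<"
```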
154
desktop-operator/node_modules/@develar/schema-utils/dist/validate.js
generated
vendored
Normal file
@@ -0,0 +1,154 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = void 0;

var _ajv = _interopRequireDefault(require("ajv"));

var _ajvKeywords = _interopRequireDefault(require("ajv-keywords"));

var _absolutePath = _interopRequireDefault(require("./keywords/absolutePath"));

var _ValidationError = _interopRequireDefault(require("./ValidationError"));

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

/** @typedef {import("json-schema").JSONSchema4} JSONSchema4 */

/** @typedef {import("json-schema").JSONSchema6} JSONSchema6 */

/** @typedef {import("json-schema").JSONSchema7} JSONSchema7 */

/** @typedef {import("ajv").ErrorObject} ErrorObject */

/** @typedef {(JSONSchema4 | JSONSchema6 | JSONSchema7)} Schema */

/** @typedef {ErrorObject & { children?: Array<ErrorObject>}} SchemaUtilErrorObject */

/**
 * @callback PostFormatter
 * @param {string} formattedError
 * @param {SchemaUtilErrorObject} error
 * @returns {string}
 */

/**
 * @typedef {Object} ValidationErrorConfiguration
 * @property {string=} name
 * @property {string=} baseDataPath
 * @property {PostFormatter=} postFormatter
 */
const ajv = new _ajv.default({
  allErrors: true,
  verbose: true,
  $data: true,
  coerceTypes: true
});
(0, _ajvKeywords.default)(ajv, ['instanceof', 'formatMinimum', 'formatMaximum', 'patternRequired']); // Custom keywords

(0, _absolutePath.default)(ajv);
/**
 * @param {Schema} schema
 * @param {Array<object> | object} options
 * @param {ValidationErrorConfiguration=} configuration
 * @returns {void}
 */

function validate(schema, options, configuration) {
  let errors = [];

  if (Array.isArray(options)) {
    errors = Array.from(options).map(nestedOptions => validateObject(schema, nestedOptions));
    errors.forEach((list, idx) => {
      const applyPrefix =
      /**
       * @param {SchemaUtilErrorObject} error
       */
      error => {
        // eslint-disable-next-line no-param-reassign
        error.dataPath = `[${idx}]${error.dataPath}`;

        if (error.children) {
          error.children.forEach(applyPrefix);
        }
      };

      list.forEach(applyPrefix);
    });
    errors = errors.reduce((arr, items) => arr.concat(items), []);
  } else {
    errors = validateObject(schema, options);
  }

  if (errors.length > 0) {
    throw new _ValidationError.default(errors, schema, configuration);
  }
}
/**
 * @param {Schema} schema
 * @param {Array<object> | object} options
 * @returns {Array<SchemaUtilErrorObject>}
 */


function validateObject(schema, options) {
  const compiledSchema = ajv.compile(schema);
  const valid = compiledSchema(options);

  if (!compiledSchema.errors) {
    return [];
  }

  return valid ? [] : filterErrors(compiledSchema.errors);
}
/**
 * @param {Array<ErrorObject>} errors
 * @returns {Array<SchemaUtilErrorObject>}
 */


function filterErrors(errors) {
  /** @type {Array<SchemaUtilErrorObject>} */
  let newErrors = [];

  for (const error of
  /** @type {Array<SchemaUtilErrorObject>} */
  errors) {
    const {
      dataPath
    } = error;
    /** @type {Array<SchemaUtilErrorObject>} */

    let children = [];
    newErrors = newErrors.filter(oldError => {
      if (oldError.dataPath.includes(dataPath)) {
        if (oldError.children) {
          children = children.concat(oldError.children.slice(0));
        } // eslint-disable-next-line no-undefined, no-param-reassign


        oldError.children = undefined;
        children.push(oldError);
        return false;
      }

      return true;
    });

    if (children.length) {
      error.children = children;
    }

    newErrors.push(error);
  }

  return newErrors;
} // TODO change after resolve https://github.com/microsoft/TypeScript/issues/34994


validate.ValidationError = _ValidationError.default;
validate.ValidateError = _ValidationError.default;
var _default = validate;
exports.default = _default;
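When `options` is an array, `validate` above prefixes each error's `dataPath` with the item index before flattening the per-item error lists; a minimal standalone illustration of that prefixing step (`prefixErrors` is an extracted sketch, not an export of the package):

```javascript
// Standalone sketch of the `[idx]` dataPath prefixing and flattening
// performed by validate() for array options.
function prefixErrors(errorLists) {
  errorLists.forEach((list, idx) => {
    list.forEach((error) => {
      error.dataPath = `[${idx}]${error.dataPath}`;
    });
  });
  return errorLists.reduce((arr, items) => arr.concat(items), []);
}

const flat = prefixErrors([
  [{ dataPath: '.name' }],
  [{ dataPath: '.sourceMap' }],
]);
console.log(flat.map((e) => e.dataPath)); // [ '[0].name', '[1].sourceMap' ]
```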
78
desktop-operator/node_modules/@develar/schema-utils/package.json
generated
vendored
Normal file
@@ -0,0 +1,78 @@
{
  "name": "@develar/schema-utils",
  "version": "2.6.5",
  "description": "webpack Validation Utils",
  "license": "MIT",
  "repository": "webpack/schema-utils",
  "author": "webpack Contrib (https://github.com/webpack-contrib)",
  "homepage": "https://github.com/webpack/schema-utils",
  "bugs": "https://github.com/webpack/schema-utils/issues",
  "funding": {
    "type": "opencollective",
    "url": "https://opencollective.com/webpack"
  },
  "main": "dist/index.js",
  "types": "declarations/index.d.ts",
  "engines": {
    "node": ">= 8.9.0"
  },
  "scripts": {
    "start": "npm run build -- -w",
    "clean": "del-cli dist declarations",
    "prebuild": "npm run clean",
    "build:types": "tsc --declaration --emitDeclarationOnly --outDir declarations && prettier \"declarations/**/*.ts\" --write",
    "build:code": "cross-env NODE_ENV=production babel src -d dist --copy-files",
    "build": "npm-run-all -p \"build:**\"",
    "commitlint": "commitlint --from=master",
    "security": "npm audit",
    "lint:prettier": "prettier \"{**/*,*}.{js,json,md,yml,css,ts}\" --list-different",
    "lint:js": "eslint --cache .",
    "lint:types": "tsc --pretty --noEmit",
    "lint": "npm-run-all -l -p \"lint:**\"",
    "test:only": "cross-env NODE_ENV=test jest",
    "test:watch": "npm run test:only -- --watch",
    "test:coverage": "npm run test:only -- --collectCoverageFrom=\"src/**/*.js\" --coverage",
    "pretest": "npm run lint",
    "test": "npm run test:coverage",
    "prepare": "npm run build",
    "release": "standard-version",
    "defaults": "webpack-defaults"
  },
  "files": [
    "dist",
    "declarations"
  ],
  "dependencies": {
    "ajv": "^6.12.0",
    "ajv-keywords": "^3.4.1"
  },
  "devDependencies": {
    "@babel/cli": "^7.8.3",
    "@babel/core": "^7.8.3",
    "@babel/preset-env": "^7.8.3",
    "@commitlint/cli": "^8.3.5",
    "@commitlint/config-conventional": "^8.3.4",
    "@types/json-schema": "^7.0.4",
    "@webpack-contrib/defaults": "^6.3.0",
    "@webpack-contrib/eslint-config-webpack": "^3.0.0",
    "babel-jest": "^24.9.0",
    "commitlint-azure-pipelines-cli": "^1.0.3",
    "cross-env": "^6.0.3",
    "del": "^5.1.0",
    "del-cli": "^3.0.0",
    "eslint": "^6.8.0",
    "eslint-config-prettier": "^6.9.0",
    "eslint-plugin-import": "^2.20.0",
    "husky": "^4.0.10",
    "jest": "^24.9.0",
    "jest-junit": "^10.0.0",
    "lint-staged": "^9.5.0",
    "npm-run-all": "^4.1.5",
    "prettier": "^1.19.1",
    "standard-version": "^7.0.1",
    "typescript": "^3.7.5"
  },
  "keywords": [
    "webpack"
  ]
}
20	desktop-operator/node_modules/@electron/asar/LICENSE.md	generated vendored Normal file
@@ -0,0 +1,20 @@
Copyright (c) 2014 GitHub Inc.

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
208	desktop-operator/node_modules/@electron/asar/README.md	generated vendored Normal file
@@ -0,0 +1,208 @@
# @electron/asar - Electron Archive

[](https://github.com/electron/asar/actions/workflows/test.yml)
[](https://npmjs.org/package/@electron/asar)

Asar is a simple extensive archive format, it works like `tar` that concatenates
all files together without compression, while having random access support.

## Features

* Support random access
* Use JSON to store files' information
* Very easy to write a parser

## Command line utility

### Install

This module requires Node 10 or later.

```bash
$ npm install --engine-strict @electron/asar
```

### Usage

```bash
$ asar --help

  Usage: asar [options] [command]

  Commands:

    pack|p <dir> <output>
       create asar archive

    list|l <archive>
       list files of asar archive

    extract-file|ef <archive> <filename>
       extract one file from archive

    extract|e <archive> <dest>
       extract archive

  Options:

    -h, --help     output usage information
    -V, --version  output the version number

```

#### Excluding multiple resources from being packed

Given:
```
    app
(a) ├── x1
(b) ├── x2
(c) ├── y3
(d) │   ├── x1
(e) │   └── z1
(f) │       └── x2
(g) └── z4
(h)     └── w1
```

Exclude: a, b
```bash
$ asar pack app app.asar --unpack-dir "{x1,x2}"
```

Exclude: a, b, d, f
```bash
$ asar pack app app.asar --unpack-dir "**/{x1,x2}"
```

Exclude: a, b, d, f, h
```bash
$ asar pack app app.asar --unpack-dir "{**/x1,**/x2,z4/w1}"
```

## Using programmatically

### Example

```javascript
const asar = require('@electron/asar');

const src = 'some/path/';
const dest = 'name.asar';

await asar.createPackage(src, dest);
console.log('done.');
```

Please note that there is currently **no** error handling provided!

### Transform
You can pass in a `transform` option, that is a function, which either returns
nothing, or a `stream.Transform`. The latter will be used on files that will be
in the `.asar` file to transform them (e.g. compress).

```javascript
const asar = require('@electron/asar');

const src = 'some/path/';
const dest = 'name.asar';

function transform (filename) {
  return new CustomTransformStream()
}

await asar.createPackageWithOptions(src, dest, { transform: transform });
console.log('done.');
```

## Format

Asar uses [Pickle][pickle] to safely serialize binary value to file.

The format of asar is very flat:

```
| UInt32: header_size | String: header | Bytes: file1 | ... | Bytes: file42 |
```

The `header_size` and `header` are serialized with [Pickle][pickle] class, and
`header_size`'s [Pickle][pickle] object is 8 bytes.

The `header` is a JSON string, and the `header_size` is the size of `header`'s
`Pickle` object.

Structure of `header` is something like this:

```json
{
  "files": {
    "tmp": {
      "files": {}
    },
    "usr" : {
      "files": {
        "bin": {
          "files": {
            "ls": {
              "offset": "0",
              "size": 100,
              "executable": true,
              "integrity": {
                "algorithm": "SHA256",
                "hash": "...",
                "blockSize": 1024,
                "blocks": ["...", "..."]
              }
            },
            "cd": {
              "offset": "100",
              "size": 100,
              "executable": true,
              "integrity": {
                "algorithm": "SHA256",
                "hash": "...",
                "blockSize": 1024,
                "blocks": ["...", "..."]
              }
            }
          }
        }
      }
    },
    "etc": {
      "files": {
        "hosts": {
          "offset": "200",
          "size": 32,
          "integrity": {
            "algorithm": "SHA256",
            "hash": "...",
            "blockSize": 1024,
            "blocks": ["...", "..."]
          }
        }
      }
    }
  }
}
```

`offset` and `size` records the information to read the file from archive, the
`offset` starts from 0 so you have to manually add the size of `header_size` and
`header` to the `offset` to get the real offset of the file.

`offset` is a UINT64 number represented in string, because there is no way to
precisely represent UINT64 in JavaScript `Number`. `size` is a JavaScript
`Number` that is no larger than `Number.MAX_SAFE_INTEGER`, which has a value of
`9007199254740991` and is about 8PB in size. We didn't store `size` in UINT64
because file size in Node.js is represented as `Number` and it is not safe to
convert `Number` to UINT64.

`integrity` is an object consisting of a few keys:
* A hashing `algorithm`, currently only `SHA256` is supported.
* A hex encoded `hash` value representing the hash of the entire file.
* An array of hex encoded hashes for the `blocks` of the file. i.e. for a blockSize of 4KB this array contains the hash of every block if you split the file into N 4KB blocks.
* A integer value `blockSize` representing the size in bytes of each block in the `blocks` hashes above

[pickle]: https://chromium.googlesource.com/chromium/src/+/main/base/pickle.h
82	desktop-operator/node_modules/@electron/asar/bin/asar.js	generated vendored Executable file
@@ -0,0 +1,82 @@
#!/usr/bin/env node

var packageJSON = require('../package.json')
var splitVersion = function (version) { return version.split('.').map(function (part) { return Number(part) }) }
var requiredNodeVersion = splitVersion(packageJSON.engines.node.slice(2))
var actualNodeVersion = splitVersion(process.versions.node)

if (actualNodeVersion[0] < requiredNodeVersion[0] || (actualNodeVersion[0] === requiredNodeVersion[0] && actualNodeVersion[1] < requiredNodeVersion[1])) {
  console.error('CANNOT RUN WITH NODE ' + process.versions.node)
  console.error('asar requires Node ' + packageJSON.engines.node + '.')
  process.exit(1)
}

// Not consts so that this file can load in Node < 4.0
var asar = require('../lib/asar')
var program = require('commander')

program.version('v' + packageJSON.version)
  .description('Manipulate asar archive files')

program.command('pack <dir> <output>')
  .alias('p')
  .description('create asar archive')
  .option('--ordering <file path>', 'path to a text file for ordering contents')
  .option('--unpack <expression>', 'do not pack files matching glob <expression>')
  .option('--unpack-dir <expression>', 'do not pack dirs matching glob <expression> or starting with literal <expression>')
  .option('--exclude-hidden', 'exclude hidden files')
  .action(function (dir, output, options) {
    options = {
      unpack: options.unpack,
      unpackDir: options.unpackDir,
      ordering: options.ordering,
      version: options.sv,
      arch: options.sa,
      builddir: options.sb,
      dot: !options.excludeHidden
    }
    asar.createPackageWithOptions(dir, output, options).catch(error => {
      console.error(error)
      process.exit(1)
    })
  })

program.command('list <archive>')
  .alias('l')
  .description('list files of asar archive')
  .option('-i, --is-pack', 'each file in the asar is pack or unpack')
  .action(function (archive, options) {
    options = {
      isPack: options.isPack
    }
    var files = asar.listPackage(archive, options)
    for (var i in files) {
      console.log(files[i])
    }
  })

program.command('extract-file <archive> <filename>')
  .alias('ef')
  .description('extract one file from archive')
  .action(function (archive, filename) {
    require('fs').writeFileSync(require('path').basename(filename),
      asar.extractFile(archive, filename))
  })

program.command('extract <archive> <dest>')
  .alias('e')
  .description('extract archive')
  .action(function (archive, dest) {
    asar.extractAll(archive, dest)
  })

program.command('*')
  .action(function (_cmd, args) {
    console.log('asar: \'%s\' is not an asar command. See \'asar --help\'.', args[0])
  })

program.parse(process.argv)

if (program.args.length === 0) {
  program.help()
}
96	desktop-operator/node_modules/@electron/asar/lib/asar.d.ts	generated vendored Normal file
@@ -0,0 +1,96 @@
import { FilesystemDirectoryEntry, FilesystemEntry, FilesystemLinkEntry } from './filesystem';
import * as disk from './disk';
import { CrawledFileType } from './crawlfs';
import { IOptions } from './types/glob';
export declare function createPackage(src: string, dest: string): Promise<NodeJS.WritableStream>;
export type CreateOptions = {
    dot?: boolean;
    globOptions?: IOptions;
    /**
     * Path to a file containing the list of relative filepaths relative to `src` and the specific order they should be inserted into the asar.
     * Formats allowed below:
     * filepath
     * : filepath
     * <anything>:filepath
     */
    ordering?: string;
    pattern?: string;
    transform?: (filePath: string) => NodeJS.ReadWriteStream | void;
    unpack?: string;
    unpackDir?: string;
};
export declare function createPackageWithOptions(src: string, dest: string, options: CreateOptions): Promise<NodeJS.WritableStream>;
/**
 * Create an ASAR archive from a list of filenames.
 *
 * @param src - Base path. All files are relative to this.
 * @param dest - Archive filename (& path).
 * @param filenames - List of filenames relative to src.
 * @param [metadata] - Object with filenames as keys and {type='directory|file|link', stat: fs.stat} as values. (Optional)
 * @param [options] - Options passed to `createPackageWithOptions`.
 */
export declare function createPackageFromFiles(src: string, dest: string, filenames: string[], metadata?: disk.InputMetadata, options?: CreateOptions): Promise<NodeJS.WritableStream>;
export type AsarStream = {
    /**
    Relative path to the file or directory from within the archive
    */
    path: string;
    /**
    Function that returns a read stream for a file.
    Note: this is called multiple times per "file", so a new NodeJS.ReadableStream needs to be created each time
    */
    streamGenerator: () => NodeJS.ReadableStream;
    /**
    Whether the file/link should be unpacked
    */
    unpacked: boolean;
    stat: CrawledFileType['stat'];
};
export type AsarDirectory = Pick<AsarStream, 'path' | 'unpacked'> & {
    type: 'directory';
};
export type AsarSymlinkStream = AsarStream & {
    type: 'link';
    symlink: string;
};
export type AsarFileStream = AsarStream & {
    type: 'file';
};
export type AsarStreamType = AsarDirectory | AsarFileStream | AsarSymlinkStream;
/**
 * Create an ASAR archive from a list of streams.
 *
 * @param dest - Archive filename (& path).
 * @param streams - List of streams to be piped in-memory into asar filesystem. Insertion order is preserved.
 */
export declare function createPackageFromStreams(dest: string, streams: AsarStreamType[]): Promise<import("fs").WriteStream>;
export declare function statFile(archivePath: string, filename: string, followLinks?: boolean): FilesystemEntry;
export declare function getRawHeader(archivePath: string): disk.ArchiveHeader;
export interface ListOptions {
    isPack: boolean;
}
export declare function listPackage(archivePath: string, options: ListOptions): string[];
export declare function extractFile(archivePath: string, filename: string, followLinks?: boolean): Buffer;
export declare function extractAll(archivePath: string, dest: string): void;
export declare function uncache(archivePath: string): boolean;
export declare function uncacheAll(): void;
export { EntryMetadata } from './filesystem';
export { InputMetadata, DirectoryRecord, FileRecord, ArchiveHeader } from './disk';
export type InputMetadataType = 'directory' | 'file' | 'link';
export type DirectoryMetadata = FilesystemDirectoryEntry;
export type FileMetadata = FilesystemEntry;
export type LinkMetadata = FilesystemLinkEntry;
declare const _default: {
    createPackage: typeof createPackage;
    createPackageWithOptions: typeof createPackageWithOptions;
    createPackageFromFiles: typeof createPackageFromFiles;
    createPackageFromStreams: typeof createPackageFromStreams;
    statFile: typeof statFile;
    getRawHeader: typeof getRawHeader;
    listPackage: typeof listPackage;
    extractFile: typeof extractFile;
    extractAll: typeof extractAll;
    uncache: typeof uncache;
    uncacheAll: typeof uncacheAll;
};
export default _default;
337	desktop-operator/node_modules/@electron/asar/lib/asar.js	generated vendored Normal file
@@ -0,0 +1,337 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.createPackage = createPackage;
exports.createPackageWithOptions = createPackageWithOptions;
exports.createPackageFromFiles = createPackageFromFiles;
exports.createPackageFromStreams = createPackageFromStreams;
exports.statFile = statFile;
exports.getRawHeader = getRawHeader;
exports.listPackage = listPackage;
exports.extractFile = extractFile;
exports.extractAll = extractAll;
exports.uncache = uncache;
exports.uncacheAll = uncacheAll;
const path = __importStar(require("path"));
const minimatch_1 = __importDefault(require("minimatch"));
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const filesystem_1 = require("./filesystem");
const disk = __importStar(require("./disk"));
const crawlfs_1 = require("./crawlfs");
/**
 * Whether a directory should be excluded from packing due to the `--unpack-dir" option.
 *
 * @param dirPath - directory path to check
 * @param pattern - literal prefix [for backward compatibility] or glob pattern
 * @param unpackDirs - Array of directory paths previously marked as unpacked
 */
function isUnpackedDir(dirPath, pattern, unpackDirs) {
    if (dirPath.startsWith(pattern) || (0, minimatch_1.default)(dirPath, pattern)) {
        if (!unpackDirs.includes(dirPath)) {
            unpackDirs.push(dirPath);
        }
        return true;
    }
    else {
        return unpackDirs.some((unpackDir) => dirPath.startsWith(unpackDir) && !path.relative(unpackDir, dirPath).startsWith('..'));
    }
}
async function createPackage(src, dest) {
    return createPackageWithOptions(src, dest, {});
}
async function createPackageWithOptions(src, dest, options) {
    const globOptions = options.globOptions ? options.globOptions : {};
    globOptions.dot = options.dot === undefined ? true : options.dot;
    const pattern = src + (options.pattern ? options.pattern : '/**/*');
    const [filenames, metadata] = await (0, crawlfs_1.crawl)(pattern, globOptions);
    return createPackageFromFiles(src, dest, filenames, metadata, options);
}
/**
 * Create an ASAR archive from a list of filenames.
 *
 * @param src - Base path. All files are relative to this.
 * @param dest - Archive filename (& path).
 * @param filenames - List of filenames relative to src.
 * @param [metadata] - Object with filenames as keys and {type='directory|file|link', stat: fs.stat} as values. (Optional)
 * @param [options] - Options passed to `createPackageWithOptions`.
 */
async function createPackageFromFiles(src, dest, filenames, metadata = {}, options = {}) {
    src = path.normalize(src);
    dest = path.normalize(dest);
    filenames = filenames.map(function (filename) {
        return path.normalize(filename);
    });
    const filesystem = new filesystem_1.Filesystem(src);
    const files = [];
    const links = [];
    const unpackDirs = [];
    let filenamesSorted = [];
    if (options.ordering) {
        const orderingFiles = (await wrapped_fs_1.default.readFile(options.ordering))
            .toString()
            .split('\n')
            .map((line) => {
            if (line.includes(':')) {
                line = line.split(':').pop();
            }
            line = line.trim();
            if (line.startsWith('/')) {
                line = line.slice(1);
            }
            return line;
        });
        const ordering = [];
        for (const file of orderingFiles) {
            const pathComponents = file.split(path.sep);
            let str = src;
            for (const pathComponent of pathComponents) {
                str = path.join(str, pathComponent);
                ordering.push(str);
            }
        }
        let missing = 0;
        const total = filenames.length;
        for (const file of ordering) {
            if (!filenamesSorted.includes(file) && filenames.includes(file)) {
                filenamesSorted.push(file);
            }
        }
        for (const file of filenames) {
            if (!filenamesSorted.includes(file)) {
                filenamesSorted.push(file);
                missing += 1;
            }
        }
        console.log(`Ordering file has ${((total - missing) / total) * 100}% coverage.`);
    }
    else {
        filenamesSorted = filenames;
    }
    const handleFile = async function (filename) {
        if (!metadata[filename]) {
            const fileType = await (0, crawlfs_1.determineFileType)(filename);
            if (!fileType) {
                throw new Error('Unknown file type for file: ' + filename);
            }
            metadata[filename] = fileType;
        }
        const file = metadata[filename];
        const shouldUnpackPath = function (relativePath, unpack, unpackDir) {
            let shouldUnpack = false;
            if (unpack) {
                shouldUnpack = (0, minimatch_1.default)(filename, unpack, { matchBase: true });
            }
            if (!shouldUnpack && unpackDir) {
                shouldUnpack = isUnpackedDir(relativePath, unpackDir, unpackDirs);
            }
            return shouldUnpack;
        };
        let shouldUnpack;
        switch (file.type) {
            case 'directory':
                shouldUnpack = shouldUnpackPath(path.relative(src, filename), undefined, options.unpackDir);
                filesystem.insertDirectory(filename, shouldUnpack);
                break;
            case 'file':
                shouldUnpack = shouldUnpackPath(path.relative(src, path.dirname(filename)), options.unpack, options.unpackDir);
                files.push({ filename, unpack: shouldUnpack });
                return filesystem.insertFile(filename, () => wrapped_fs_1.default.createReadStream(filename), shouldUnpack, file, options);
            case 'link':
                shouldUnpack = shouldUnpackPath(path.relative(src, filename), options.unpack, options.unpackDir);
                links.push({ filename, unpack: shouldUnpack });
                filesystem.insertLink(filename, shouldUnpack);
                break;
        }
        return Promise.resolve();
    };
    const insertsDone = async function () {
        await wrapped_fs_1.default.mkdirp(path.dirname(dest));
        return disk.writeFilesystem(dest, filesystem, { files, links }, metadata);
    };
    const names = filenamesSorted.slice();
    const next = async function (name) {
        if (!name) {
            return insertsDone();
        }
        await handleFile(name);
        return next(names.shift());
    };
    return next(names.shift());
}
/**
 * Create an ASAR archive from a list of streams.
 *
 * @param dest - Archive filename (& path).
 * @param streams - List of streams to be piped in-memory into asar filesystem. Insertion order is preserved.
 */
async function createPackageFromStreams(dest, streams) {
    // We use an ambiguous root `src` since we're piping directly from a stream and the `filePath` for the stream is already relative to the src/root
    const src = '.';
    const filesystem = new filesystem_1.Filesystem(src);
    const files = [];
    const links = [];
    const handleFile = async function (stream) {
        const { path: destinationPath, type } = stream;
        const filename = path.normalize(destinationPath);
        switch (type) {
            case 'directory':
                filesystem.insertDirectory(filename, stream.unpacked);
                break;
            case 'file':
                files.push({
                    filename,
                    streamGenerator: stream.streamGenerator,
                    link: undefined,
                    mode: stream.stat.mode,
                    unpack: stream.unpacked,
                });
                return filesystem.insertFile(filename, stream.streamGenerator, stream.unpacked, {
                    type: 'file',
                    stat: stream.stat,
                });
            case 'link':
                links.push({
                    filename,
                    streamGenerator: stream.streamGenerator,
                    link: stream.symlink,
                    mode: stream.stat.mode,
                    unpack: stream.unpacked,
                });
                filesystem.insertLink(filename, stream.unpacked, path.dirname(filename), stream.symlink, src);
                break;
        }
        return Promise.resolve();
    };
    const insertsDone = async function () {
        await wrapped_fs_1.default.mkdirp(path.dirname(dest));
        return disk.streamFilesystem(dest, filesystem, { files, links });
    };
    const streamQueue = streams.slice();
    const next = async function (stream) {
        if (!stream) {
            return insertsDone();
        }
        await handleFile(stream);
        return next(streamQueue.shift());
    };
    return next(streamQueue.shift());
}
function statFile(archivePath, filename, followLinks = true) {
    const filesystem = disk.readFilesystemSync(archivePath);
    return filesystem.getFile(filename, followLinks);
}
function getRawHeader(archivePath) {
    return disk.readArchiveHeaderSync(archivePath);
}
function listPackage(archivePath, options) {
    return disk.readFilesystemSync(archivePath).listFiles(options);
}
function extractFile(archivePath, filename, followLinks = true) {
    const filesystem = disk.readFilesystemSync(archivePath);
    const fileInfo = filesystem.getFile(filename, followLinks);
    if ('link' in fileInfo || 'files' in fileInfo) {
        throw new Error('Expected to find file at: ' + filename + ' but found a directory or link');
    }
    return disk.readFileSync(filesystem, filename, fileInfo);
}
function extractAll(archivePath, dest) {
    const filesystem = disk.readFilesystemSync(archivePath);
    const filenames = filesystem.listFiles();
    // under windows just extract links as regular files
    const followLinks = process.platform === 'win32';
    // create destination directory
    wrapped_fs_1.default.mkdirpSync(dest);
    const extractionErrors = [];
    for (const fullPath of filenames) {
        // Remove leading slash
        const filename = fullPath.substr(1);
        const destFilename = path.join(dest, filename);
        const file = filesystem.getFile(filename, followLinks);
        if (path.relative(dest, destFilename).startsWith('..')) {
            throw new Error(`${fullPath}: file "${destFilename}" writes out of the package`);
        }
        if ('files' in file) {
            // it's a directory, create it and continue with the next entry
            wrapped_fs_1.default.mkdirpSync(destFilename);
        }
        else if ('link' in file) {
            // it's a symlink, create a symlink
            const linkSrcPath = path.dirname(path.join(dest, file.link));
            const linkDestPath = path.dirname(destFilename);
            const relativePath = path.relative(linkDestPath, linkSrcPath);
            // try to delete output file, because we can't overwrite a link
            try {
                wrapped_fs_1.default.unlinkSync(destFilename);
            }
            catch (_a) { }
            const linkTo = path.join(relativePath, path.basename(file.link));
            if (path.relative(dest, linkSrcPath).startsWith('..')) {
                throw new Error(`${fullPath}: file "${file.link}" links out of the package to "${linkSrcPath}"`);
            }
            wrapped_fs_1.default.symlinkSync(linkTo, destFilename);
        }
        else {
            // it's a file, try to extract it
            try {
                const content = disk.readFileSync(filesystem, filename, file);
                wrapped_fs_1.default.writeFileSync(destFilename, content);
                if (file.executable) {
                    wrapped_fs_1.default.chmodSync(destFilename, '755');
                }
            }
            catch (e) {
                extractionErrors.push(e);
            }
        }
    }
    if (extractionErrors.length) {
        throw new Error('Unable to extract some files:\n\n' +
            extractionErrors.map((error) => error.stack).join('\n\n'));
    }
}
function uncache(archivePath) {
    return disk.uncacheFilesystem(archivePath);
}
function uncacheAll() {
    disk.uncacheAll();
}
// Export everything in default, too
exports.default = {
    createPackage,
    createPackageWithOptions,
    createPackageFromFiles,
    createPackageFromStreams,
    statFile,
    getRawHeader,
    listPackage,
    extractFile,
    extractAll,
    uncache,
    uncacheAll,
};
//# sourceMappingURL=asar.js.map
1	desktop-operator/node_modules/@electron/asar/lib/asar.js.map	generated vendored Normal file
File diff suppressed because one or more lines are too long
12	desktop-operator/node_modules/@electron/asar/lib/crawlfs.d.ts	generated vendored Normal file
@@ -0,0 +1,12 @@
import { Stats } from 'fs';
import { IOptions } from './types/glob';
export type CrawledFileType = {
    type: 'file' | 'directory' | 'link';
    stat: Pick<Stats, 'mode' | 'size'>;
    transformed?: {
        path: string;
        stat: Stats;
    };
};
export declare function determineFileType(filename: string): Promise<CrawledFileType | null>;
export declare function crawl(dir: string, options: IOptions): Promise<readonly [string[], Record<string, CrawledFileType>]>;
79
desktop-operator/node_modules/@electron/asar/lib/crawlfs.js
generated
vendored
Normal file
79
desktop-operator/node_modules/@electron/asar/lib/crawlfs.js
generated
vendored
Normal file
@@ -0,0 +1,79 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
      desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.determineFileType = determineFileType;
exports.crawl = crawl;
const util_1 = require("util");
const glob_1 = require("glob");
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const path = __importStar(require("path"));
const glob = (0, util_1.promisify)(glob_1.glob);
async function determineFileType(filename) {
    const stat = await wrapped_fs_1.default.lstat(filename);
    if (stat.isFile()) {
        return { type: 'file', stat };
    }
    else if (stat.isDirectory()) {
        return { type: 'directory', stat };
    }
    else if (stat.isSymbolicLink()) {
        return { type: 'link', stat };
    }
    return null;
}
async function crawl(dir, options) {
    const metadata = {};
    const crawled = await glob(dir, options);
    const results = await Promise.all(crawled.map(async (filename) => [filename, await determineFileType(filename)]));
    const links = [];
    const filenames = results
        .map(([filename, type]) => {
        if (type) {
            metadata[filename] = type;
            if (type.type === 'link')
                links.push(filename);
        }
        return filename;
    })
        .filter((filename) => {
        // Newer glob can return files inside symlinked directories; to avoid
        // those appearing in archives we need to manually exclude them here
        const exactLinkIndex = links.findIndex((link) => filename === link);
        return links.every((link, index) => {
            if (index === exactLinkIndex) {
                return true;
            }
            const isFileWithinSymlinkDir = filename.startsWith(link);
            // symlink may point outside the directory: https://github.com/electron/asar/issues/303
            const relativePath = path.relative(link, path.dirname(filename));
            return !isFileWithinSymlinkDir || relativePath.startsWith('..');
        });
    });
    return [filenames, metadata];
}
//# sourceMappingURL=crawlfs.js.map
1
desktop-operator/node_modules/@electron/asar/lib/crawlfs.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"crawlfs.js","sourceRoot":"","sources":["../src/crawlfs.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;AAmBA,8CAUC;AAED,sBA8BC;AA7DD,+BAAiC;AACjC,+BAAqC;AAErC,8DAA8B;AAE9B,2CAA6B;AAG7B,MAAM,IAAI,GAAG,IAAA,gBAAS,EAAC,WAAK,CAAC,CAAC;AAWvB,KAAK,UAAU,iBAAiB,CAAC,QAAgB;IACtD,MAAM,IAAI,GAAG,MAAM,oBAAE,CAAC,KAAK,CAAC,QAAQ,CAAC,CAAC;IACtC,IAAI,IAAI,CAAC,MAAM,EAAE,EAAE,CAAC;QAClB,OAAO,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,CAAC;IAChC,CAAC;SAAM,IAAI,IAAI,CAAC,WAAW,EAAE,EAAE,CAAC;QAC9B,OAAO,EAAE,IAAI,EAAE,WAAW,EAAE,IAAI,EAAE,CAAC;IACrC,CAAC;SAAM,IAAI,IAAI,CAAC,cAAc,EAAE,EAAE,CAAC;QACjC,OAAO,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,CAAC;IAChC,CAAC;IACD,OAAO,IAAI,CAAC;AACd,CAAC;AAEM,KAAK,UAAU,KAAK,CAAC,GAAW,EAAE,OAAiB;IACxD,MAAM,QAAQ,GAAoC,EAAE,CAAC;IACrD,MAAM,OAAO,GAAG,MAAM,IAAI,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;IACzC,MAAM,OAAO,GAAG,MAAM,OAAO,CAAC,GAAG,CAC/B,OAAO,CAAC,GAAG,CAAC,KAAK,EAAE,QAAQ,EAAE,EAAE,CAAC,CAAC,QAAQ,EAAE,MAAM,iBAAiB,CAAC,QAAQ,CAAC,CAAU,CAAC,CACxF,CAAC;IACF,MAAM,KAAK,GAAa,EAAE,CAAC;IAC3B,MAAM,SAAS,GAAG,OAAO;SACtB,GAAG,CAAC,CAAC,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE,EAAE;QACxB,IAAI,IAAI,EAAE,CAAC;YACT,QAAQ,CAAC,QAAQ,CAAC,GAAG,IAAI,CAAC;YAC1B,IAAI,IAAI,CAAC,IAAI,KAAK,MAAM;gBAAE,KAAK,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QACjD,CAAC;QACD,OAAO,QAAQ,CAAC;IAClB,CAAC,CAAC;SACD,MAAM,CAAC,CAAC,QAAQ,EAAE,EAAE;QACnB,qEAAqE;QACrE,qEAAqE;QACrE,MAAM,cAAc,GAAG,KAAK,CAAC,SAAS,CAAC,CAAC,IAAI,EAAE,EAAE,CAAC,QAAQ,KAAK,IAAI,CAAC,CAAC;QACpE,OAAO,KAAK,CAAC,KAAK,CAAC,CAAC,IAAI,EAAE,KAAK,EAAE,EAAE;YACjC,IAAI,KAAK,KAAK,cAAc,EAAE,CAAC;gBAC7B,OAAO,IAAI,CAAC;YACd,CAAC;YACD,MAAM,sBAAsB,GAAG,QAAQ,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;YACzD,uFAAuF;YACvF,MAAM,YAAY,GAAG,IAAI,CAAC,QAAQ,CAAC,IAAI,EAAE,IAAI,CAAC,OAAO,CAAC,QAAQ,CAAC,CAAC,CAAC;YACjE,OAAO,CAAC,sBAAsB,IAAI,YAAY,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;QAClE,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;IACL,OAAO,CAAC,SAAS,EAAE,QAAQ,CAAU,CAAC;AACxC,CAAC"}
44
desktop-operator/node_modules/@electron/asar/lib/disk.d.ts
generated
vendored
Normal file
@@ -0,0 +1,44 @@
import { Filesystem, FilesystemFileEntry } from './filesystem';
import { CrawledFileType } from './crawlfs';
import { Stats } from 'fs';
export type InputMetadata = {
    [property: string]: CrawledFileType;
};
export type BasicFilesArray = {
    filename: string;
    unpack: boolean;
}[];
export type BasicStreamArray = {
    filename: string;
    streamGenerator: () => NodeJS.ReadableStream;
    mode: Stats['mode'];
    unpack: boolean;
    link: string | undefined;
}[];
export type FilesystemFilesAndLinks<T extends BasicFilesArray | BasicStreamArray> = {
    files: T;
    links: T;
};
export declare function writeFilesystem(dest: string, filesystem: Filesystem, lists: FilesystemFilesAndLinks<BasicFilesArray>, metadata: InputMetadata): Promise<NodeJS.WritableStream>;
export declare function streamFilesystem(dest: string, filesystem: Filesystem, lists: FilesystemFilesAndLinks<BasicStreamArray>): Promise<import("fs").WriteStream>;
export interface FileRecord extends FilesystemFileEntry {
    integrity: {
        hash: string;
        algorithm: 'SHA256';
        blocks: string[];
        blockSize: number;
    };
}
export type DirectoryRecord = {
    files: Record<string, DirectoryRecord | FileRecord>;
};
export type ArchiveHeader = {
    header: DirectoryRecord;
    headerString: string;
    headerSize: number;
};
export declare function readArchiveHeaderSync(archivePath: string): ArchiveHeader;
export declare function readFilesystemSync(archivePath: string): Filesystem;
export declare function uncacheFilesystem(archivePath: string): boolean;
export declare function uncacheAll(): void;
export declare function readFileSync(filesystem: Filesystem, filename: string, info: FilesystemFileEntry): Buffer;
219
desktop-operator/node_modules/@electron/asar/lib/disk.js
generated
vendored
Normal file
@@ -0,0 +1,219 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
      desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __asyncValues = (this && this.__asyncValues) || function (o) {
    if (!Symbol.asyncIterator) throw new TypeError("Symbol.asyncIterator is not defined.");
    var m = o[Symbol.asyncIterator], i;
    return m ? m.call(o) : (o = typeof __values === "function" ? __values(o) : o[Symbol.iterator](), i = {}, verb("next"), verb("throw"), verb("return"), i[Symbol.asyncIterator] = function () { return this; }, i);
    function verb(n) { i[n] = o[n] && function (v) { return new Promise(function (resolve, reject) { v = o[n](v), settle(resolve, reject, v.done, v.value); }); }; }
    function settle(resolve, reject, d, v) { Promise.resolve(v).then(function(v) { resolve({ value: v, done: d }); }, reject); }
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.writeFilesystem = writeFilesystem;
exports.streamFilesystem = streamFilesystem;
exports.readArchiveHeaderSync = readArchiveHeaderSync;
exports.readFilesystemSync = readFilesystemSync;
exports.uncacheFilesystem = uncacheFilesystem;
exports.uncacheAll = uncacheAll;
exports.readFileSync = readFileSync;
const path = __importStar(require("path"));
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const pickle_1 = require("./pickle");
const filesystem_1 = require("./filesystem");
const util_1 = require("util");
const stream = __importStar(require("stream"));
const pipeline = (0, util_1.promisify)(stream.pipeline);
let filesystemCache = Object.create(null);
async function copyFile(dest, src, filename) {
    const srcFile = path.join(src, filename);
    const targetFile = path.join(dest, filename);
    const [content, stats] = await Promise.all([
        wrapped_fs_1.default.readFile(srcFile),
        wrapped_fs_1.default.stat(srcFile),
        wrapped_fs_1.default.mkdirp(path.dirname(targetFile)),
    ]);
    return wrapped_fs_1.default.writeFile(targetFile, content, { mode: stats.mode });
}
async function streamTransformedFile(stream, outStream) {
    return new Promise((resolve, reject) => {
        stream.pipe(outStream, { end: false });
        stream.on('error', reject);
        stream.on('end', () => resolve());
    });
}
const writeFileListToStream = async function (dest, filesystem, out, lists, metadata) {
    const { files, links } = lists;
    for (const file of files) {
        if (file.unpack) {
            // the file should not be packed into archive
            const filename = path.relative(filesystem.getRootPath(), file.filename);
            await copyFile(`${dest}.unpacked`, filesystem.getRootPath(), filename);
        }
        else {
            const transformed = metadata[file.filename].transformed;
            const stream = wrapped_fs_1.default.createReadStream(transformed ? transformed.path : file.filename);
            await streamTransformedFile(stream, out);
        }
    }
    for (const file of links.filter((f) => f.unpack)) {
        // the symlink needs to be recreated outside in .unpacked
        const filename = path.relative(filesystem.getRootPath(), file.filename);
        const link = await wrapped_fs_1.default.readlink(file.filename);
        await createSymlink(dest, filename, link);
    }
    return out.end();
};
async function writeFilesystem(dest, filesystem, lists, metadata) {
    const out = await createFilesystemWriteStream(filesystem, dest);
    return writeFileListToStream(dest, filesystem, out, lists, metadata);
}
async function streamFilesystem(dest, filesystem, lists) {
    var _a, e_1, _b, _c;
    const out = await createFilesystemWriteStream(filesystem, dest);
    const { files, links } = lists;
    try {
        for (var _d = true, files_1 = __asyncValues(files), files_1_1; files_1_1 = await files_1.next(), _a = files_1_1.done, !_a; _d = true) {
            _c = files_1_1.value;
            _d = false;
            const file = _c;
            // the file should not be packed into archive
            if (file.unpack) {
                const targetFile = path.join(`${dest}.unpacked`, file.filename);
                await wrapped_fs_1.default.mkdirp(path.dirname(targetFile));
                const writeStream = wrapped_fs_1.default.createWriteStream(targetFile, { mode: file.mode });
                await pipeline(file.streamGenerator(), writeStream);
            }
            else {
                await streamTransformedFile(file.streamGenerator(), out);
            }
        }
    }
    catch (e_1_1) { e_1 = { error: e_1_1 }; }
    finally {
        try {
            if (!_d && !_a && (_b = files_1.return)) await _b.call(files_1);
        }
        finally { if (e_1) throw e_1.error; }
    }
    for (const file of links.filter((f) => f.unpack && f.link)) {
        // the symlink needs to be recreated outside in .unpacked
        await createSymlink(dest, file.filename, file.link);
    }
    return out.end();
}
function readArchiveHeaderSync(archivePath) {
    const fd = wrapped_fs_1.default.openSync(archivePath, 'r');
    let size;
    let headerBuf;
    try {
        const sizeBuf = Buffer.alloc(8);
        if (wrapped_fs_1.default.readSync(fd, sizeBuf, 0, 8, null) !== 8) {
            throw new Error('Unable to read header size');
        }
        const sizePickle = pickle_1.Pickle.createFromBuffer(sizeBuf);
        size = sizePickle.createIterator().readUInt32();
        headerBuf = Buffer.alloc(size);
        if (wrapped_fs_1.default.readSync(fd, headerBuf, 0, size, null) !== size) {
            throw new Error('Unable to read header');
        }
    }
    finally {
        wrapped_fs_1.default.closeSync(fd);
    }
    const headerPickle = pickle_1.Pickle.createFromBuffer(headerBuf);
    const header = headerPickle.createIterator().readString();
    return { headerString: header, header: JSON.parse(header), headerSize: size };
}
function readFilesystemSync(archivePath) {
    if (!filesystemCache[archivePath]) {
        const header = readArchiveHeaderSync(archivePath);
        const filesystem = new filesystem_1.Filesystem(archivePath);
        filesystem.setHeader(header.header, header.headerSize);
        filesystemCache[archivePath] = filesystem;
    }
    return filesystemCache[archivePath];
}
function uncacheFilesystem(archivePath) {
    if (filesystemCache[archivePath]) {
        filesystemCache[archivePath] = undefined;
        return true;
    }
    return false;
}
function uncacheAll() {
    filesystemCache = {};
}
function readFileSync(filesystem, filename, info) {
    let buffer = Buffer.alloc(info.size);
    if (info.size <= 0) {
        return buffer;
    }
    if (info.unpacked) {
        // it's an unpacked file, copy it.
        buffer = wrapped_fs_1.default.readFileSync(path.join(`${filesystem.getRootPath()}.unpacked`, filename));
    }
    else {
        // Node throws an exception when reading 0 bytes into a 0-size buffer,
        // so we short-circuit the read in this case.
        const fd = wrapped_fs_1.default.openSync(filesystem.getRootPath(), 'r');
        try {
            const offset = 8 + filesystem.getHeaderSize() + parseInt(info.offset);
            wrapped_fs_1.default.readSync(fd, buffer, 0, info.size, offset);
        }
        finally {
            wrapped_fs_1.default.closeSync(fd);
        }
    }
    return buffer;
}
async function createFilesystemWriteStream(filesystem, dest) {
    const headerPickle = pickle_1.Pickle.createEmpty();
    headerPickle.writeString(JSON.stringify(filesystem.getHeader()));
    const headerBuf = headerPickle.toBuffer();
    const sizePickle = pickle_1.Pickle.createEmpty();
    sizePickle.writeUInt32(headerBuf.length);
    const sizeBuf = sizePickle.toBuffer();
    const out = wrapped_fs_1.default.createWriteStream(dest);
    await new Promise((resolve, reject) => {
        out.on('error', reject);
        out.write(sizeBuf);
        return out.write(headerBuf, () => resolve());
    });
    return out;
}
async function createSymlink(dest, filepath, link) {
    // if symlink is within subdirectories, then we need to recreate dir structure
    await wrapped_fs_1.default.mkdirp(path.join(`${dest}.unpacked`, path.dirname(filepath)));
    // create symlink within unpacked dir
    await wrapped_fs_1.default.symlink(link, path.join(`${dest}.unpacked`, filepath)).catch(async (error) => {
        if (error.code === 'EPERM' && error.syscall === 'symlink') {
            throw new Error('Could not create symlinks for unpacked assets. On Windows, consider activating Developer Mode to allow non-admin users to create symlinks by following the instructions at https://docs.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development.');
        }
        throw error;
    });
}
//# sourceMappingURL=disk.js.map
1
desktop-operator/node_modules/@electron/asar/lib/disk.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
44
desktop-operator/node_modules/@electron/asar/lib/filesystem.d.ts
generated
vendored
Normal file
@@ -0,0 +1,44 @@
import { FileIntegrity } from './integrity';
import { CrawledFileType } from './crawlfs';
export type EntryMetadata = {
    unpacked?: boolean;
};
export type FilesystemDirectoryEntry = {
    files: Record<string, FilesystemEntry>;
} & EntryMetadata;
export type FilesystemFileEntry = {
    unpacked: boolean;
    executable: boolean;
    offset: string;
    size: number;
    integrity: FileIntegrity;
} & EntryMetadata;
export type FilesystemLinkEntry = {
    link: string;
} & EntryMetadata;
export type FilesystemEntry = FilesystemDirectoryEntry | FilesystemFileEntry | FilesystemLinkEntry;
export declare class Filesystem {
    private src;
    private header;
    private headerSize;
    private offset;
    constructor(src: string);
    getRootPath(): string;
    getHeader(): FilesystemEntry;
    getHeaderSize(): number;
    setHeader(header: FilesystemEntry, headerSize: number): void;
    searchNodeFromDirectory(p: string): FilesystemEntry;
    searchNodeFromPath(p: string): FilesystemEntry;
    insertDirectory(p: string, shouldUnpack: boolean): Record<string, FilesystemEntry>;
    insertFile(p: string, streamGenerator: () => NodeJS.ReadableStream, shouldUnpack: boolean, file: CrawledFileType, options?: {
        transform?: (filePath: string) => NodeJS.ReadWriteStream | void;
    }): Promise<void>;
    insertLink(p: string, shouldUnpack: boolean, parentPath?: string, symlink?: string, // /var/tmp => /private/var
    src?: string): string;
    private resolveLink;
    listFiles(options?: {
        isPack: boolean;
    }): string[];
    getNode(p: string, followLinks?: boolean): FilesystemEntry;
    getFile(p: string, followLinks?: boolean): FilesystemEntry;
}
199
desktop-operator/node_modules/@electron/asar/lib/filesystem.js
generated
vendored
Normal file
@@ -0,0 +1,199 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
      desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.Filesystem = void 0;
const os = __importStar(require("os"));
const path = __importStar(require("path"));
const util_1 = require("util");
const stream = __importStar(require("stream"));
const integrity_1 = require("./integrity");
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const UINT32_MAX = 2 ** 32 - 1;
const pipeline = (0, util_1.promisify)(stream.pipeline);
class Filesystem {
    constructor(src) {
        this.src = path.resolve(src);
        this.header = { files: Object.create(null) };
        this.headerSize = 0;
        this.offset = BigInt(0);
    }
    getRootPath() {
        return this.src;
    }
    getHeader() {
        return this.header;
    }
    getHeaderSize() {
        return this.headerSize;
    }
    setHeader(header, headerSize) {
        this.header = header;
        this.headerSize = headerSize;
    }
    searchNodeFromDirectory(p) {
        let json = this.header;
        const dirs = p.split(path.sep);
        for (const dir of dirs) {
            if (dir !== '.') {
                if ('files' in json) {
                    if (!json.files[dir]) {
                        json.files[dir] = { files: Object.create(null) };
                    }
                    json = json.files[dir];
                }
                else {
                    throw new Error('Unexpected directory state while traversing: ' + p);
                }
            }
        }
        return json;
    }
    searchNodeFromPath(p) {
        p = path.relative(this.src, p);
        if (!p) {
            return this.header;
        }
        const name = path.basename(p);
        const node = this.searchNodeFromDirectory(path.dirname(p));
        if (!node.files) {
            node.files = Object.create(null);
        }
        if (!node.files[name]) {
            node.files[name] = Object.create(null);
        }
        return node.files[name];
    }
    insertDirectory(p, shouldUnpack) {
        const node = this.searchNodeFromPath(p);
        if (shouldUnpack) {
            node.unpacked = shouldUnpack;
        }
        node.files = node.files || Object.create(null);
        return node.files;
    }
    async insertFile(p, streamGenerator, shouldUnpack, file, options = {}) {
        const dirNode = this.searchNodeFromPath(path.dirname(p));
        const node = this.searchNodeFromPath(p);
        if (shouldUnpack || dirNode.unpacked) {
            node.size = file.stat.size;
            node.unpacked = true;
            node.integrity = await (0, integrity_1.getFileIntegrity)(streamGenerator());
            return Promise.resolve();
        }
        let size;
        const transformed = options.transform && options.transform(p);
        if (transformed) {
            const tmpdir = await wrapped_fs_1.default.mkdtemp(path.join(os.tmpdir(), 'asar-'));
            const tmpfile = path.join(tmpdir, path.basename(p));
            const out = wrapped_fs_1.default.createWriteStream(tmpfile);
            await pipeline(streamGenerator(), transformed, out);
            file.transformed = {
                path: tmpfile,
                stat: await wrapped_fs_1.default.lstat(tmpfile),
            };
            size = file.transformed.stat.size;
        }
        else {
            size = file.stat.size;
        }
        // JavaScript cannot precisely represent integers >= UINT32_MAX.
        if (size > UINT32_MAX) {
            throw new Error(`${p}: file size can not be larger than 4.2GB`);
        }
        node.size = size;
        node.offset = this.offset.toString();
        node.integrity = await (0, integrity_1.getFileIntegrity)(streamGenerator());
        if (process.platform !== 'win32' && file.stat.mode & 0o100) {
            node.executable = true;
        }
        this.offset += BigInt(size);
    }
    insertLink(p, shouldUnpack, parentPath = wrapped_fs_1.default.realpathSync(path.dirname(p)), symlink = wrapped_fs_1.default.readlinkSync(p), // /var/tmp => /private/var
    src = wrapped_fs_1.default.realpathSync(this.src)) {
        const link = this.resolveLink(src, parentPath, symlink);
        if (link.startsWith('..')) {
            throw new Error(`${p}: file "${link}" links out of the package`);
        }
        const node = this.searchNodeFromPath(p);
        const dirNode = this.searchNodeFromPath(path.dirname(p));
        if (shouldUnpack || dirNode.unpacked) {
            node.unpacked = true;
        }
        node.link = link;
        return link;
    }
    resolveLink(src, parentPath, symlink) {
        const target = path.join(parentPath, symlink);
        const link = path.relative(src, target);
        return link;
    }
    listFiles(options) {
        const files = [];
        const fillFilesFromMetadata = function (basePath, metadata) {
            if (!('files' in metadata)) {
                return;
            }
            for (const [childPath, childMetadata] of Object.entries(metadata.files)) {
                const fullPath = path.join(basePath, childPath);
                const packState = 'unpacked' in childMetadata && childMetadata.unpacked ? 'unpack' : 'pack ';
                files.push(options && options.isPack ? `${packState} : ${fullPath}` : fullPath);
                fillFilesFromMetadata(fullPath, childMetadata);
            }
        };
        fillFilesFromMetadata('/', this.header);
        return files;
    }
    getNode(p, followLinks = true) {
        const node = this.searchNodeFromDirectory(path.dirname(p));
        const name = path.basename(p);
        if ('link' in node && followLinks) {
            return this.getNode(path.join(node.link, name));
        }
        if (name) {
            return node.files[name];
        }
        else {
            return node;
        }
    }
    getFile(p, followLinks = true) {
        const info = this.getNode(p, followLinks);
        if (!info) {
            throw new Error(`"${p}" was not found in this archive`);
        }
        // if followLinks is false we don't resolve symlinks
        if ('link' in info && followLinks) {
            return this.getFile(info.link, followLinks);
        }
        else {
            return info;
        }
    }
}
exports.Filesystem = Filesystem;
//# sourceMappingURL=filesystem.js.map
1
desktop-operator/node_modules/@electron/asar/lib/filesystem.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
7
desktop-operator/node_modules/@electron/asar/lib/integrity.d.ts
generated
vendored
Normal file
@@ -0,0 +1,7 @@
export type FileIntegrity = {
    algorithm: 'SHA256';
    hash: string;
    blockSize: number;
    blocks: string[];
};
export declare function getFileIntegrity(inputFileStream: NodeJS.ReadableStream): Promise<FileIntegrity>;
75
desktop-operator/node_modules/@electron/asar/lib/integrity.js
generated
vendored
Normal file
@@ -0,0 +1,75 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
      desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getFileIntegrity = getFileIntegrity;
const crypto = __importStar(require("crypto"));
const stream = __importStar(require("stream"));
const util_1 = require("util");
const ALGORITHM = 'SHA256';
// 4MB default block size
const BLOCK_SIZE = 4 * 1024 * 1024;
const pipeline = (0, util_1.promisify)(stream.pipeline);
function hashBlock(block) {
    return crypto.createHash(ALGORITHM).update(block).digest('hex');
}
async function getFileIntegrity(inputFileStream) {
    const fileHash = crypto.createHash(ALGORITHM);
    const blockHashes = [];
    let currentBlockSize = 0;
    let currentBlock = [];
    await pipeline(inputFileStream, new stream.PassThrough({
        decodeStrings: false,
        transform(_chunk, encoding, callback) {
            fileHash.update(_chunk);
            function handleChunk(chunk) {
                const diffToSlice = Math.min(BLOCK_SIZE - currentBlockSize, chunk.byteLength);
                currentBlockSize += diffToSlice;
                currentBlock.push(chunk.slice(0, diffToSlice));
                if (currentBlockSize === BLOCK_SIZE) {
                    blockHashes.push(hashBlock(Buffer.concat(currentBlock)));
                    currentBlock = [];
                    currentBlockSize = 0;
                }
                if (diffToSlice < chunk.byteLength) {
                    handleChunk(chunk.slice(diffToSlice));
                }
            }
            handleChunk(_chunk);
            callback();
        },
        flush(callback) {
            blockHashes.push(hashBlock(Buffer.concat(currentBlock)));
            currentBlock = [];
            callback();
        },
    }));
    return {
        algorithm: ALGORITHM,
        hash: fileHash.digest('hex'),
        blockSize: BLOCK_SIZE,
        blocks: blockHashes,
    };
}
//# sourceMappingURL=integrity.js.map
1
desktop-operator/node_modules/@electron/asar/lib/integrity.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"integrity.js","sourceRoot":"","sources":["../src/integrity.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;AAsBA,4CA8CC;AApED,+CAAiC;AAEjC,+CAAiC;AACjC,+BAAiC;AAEjC,MAAM,SAAS,GAAG,QAAQ,CAAC;AAC3B,yBAAyB;AACzB,MAAM,UAAU,GAAG,CAAC,GAAG,IAAI,GAAG,IAAI,CAAC;AAEnC,MAAM,QAAQ,GAAG,IAAA,gBAAS,EAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;AAE5C,SAAS,SAAS,CAAC,KAAa;IAC9B,OAAO,MAAM,CAAC,UAAU,CAAC,SAAS,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;AAClE,CAAC;AASM,KAAK,UAAU,gBAAgB,CACpC,eAAsC;IAEtC,MAAM,QAAQ,GAAG,MAAM,CAAC,UAAU,CAAC,SAAS,CAAC,CAAC;IAE9C,MAAM,WAAW,GAAa,EAAE,CAAC;IACjC,IAAI,gBAAgB,GAAG,CAAC,CAAC;IACzB,IAAI,YAAY,GAAa,EAAE,CAAC;IAEhC,MAAM,QAAQ,CACZ,eAAe,EACf,IAAI,MAAM,CAAC,WAAW,CAAC;QACrB,aAAa,EAAE,KAAK;QACpB,SAAS,CAAC,MAAc,EAAE,QAAQ,EAAE,QAAQ;YAC1C,QAAQ,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC;YAExB,SAAS,WAAW,CAAC,KAAa;gBAChC,MAAM,WAAW,GAAG,IAAI,CAAC,GAAG,CAAC,UAAU,GAAG,gBAAgB,EAAE,KAAK,CAAC,UAAU,CAAC,CAAC;gBAC9E,gBAAgB,IAAI,WAAW,CAAC;gBAChC,YAAY,CAAC,IAAI,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,EAAE,WAAW,CAAC,CAAC,CAAC;gBAC/C,IAAI,gBAAgB,KAAK,UAAU,EAAE,CAAC;oBACpC,WAAW,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC;oBACzD,YAAY,GAAG,EAAE,CAAC;oBAClB,gBAAgB,GAAG,CAAC,CAAC;gBACvB,CAAC;gBACD,IAAI,WAAW,GAAG,KAAK,CAAC,UAAU,EAAE,CAAC;oBACnC,WAAW,CAAC,KAAK,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC,CAAC;gBACxC,CAAC;YACH,CAAC;YACD,WAAW,CAAC,MAAM,CAAC,CAAC;YACpB,QAAQ,EAAE,CAAC;QACb,CAAC;QACD,KAAK,CAAC,QAAQ;YACZ,WAAW,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC;YACzD,YAAY,GAAG,EAAE,CAAC;YAClB,QAAQ,EAAE,CAAC;QACb,CAAC;KACF,CAAC,CACH,CAAC;IAEF,OAAO;QACL,SAAS,EAAE,SAAS;QACpB,IAAI,EAAE,QAAQ,CAAC,MAAM,CAAC,KAAK,CAAC;QAC5B,SAAS,EAAE,UAAU;QACrB,MAAM,EAAE,WAAW;KACpB,CAAC;AACJ,CAAC"}
|
||||
46
desktop-operator/node_modules/@electron/asar/lib/pickle.d.ts
generated
vendored
Normal file
@@ -0,0 +1,46 @@
declare class PickleIterator {
    private payload;
    private payloadOffset;
    private readIndex;
    private endIndex;
    constructor(pickle: Pickle);
    readBool(): boolean;
    readInt(): number;
    readUInt32(): number;
    readInt64(): bigint;
    readUInt64(): bigint;
    readFloat(): number;
    readDouble(): number;
    readString(): string;
    readBytes(length: number): Buffer;
    readBytes<R, F extends (...args: any[]) => R>(length: number, method: F): R;
    getReadPayloadOffsetAndAdvance(length: number): number;
    advance(size: number): void;
}
export declare class Pickle {
    private header;
    private headerSize;
    private capacityAfterHeader;
    private writeOffset;
    private constructor();
    static createEmpty(): Pickle;
    static createFromBuffer(buffer: Buffer): Pickle;
    getHeader(): Buffer;
    getHeaderSize(): number;
    createIterator(): PickleIterator;
    toBuffer(): Buffer;
    writeBool(value: boolean): boolean;
    writeInt(value: number): boolean;
    writeUInt32(value: number): boolean;
    writeInt64(value: number): boolean;
    writeUInt64(value: number): boolean;
    writeFloat(value: number): boolean;
    writeDouble(value: number): boolean;
    writeString(value: string): boolean;
    setPayloadSize(payloadSize: number): number;
    getPayloadSize(): number;
    writeBytes(data: string, length: number, method?: undefined): boolean;
    writeBytes(data: number | BigInt, length: number, method: Function): boolean;
    resize(newCapacity: number): void;
}
export {};
199
desktop-operator/node_modules/@electron/asar/lib/pickle.js
generated
vendored
Normal file
@@ -0,0 +1,199 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Pickle = void 0;
// sizeof(T).
const SIZE_INT32 = 4;
const SIZE_UINT32 = 4;
const SIZE_INT64 = 8;
const SIZE_UINT64 = 8;
const SIZE_FLOAT = 4;
const SIZE_DOUBLE = 8;
// The allocation granularity of the payload.
const PAYLOAD_UNIT = 64;
// Largest JS number.
const CAPACITY_READ_ONLY = 9007199254740992;
// Aligns 'i' by rounding it up to the next multiple of 'alignment'.
const alignInt = function (i, alignment) {
    return i + ((alignment - (i % alignment)) % alignment);
};
// PickleIterator reads data from a Pickle. The Pickle object must remain valid
// while the PickleIterator object is in use.
class PickleIterator {
    constructor(pickle) {
        this.payload = pickle.getHeader();
        this.payloadOffset = pickle.getHeaderSize();
        this.readIndex = 0;
        this.endIndex = pickle.getPayloadSize();
    }
    readBool() {
        return this.readInt() !== 0;
    }
    readInt() {
        return this.readBytes(SIZE_INT32, Buffer.prototype.readInt32LE);
    }
    readUInt32() {
        return this.readBytes(SIZE_UINT32, Buffer.prototype.readUInt32LE);
    }
    readInt64() {
        return this.readBytes(SIZE_INT64, Buffer.prototype.readBigInt64LE);
    }
    readUInt64() {
        return this.readBytes(SIZE_UINT64, Buffer.prototype.readBigUInt64LE);
    }
    readFloat() {
        return this.readBytes(SIZE_FLOAT, Buffer.prototype.readFloatLE);
    }
    readDouble() {
        return this.readBytes(SIZE_DOUBLE, Buffer.prototype.readDoubleLE);
    }
    readString() {
        return this.readBytes(this.readInt()).toString();
    }
    readBytes(length, method) {
        const readPayloadOffset = this.getReadPayloadOffsetAndAdvance(length);
        if (method != null) {
            return method.call(this.payload, readPayloadOffset, length);
        }
        else {
            return this.payload.slice(readPayloadOffset, readPayloadOffset + length);
        }
    }
    getReadPayloadOffsetAndAdvance(length) {
        if (length > this.endIndex - this.readIndex) {
            this.readIndex = this.endIndex;
            throw new Error('Failed to read data with length of ' + length);
        }
        const readPayloadOffset = this.payloadOffset + this.readIndex;
        this.advance(length);
        return readPayloadOffset;
    }
    advance(size) {
        const alignedSize = alignInt(size, SIZE_UINT32);
        if (this.endIndex - this.readIndex < alignedSize) {
            this.readIndex = this.endIndex;
        }
        else {
            this.readIndex += alignedSize;
        }
    }
}
// This class provides facilities for basic binary value packing and unpacking.
//
// The Pickle class supports appending primitive values (ints, strings, etc.)
// to a pickle instance. The Pickle instance grows its internal memory buffer
// dynamically to hold the sequence of primitive values. The internal memory
// buffer is exposed as the "data" of the Pickle. This "data" can be passed
// to a Pickle object to initialize it for reading.
//
// When reading from a Pickle object, it is important for the consumer to know
// what value types to read and in what order to read them as the Pickle does
// not keep track of the type of data written to it.
//
// The Pickle's data has a header which contains the size of the Pickle's
// payload. It can optionally support additional space in the header. That
// space is controlled by the header_size parameter passed to the Pickle
// constructor.
class Pickle {
    constructor(buffer) {
        if (buffer) {
            this.header = buffer;
            this.headerSize = buffer.length - this.getPayloadSize();
            this.capacityAfterHeader = CAPACITY_READ_ONLY;
            this.writeOffset = 0;
            if (this.headerSize > buffer.length) {
                this.headerSize = 0;
            }
            if (this.headerSize !== alignInt(this.headerSize, SIZE_UINT32)) {
                this.headerSize = 0;
            }
            if (this.headerSize === 0) {
                this.header = Buffer.alloc(0);
            }
        }
        else {
            this.header = Buffer.alloc(0);
            this.headerSize = SIZE_UINT32;
            this.capacityAfterHeader = 0;
            this.writeOffset = 0;
            this.resize(PAYLOAD_UNIT);
            this.setPayloadSize(0);
        }
    }
    static createEmpty() {
        return new Pickle();
    }
    static createFromBuffer(buffer) {
        return new Pickle(buffer);
    }
    getHeader() {
        return this.header;
    }
    getHeaderSize() {
        return this.headerSize;
    }
    createIterator() {
        return new PickleIterator(this);
    }
    toBuffer() {
        return this.header.slice(0, this.headerSize + this.getPayloadSize());
    }
    writeBool(value) {
        return this.writeInt(value ? 1 : 0);
    }
    writeInt(value) {
        return this.writeBytes(value, SIZE_INT32, Buffer.prototype.writeInt32LE);
    }
    writeUInt32(value) {
        return this.writeBytes(value, SIZE_UINT32, Buffer.prototype.writeUInt32LE);
    }
    writeInt64(value) {
        return this.writeBytes(BigInt(value), SIZE_INT64, Buffer.prototype.writeBigInt64LE);
    }
    writeUInt64(value) {
        return this.writeBytes(BigInt(value), SIZE_UINT64, Buffer.prototype.writeBigUInt64LE);
    }
    writeFloat(value) {
        return this.writeBytes(value, SIZE_FLOAT, Buffer.prototype.writeFloatLE);
    }
    writeDouble(value) {
        return this.writeBytes(value, SIZE_DOUBLE, Buffer.prototype.writeDoubleLE);
    }
    writeString(value) {
        const length = Buffer.byteLength(value, 'utf8');
        if (!this.writeInt(length)) {
            return false;
        }
        return this.writeBytes(value, length);
    }
    setPayloadSize(payloadSize) {
        return this.header.writeUInt32LE(payloadSize, 0);
    }
    getPayloadSize() {
        return this.header.readUInt32LE(0);
    }
    writeBytes(data, length, method) {
        const dataLength = alignInt(length, SIZE_UINT32);
        const newSize = this.writeOffset + dataLength;
        if (newSize > this.capacityAfterHeader) {
            this.resize(Math.max(this.capacityAfterHeader * 2, newSize));
        }
        if (method) {
            method.call(this.header, data, this.headerSize + this.writeOffset);
        }
        else {
            this.header.write(data, this.headerSize + this.writeOffset, length);
        }
        const endOffset = this.headerSize + this.writeOffset + length;
        this.header.fill(0, endOffset, endOffset + dataLength - length);
        this.setPayloadSize(newSize);
        this.writeOffset = newSize;
        return true;
    }
    resize(newCapacity) {
        newCapacity = alignInt(newCapacity, PAYLOAD_UNIT);
        this.header = Buffer.concat([this.header, Buffer.alloc(newCapacity)]);
        this.capacityAfterHeader = newCapacity;
    }
}
exports.Pickle = Pickle;
//# sourceMappingURL=pickle.js.map
1
desktop-operator/node_modules/@electron/asar/lib/pickle.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
157
desktop-operator/node_modules/@electron/asar/lib/types/glob.d.ts
generated
vendored
Normal file
@@ -0,0 +1,157 @@
/**
 * TODO(erikian): remove this file once we upgrade to the latest `glob` version.
 * https://github.com/electron/asar/pull/332#issuecomment-2435407933
 */
interface IMinimatchOptions {
    /**
     * Dump a ton of stuff to stderr.
     *
     * @default false
     */
    debug?: boolean | undefined;
    /**
     * Do not expand `{a,b}` and `{1..3}` brace sets.
     *
     * @default false
     */
    nobrace?: boolean | undefined;
    /**
     * Disable `**` matching against multiple folder names.
     *
     * @default false
     */
    noglobstar?: boolean | undefined;
    /**
     * Allow patterns to match filenames starting with a period,
     * even if the pattern does not explicitly have a period in that spot.
     *
     * Note that by default, `'a/**' + '/b'` will **not** match `a/.d/b`, unless `dot` is set.
     *
     * @default false
     */
    dot?: boolean | undefined;
    /**
     * Disable "extglob" style patterns like `+(a|b)`.
     *
     * @default false
     */
    noext?: boolean | undefined;
    /**
     * Perform a case-insensitive match.
     *
     * @default false
     */
    nocase?: boolean | undefined;
    /**
     * When a match is not found by `minimatch.match`,
     * return a list containing the pattern itself if this option is set.
     * Otherwise, an empty list is returned if there are no matches.
     *
     * @default false
     */
    nonull?: boolean | undefined;
    /**
     * If set, then patterns without slashes will be matched
     * against the basename of the path if it contains slashes. For example,
     * `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`.
     *
     * @default false
     */
    matchBase?: boolean | undefined;
    /**
     * Suppress the behavior of treating `#` at the start of a pattern as a comment.
     *
     * @default false
     */
    nocomment?: boolean | undefined;
    /**
     * Suppress the behavior of treating a leading `!` character as negation.
     *
     * @default false
     */
    nonegate?: boolean | undefined;
    /**
     * Returns from negate expressions the same as if they were not negated.
     * (Ie, true on a hit, false on a miss.)
     *
     * @default false
     */
    flipNegate?: boolean | undefined;
    /**
     * Compare a partial path to a pattern. As long as the parts of the path that
     * are present are not contradicted by the pattern, it will be treated as a
     * match. This is useful in applications where you're walking through a
     * folder structure, and don't yet have the full path, but want to ensure that
     * you do not walk down paths that can never be a match.
     *
     * @default false
     *
     * @example
     * import minimatch = require("minimatch");
     *
     * minimatch('/a/b', '/a/*' + '/c/d', { partial: true }) // true, might be /a/b/c/d
     * minimatch('/a/b', '/**' + '/d', { partial: true }) // true, might be /a/b/.../d
     * minimatch('/x/y/z', '/a/**' + '/z', { partial: true }) // false, because x !== a
     */
    partial?: boolean;
    /**
     * Use `\\` as a path separator _only_, and _never_ as an escape
     * character. If set, all `\\` characters are replaced with `/` in
     * the pattern. Note that this makes it **impossible** to match
     * against paths containing literal glob pattern characters, but
     * allows matching with patterns constructed using `path.join()` and
     * `path.resolve()` on Windows platforms, mimicking the (buggy!)
     * behavior of earlier versions on Windows. Please use with
     * caution, and be mindful of the caveat about Windows paths
     *
     * For legacy reasons, this is also set if
     * `options.allowWindowsEscape` is set to the exact value `false`.
     *
     * @default false
     */
    windowsPathsNoEscape?: boolean;
}
export interface IOptions extends IMinimatchOptions {
    cwd?: string | undefined;
    root?: string | undefined;
    dot?: boolean | undefined;
    nomount?: boolean | undefined;
    mark?: boolean | undefined;
    nosort?: boolean | undefined;
    stat?: boolean | undefined;
    silent?: boolean | undefined;
    strict?: boolean | undefined;
    cache?: {
        [path: string]: boolean | 'DIR' | 'FILE' | ReadonlyArray<string>;
    } | undefined;
    statCache?: {
        [path: string]: false | {
            isDirectory(): boolean;
        } | undefined;
    } | undefined;
    symlinks?: {
        [path: string]: boolean | undefined;
    } | undefined;
    realpathCache?: {
        [path: string]: string;
    } | undefined;
    sync?: boolean | undefined;
    nounique?: boolean | undefined;
    nonull?: boolean | undefined;
    debug?: boolean | undefined;
    nobrace?: boolean | undefined;
    noglobstar?: boolean | undefined;
    noext?: boolean | undefined;
    nocase?: boolean | undefined;
    matchBase?: any;
    nodir?: boolean | undefined;
    ignore?: string | ReadonlyArray<string> | undefined;
    follow?: boolean | undefined;
    realpath?: boolean | undefined;
    nonegate?: boolean | undefined;
    nocomment?: boolean | undefined;
    absolute?: boolean | undefined;
    allowWindowsEscape?: boolean | undefined;
    fs?: typeof import('fs');
}
export {};
3
desktop-operator/node_modules/@electron/asar/lib/types/glob.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=glob.js.map
1
desktop-operator/node_modules/@electron/asar/lib/types/glob.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"glob.js","sourceRoot":"","sources":["../../src/types/glob.ts"],"names":[],"mappings":""}
13
desktop-operator/node_modules/@electron/asar/lib/wrapped-fs.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
type AsarFS = typeof import('fs') & {
    mkdirp(dir: string): Promise<void>;
    mkdirpSync(dir: string): void;
    lstat: (typeof import('fs'))['promises']['lstat'];
    mkdtemp: (typeof import('fs'))['promises']['mkdtemp'];
    readFile: (typeof import('fs'))['promises']['readFile'];
    stat: (typeof import('fs'))['promises']['stat'];
    writeFile: (typeof import('fs'))['promises']['writeFile'];
    symlink: (typeof import('fs'))['promises']['symlink'];
    readlink: (typeof import('fs'))['promises']['readlink'];
};
declare const promisified: AsarFS;
export default promisified;
26
desktop-operator/node_modules/@electron/asar/lib/wrapped-fs.js
generated
vendored
Normal file
@@ -0,0 +1,26 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const fs = 'electron' in process.versions ? require('original-fs') : require('fs');
const promisifiedMethods = [
    'lstat',
    'mkdtemp',
    'readFile',
    'stat',
    'writeFile',
    'symlink',
    'readlink',
];
const promisified = {};
for (const method of Object.keys(fs)) {
    if (promisifiedMethods.includes(method)) {
        promisified[method] = fs.promises[method];
    }
    else {
        promisified[method] = fs[method];
    }
}
// To make it more like fs-extra
promisified.mkdirp = (dir) => fs.promises.mkdir(dir, { recursive: true });
promisified.mkdirpSync = (dir) => fs.mkdirSync(dir, { recursive: true });
exports.default = promisified;
//# sourceMappingURL=wrapped-fs.js.map
1
desktop-operator/node_modules/@electron/asar/lib/wrapped-fs.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
15
desktop-operator/node_modules/@electron/asar/node_modules/minimatch/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,15 @@
The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
230
desktop-operator/node_modules/@electron/asar/node_modules/minimatch/README.md
generated
vendored
Normal file
@@ -0,0 +1,230 @@
# minimatch

A minimal matching utility.

[](http://travis-ci.org/isaacs/minimatch)


This is the matching library used internally by npm.

It works by converting glob expressions into JavaScript `RegExp`
objects.

## Usage

```javascript
var minimatch = require("minimatch")

minimatch("bar.foo", "*.foo") // true!
minimatch("bar.foo", "*.bar") // false!
minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy!
```

## Features

Supports these glob features:

* Brace Expansion
* Extended glob matching
* "Globstar" `**` matching

See:

* `man sh`
* `man bash`
* `man 3 fnmatch`
* `man 5 gitignore`

## Minimatch Class

Create a minimatch object by instantiating the `minimatch.Minimatch` class.

```javascript
var Minimatch = require("minimatch").Minimatch
var mm = new Minimatch(pattern, options)
```

### Properties

* `pattern` The original pattern the minimatch object represents.
* `options` The options supplied to the constructor.
* `set` A 2-dimensional array of regexp or string expressions.
  Each row in the
  array corresponds to a brace-expanded pattern. Each item in the row
  corresponds to a single path-part. For example, the pattern
  `{a,b/c}/d` would expand to a set of patterns like:

        [ [ a, d ]
        , [ b, c, d ] ]

  If a portion of the pattern doesn't have any "magic" in it
  (that is, it's something like `"foo"` rather than `fo*o?`), then it
  will be left as a string rather than converted to a regular
  expression.

* `regexp` Created by the `makeRe` method. A single regular expression
  expressing the entire pattern. This is useful in cases where you wish
  to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled.
* `negate` True if the pattern is negated.
* `comment` True if the pattern is a comment.
* `empty` True if the pattern is `""`.

### Methods

* `makeRe` Generate the `regexp` member if necessary, and return it.
  Will return `false` if the pattern is invalid.
* `match(fname)` Return true if the filename matches the pattern, or
  false otherwise.
* `matchOne(fileArray, patternArray, partial)` Take a `/`-split
  filename, and match it against a single row in the `regExpSet`. This
  method is mainly for internal use, but is exposed so that it can be
  used by a glob-walker that needs to avoid excessive filesystem calls.

All other methods are internal, and will be called as necessary.

### minimatch(path, pattern, options)

Main export. Tests a path against the pattern using the options.

```javascript
var isJS = minimatch(file, "*.js", { matchBase: true })
```

### minimatch.filter(pattern, options)

Returns a function that tests its
supplied argument, suitable for use with `Array.filter`. Example:

```javascript
var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true}))
```

### minimatch.match(list, pattern, options)

Match against the list of
files, in the style of fnmatch or glob. If nothing is matched, and
options.nonull is set, then return a list containing the pattern itself.

```javascript
var javascripts = minimatch.match(fileList, "*.js", {matchBase: true}))
```

### minimatch.makeRe(pattern, options)

Make a regular expression object from the pattern.

## Options

All options are `false` by default.

### debug

Dump a ton of stuff to stderr.

### nobrace

Do not expand `{a,b}` and `{1..3}` brace sets.

### noglobstar

Disable `**` matching against multiple folder names.

### dot

Allow patterns to match filenames starting with a period, even if
the pattern does not explicitly have a period in that spot.

Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot`
is set.

### noext

Disable "extglob" style patterns like `+(a|b)`.

### nocase

Perform a case-insensitive match.

### nonull

When a match is not found by `minimatch.match`, return a list containing
the pattern itself if this option is set. When not set, an empty list
is returned if there are no matches.

### matchBase

If set, then patterns without slashes will be matched
against the basename of the path if it contains slashes. For example,
`a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`.

### nocomment

Suppress the behavior of treating `#` at the start of a pattern as a
comment.

### nonegate

Suppress the behavior of treating a leading `!` character as negation.

### flipNegate

Returns from negate expressions the same as if they were not negated.
(Ie, true on a hit, false on a miss.)

### partial

Compare a partial path to a pattern. As long as the parts of the path that
are present are not contradicted by the pattern, it will be treated as a
match. This is useful in applications where you're walking through a
folder structure, and don't yet have the full path, but want to ensure that
you do not walk down paths that can never be a match.

For example,

```js
minimatch('/a/b', '/a/*/c/d', { partial: true }) // true, might be /a/b/c/d
minimatch('/a/b', '/**/d', { partial: true }) // true, might be /a/b/.../d
minimatch('/x/y/z', '/a/**/z', { partial: true }) // false, because x !== a
```

### allowWindowsEscape

Windows path separator `\` is by default converted to `/`, which
prohibits the usage of `\` as a escape character. This flag skips that
behavior and allows using the escape character.

## Comparisons to other fnmatch/glob implementations

While strict compliance with the existing standards is a worthwhile
goal, some discrepancies exist between minimatch and other
implementations, and are intentional.

If the pattern starts with a `!` character, then it is negated. Set the
`nonegate` flag to suppress this behavior, and treat leading `!`
characters normally. This is perhaps relevant if you wish to start the
pattern with a negative extglob pattern like `!(a|B)`. Multiple `!`
characters at the start of a pattern will negate the pattern multiple
times.

If a pattern starts with `#`, then it is treated as a comment, and
will not match anything. Use `\#` to match a literal `#` at the
start of a line, or set the `nocomment` flag to suppress this behavior.

The double-star character `**` is supported by default, unless the
`noglobstar` flag is set. This is supported in the manner of bsdglob
and bash 4.1, where `**` only has special significance if it is the only
thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but
`a/**b` will not.

If an escaped pattern has no matches, and the `nonull` flag is set,
then minimatch.match returns the pattern as-provided, rather than
interpreting the character escapes. For example,
`minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than
`"*a?"`. This is akin to setting the `nullglob` option in bash, except
that it does not resolve escaped pattern characters.

If brace expansion is not disabled, then it is performed before any
other interpretation of the glob pattern. Thus, a pattern like
`+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded
**first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are
checked for validity. Since those two are valid, matching proceeds.
947
desktop-operator/node_modules/@electron/asar/node_modules/minimatch/minimatch.js
generated
vendored
Normal file
@@ -0,0 +1,947 @@
module.exports = minimatch
minimatch.Minimatch = Minimatch

var path = (function () { try { return require('path') } catch (e) {}}()) || {
  sep: '/'
}
minimatch.sep = path.sep

var GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {}
var expand = require('brace-expansion')

var plTypes = {
  '!': { open: '(?:(?!(?:', close: '))[^/]*?)'},
  '?': { open: '(?:', close: ')?' },
  '+': { open: '(?:', close: ')+' },
  '*': { open: '(?:', close: ')*' },
  '@': { open: '(?:', close: ')' }
}

// any single thing other than /
// don't need to escape / when using new RegExp()
var qmark = '[^/]'

// * => any number of characters
var star = qmark + '*?'

// ** when dots are allowed. Anything goes, except .. and .
// not (^ or / followed by one or two dots followed by $ or /),
// followed by anything, any number of times.
var twoStarDot = '(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?'

// not a ^ or / followed by a dot,
// followed by anything, any number of times.
var twoStarNoDot = '(?:(?!(?:\\\/|^)\\.).)*?'

// characters that need to be escaped in RegExp.
var reSpecials = charSet('().*{}+?[]^$\\!')

// "abc" -> { a:true, b:true, c:true }
function charSet (s) {
  return s.split('').reduce(function (set, c) {
    set[c] = true
    return set
  }, {})
}

// normalizes slashes.
var slashSplit = /\/+/

minimatch.filter = filter
function filter (pattern, options) {
  options = options || {}
  return function (p, i, list) {
    return minimatch(p, pattern, options)
  }
}

function ext (a, b) {
  b = b || {}
  var t = {}
  Object.keys(a).forEach(function (k) {
    t[k] = a[k]
  })
  Object.keys(b).forEach(function (k) {
    t[k] = b[k]
  })
  return t
}

minimatch.defaults = function (def) {
  if (!def || typeof def !== 'object' || !Object.keys(def).length) {
    return minimatch
  }

  var orig = minimatch

  var m = function minimatch (p, pattern, options) {
    return orig(p, pattern, ext(def, options))
  }

  m.Minimatch = function Minimatch (pattern, options) {
    return new orig.Minimatch(pattern, ext(def, options))
  }
  m.Minimatch.defaults = function defaults (options) {
    return orig.defaults(ext(def, options)).Minimatch
  }

  m.filter = function filter (pattern, options) {
    return orig.filter(pattern, ext(def, options))
  }

  m.defaults = function defaults (options) {
    return orig.defaults(ext(def, options))
  }

  m.makeRe = function makeRe (pattern, options) {
    return orig.makeRe(pattern, ext(def, options))
  }

  m.braceExpand = function braceExpand (pattern, options) {
    return orig.braceExpand(pattern, ext(def, options))
  }

  m.match = function (list, pattern, options) {
    return orig.match(list, pattern, ext(def, options))
  }

  return m
}

Minimatch.defaults = function (def) {
  return minimatch.defaults(def).Minimatch
}

function minimatch (p, pattern, options) {
  assertValidPattern(pattern)

  if (!options) options = {}

  // shortcut: comments match nothing.
  if (!options.nocomment && pattern.charAt(0) === '#') {
    return false
  }

  return new Minimatch(pattern, options).match(p)
}

function Minimatch (pattern, options) {
  if (!(this instanceof Minimatch)) {
    return new Minimatch(pattern, options)
  }

  assertValidPattern(pattern)

  if (!options) options = {}

  pattern = pattern.trim()

  // windows support: need to use /, not \
  if (!options.allowWindowsEscape && path.sep !== '/') {
    pattern = pattern.split(path.sep).join('/')
  }

  this.options = options
  this.set = []
  this.pattern = pattern
  this.regexp = null
  this.negate = false
  this.comment = false
  this.empty = false
  this.partial = !!options.partial

  // make the set of regexps etc.
  this.make()
}

Minimatch.prototype.debug = function () {}
|
||||
|
||||
Minimatch.prototype.make = make
|
||||
function make () {
|
||||
var pattern = this.pattern
|
||||
var options = this.options
|
||||
|
||||
// empty patterns and comments match nothing.
|
||||
if (!options.nocomment && pattern.charAt(0) === '#') {
|
||||
this.comment = true
|
||||
return
|
||||
}
|
||||
if (!pattern) {
|
||||
this.empty = true
|
||||
return
|
||||
}
|
||||
|
||||
// step 1: figure out negation, etc.
|
||||
this.parseNegate()
|
||||
|
||||
// step 2: expand braces
|
||||
var set = this.globSet = this.braceExpand()
|
||||
|
||||
if (options.debug) this.debug = function debug() { console.error.apply(console, arguments) }
|
||||
|
||||
this.debug(this.pattern, set)
|
||||
|
||||
// step 3: now we have a set, so turn each one into a series of path-portion
|
||||
// matching patterns.
|
||||
// These will be regexps, except in the case of "**", which is
|
||||
// set to the GLOBSTAR object for globstar behavior,
|
||||
// and will not contain any / characters
|
||||
set = this.globParts = set.map(function (s) {
|
||||
return s.split(slashSplit)
|
||||
})
|
||||
|
||||
this.debug(this.pattern, set)
|
||||
|
||||
// glob --> regexps
|
||||
set = set.map(function (s, si, set) {
|
||||
return s.map(this.parse, this)
|
||||
}, this)
|
||||
|
||||
this.debug(this.pattern, set)
|
||||
|
||||
// filter out everything that didn't compile properly.
|
||||
set = set.filter(function (s) {
|
||||
return s.indexOf(false) === -1
|
||||
})
|
||||
|
||||
this.debug(this.pattern, set)
|
||||
|
||||
this.set = set
|
||||
}
|
||||
|
||||
Minimatch.prototype.parseNegate = parseNegate
|
||||
function parseNegate () {
|
||||
var pattern = this.pattern
|
||||
var negate = false
|
||||
var options = this.options
|
||||
var negateOffset = 0
|
||||
|
||||
if (options.nonegate) return
|
||||
|
||||
for (var i = 0, l = pattern.length
|
||||
; i < l && pattern.charAt(i) === '!'
|
||||
; i++) {
|
||||
negate = !negate
|
||||
negateOffset++
|
||||
}
|
||||
|
||||
if (negateOffset) this.pattern = pattern.substr(negateOffset)
|
||||
this.negate = negate
|
||||
}
|
||||
|
||||
// Brace expansion:
|
||||
// a{b,c}d -> abd acd
|
||||
// a{b,}c -> abc ac
|
||||
// a{0..3}d -> a0d a1d a2d a3d
|
||||
// a{b,c{d,e}f}g -> abg acdfg acefg
|
||||
// a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg
|
||||
//
|
||||
// Invalid sets are not expanded.
|
||||
// a{2..}b -> a{2..}b
|
||||
// a{b}c -> a{b}c
|
||||
minimatch.braceExpand = function (pattern, options) {
|
||||
return braceExpand(pattern, options)
|
||||
}
|
||||
|
||||
Minimatch.prototype.braceExpand = braceExpand
|
||||
|
||||
function braceExpand (pattern, options) {
|
||||
if (!options) {
|
||||
if (this instanceof Minimatch) {
|
||||
options = this.options
|
||||
} else {
|
||||
options = {}
|
||||
}
|
||||
}
|
||||
|
||||
pattern = typeof pattern === 'undefined'
|
||||
? this.pattern : pattern
|
||||
|
||||
assertValidPattern(pattern)
|
||||
|
||||
// Thanks to Yeting Li <https://github.com/yetingli> for
|
||||
// improving this regexp to avoid a ReDOS vulnerability.
|
||||
if (options.nobrace || !/\{(?:(?!\{).)*\}/.test(pattern)) {
|
||||
// shortcut. no need to expand.
|
||||
return [pattern]
|
||||
}
|
||||
|
||||
return expand(pattern)
|
||||
}
|
||||
|
||||
var MAX_PATTERN_LENGTH = 1024 * 64
|
||||
var assertValidPattern = function (pattern) {
|
||||
if (typeof pattern !== 'string') {
|
||||
throw new TypeError('invalid pattern')
|
||||
}
|
||||
|
||||
if (pattern.length > MAX_PATTERN_LENGTH) {
|
||||
throw new TypeError('pattern is too long')
|
||||
}
|
||||
}
|
||||
|
||||
// parse a component of the expanded set.
|
||||
// At this point, no pattern may contain "/" in it
|
||||
// so we're going to return a 2d array, where each entry is the full
|
||||
// pattern, split on '/', and then turned into a regular expression.
|
||||
// A regexp is made at the end which joins each array with an
|
||||
// escaped /, and another full one which joins each regexp with |.
|
||||
//
|
||||
// Following the lead of Bash 4.1, note that "**" only has special meaning
|
||||
// when it is the *only* thing in a path portion. Otherwise, any series
|
||||
// of * is equivalent to a single *. Globstar behavior is enabled by
|
||||
// default, and can be disabled by setting options.noglobstar.
|
||||
Minimatch.prototype.parse = parse
|
||||
var SUBPARSE = {}
|
||||
function parse (pattern, isSub) {
|
||||
assertValidPattern(pattern)
|
||||
|
||||
var options = this.options
|
||||
|
||||
// shortcuts
|
||||
if (pattern === '**') {
|
||||
if (!options.noglobstar)
|
||||
return GLOBSTAR
|
||||
else
|
||||
pattern = '*'
|
||||
}
|
||||
if (pattern === '') return ''
|
||||
|
||||
var re = ''
|
||||
var hasMagic = !!options.nocase
|
||||
var escaping = false
|
||||
// ? => one single character
|
||||
var patternListStack = []
|
||||
var negativeLists = []
|
||||
var stateChar
|
||||
var inClass = false
|
||||
var reClassStart = -1
|
||||
var classStart = -1
|
||||
// . and .. never match anything that doesn't start with .,
|
||||
// even when options.dot is set.
|
||||
var patternStart = pattern.charAt(0) === '.' ? '' // anything
|
||||
// not (start or / followed by . or .. followed by / or end)
|
||||
: options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))'
|
||||
: '(?!\\.)'
|
||||
var self = this
|
||||
|
||||
function clearStateChar () {
|
||||
if (stateChar) {
|
||||
// we had some state-tracking character
|
||||
// that wasn't consumed by this pass.
|
||||
switch (stateChar) {
|
||||
case '*':
|
||||
re += star
|
||||
hasMagic = true
|
||||
break
|
||||
case '?':
|
||||
re += qmark
|
||||
hasMagic = true
|
||||
break
|
||||
default:
|
||||
re += '\\' + stateChar
|
||||
break
|
||||
}
|
||||
self.debug('clearStateChar %j %j', stateChar, re)
|
||||
stateChar = false
|
||||
}
|
||||
}
|
||||
|
||||
for (var i = 0, len = pattern.length, c
|
||||
; (i < len) && (c = pattern.charAt(i))
|
||||
; i++) {
|
||||
this.debug('%s\t%s %s %j', pattern, i, re, c)
|
||||
|
||||
// skip over any that are escaped.
|
||||
if (escaping && reSpecials[c]) {
|
||||
re += '\\' + c
|
||||
escaping = false
|
||||
continue
|
||||
}
|
||||
|
||||
switch (c) {
|
||||
/* istanbul ignore next */
|
||||
case '/': {
|
||||
// completely not allowed, even escaped.
|
||||
// Should already be path-split by now.
|
||||
return false
|
||||
}
|
||||
|
||||
case '\\':
|
||||
clearStateChar()
|
||||
escaping = true
|
||||
continue
|
||||
|
||||
// the various stateChar values
|
||||
// for the "extglob" stuff.
|
||||
case '?':
|
||||
case '*':
|
||||
case '+':
|
||||
case '@':
|
||||
case '!':
|
||||
this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c)
|
||||
|
||||
// all of those are literals inside a class, except that
|
||||
// the glob [!a] means [^a] in regexp
|
||||
if (inClass) {
|
||||
this.debug(' in class')
|
||||
if (c === '!' && i === classStart + 1) c = '^'
|
||||
re += c
|
||||
continue
|
||||
}
|
||||
|
||||
// if we already have a stateChar, then it means
|
||||
// that there was something like ** or +? in there.
|
||||
// Handle the stateChar, then proceed with this one.
|
||||
self.debug('call clearStateChar %j', stateChar)
|
||||
clearStateChar()
|
||||
stateChar = c
|
||||
// if extglob is disabled, then +(asdf|foo) isn't a thing.
|
||||
// just clear the statechar *now*, rather than even diving into
|
||||
// the patternList stuff.
|
||||
if (options.noext) clearStateChar()
|
||||
continue
|
||||
|
||||
case '(':
|
||||
if (inClass) {
|
||||
re += '('
|
||||
continue
|
||||
}
|
||||
|
||||
if (!stateChar) {
|
||||
re += '\\('
|
||||
continue
|
||||
}
|
||||
|
||||
patternListStack.push({
|
||||
type: stateChar,
|
||||
start: i - 1,
|
||||
reStart: re.length,
|
||||
open: plTypes[stateChar].open,
|
||||
close: plTypes[stateChar].close
|
||||
})
|
||||
// negation is (?:(?!js)[^/]*)
|
||||
re += stateChar === '!' ? '(?:(?!(?:' : '(?:'
|
||||
this.debug('plType %j %j', stateChar, re)
|
||||
stateChar = false
|
||||
continue
|
||||
|
||||
case ')':
|
||||
if (inClass || !patternListStack.length) {
|
||||
re += '\\)'
|
||||
continue
|
||||
}
|
||||
|
||||
clearStateChar()
|
||||
hasMagic = true
|
||||
var pl = patternListStack.pop()
|
||||
// negation is (?:(?!js)[^/]*)
|
||||
// The others are (?:<pattern>)<type>
|
||||
re += pl.close
|
||||
if (pl.type === '!') {
|
||||
negativeLists.push(pl)
|
||||
}
|
||||
pl.reEnd = re.length
|
||||
continue
|
||||
|
||||
case '|':
|
||||
if (inClass || !patternListStack.length || escaping) {
|
||||
re += '\\|'
|
||||
escaping = false
|
||||
continue
|
||||
}
|
||||
|
||||
clearStateChar()
|
||||
re += '|'
|
||||
continue
|
||||
|
||||
// these are mostly the same in regexp and glob
|
||||
case '[':
|
||||
// swallow any state-tracking char before the [
|
||||
clearStateChar()
|
||||
|
||||
if (inClass) {
|
||||
re += '\\' + c
|
||||
continue
|
||||
}
|
||||
|
||||
inClass = true
|
||||
classStart = i
|
||||
reClassStart = re.length
|
||||
re += c
|
||||
continue
|
||||
|
||||
case ']':
|
||||
// a right bracket shall lose its special
|
||||
// meaning and represent itself in
|
||||
// a bracket expression if it occurs
|
||||
// first in the list. -- POSIX.2 2.8.3.2
|
||||
if (i === classStart + 1 || !inClass) {
|
||||
re += '\\' + c
|
||||
escaping = false
|
||||
continue
|
||||
}
|
||||
|
||||
// handle the case where we left a class open.
|
||||
// "[z-a]" is valid, equivalent to "\[z-a\]"
|
||||
// split where the last [ was, make sure we don't have
|
||||
// an invalid re. if so, re-walk the contents of the
|
||||
// would-be class to re-translate any characters that
|
||||
// were passed through as-is
|
||||
// TODO: It would probably be faster to determine this
|
||||
// without a try/catch and a new RegExp, but it's tricky
|
||||
// to do safely. For now, this is safe and works.
|
||||
var cs = pattern.substring(classStart + 1, i)
|
||||
try {
|
||||
RegExp('[' + cs + ']')
|
||||
} catch (er) {
|
||||
// not a valid class!
|
||||
var sp = this.parse(cs, SUBPARSE)
|
||||
re = re.substr(0, reClassStart) + '\\[' + sp[0] + '\\]'
|
||||
hasMagic = hasMagic || sp[1]
|
||||
inClass = false
|
||||
continue
|
||||
}
|
||||
|
||||
// finish up the class.
|
||||
hasMagic = true
|
||||
inClass = false
|
||||
re += c
|
||||
continue
|
||||
|
||||
default:
|
||||
// swallow any state char that wasn't consumed
|
||||
clearStateChar()
|
||||
|
||||
if (escaping) {
|
||||
// no need
|
||||
escaping = false
|
||||
} else if (reSpecials[c]
|
||||
&& !(c === '^' && inClass)) {
|
||||
re += '\\'
|
||||
}
|
||||
|
||||
re += c
|
||||
|
||||
} // switch
|
||||
} // for
|
||||
|
||||
// handle the case where we left a class open.
|
||||
// "[abc" is valid, equivalent to "\[abc"
|
||||
if (inClass) {
|
||||
// split where the last [ was, and escape it
|
||||
// this is a huge pita. We now have to re-walk
|
||||
// the contents of the would-be class to re-translate
|
||||
// any characters that were passed through as-is
|
||||
cs = pattern.substr(classStart + 1)
|
||||
sp = this.parse(cs, SUBPARSE)
|
||||
re = re.substr(0, reClassStart) + '\\[' + sp[0]
|
||||
hasMagic = hasMagic || sp[1]
|
||||
}
|
||||
|
||||
// handle the case where we had a +( thing at the *end*
|
||||
// of the pattern.
|
||||
// each pattern list stack adds 3 chars, and we need to go through
|
||||
// and escape any | chars that were passed through as-is for the regexp.
|
||||
// Go through and escape them, taking care not to double-escape any
|
||||
// | chars that were already escaped.
|
||||
for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) {
|
||||
var tail = re.slice(pl.reStart + pl.open.length)
|
||||
this.debug('setting tail', re, pl)
|
||||
// maybe some even number of \, then maybe 1 \, followed by a |
|
||||
tail = tail.replace(/((?:\\{2}){0,64})(\\?)\|/g, function (_, $1, $2) {
|
||||
if (!$2) {
|
||||
// the | isn't already escaped, so escape it.
|
||||
$2 = '\\'
|
||||
}
|
||||
|
||||
// need to escape all those slashes *again*, without escaping the
|
||||
// one that we need for escaping the | character. As it works out,
|
||||
// escaping an even number of slashes can be done by simply repeating
|
||||
// it exactly after itself. That's why this trick works.
|
||||
//
|
||||
// I am sorry that you have to see this.
|
||||
return $1 + $1 + $2 + '|'
|
||||
})
|
||||
|
||||
this.debug('tail=%j\n %s', tail, tail, pl, re)
|
||||
var t = pl.type === '*' ? star
|
||||
: pl.type === '?' ? qmark
|
||||
: '\\' + pl.type
|
||||
|
||||
hasMagic = true
|
||||
re = re.slice(0, pl.reStart) + t + '\\(' + tail
|
||||
}
|
||||
|
||||
// handle trailing things that only matter at the very end.
|
||||
clearStateChar()
|
||||
if (escaping) {
|
||||
// trailing \\
|
||||
re += '\\\\'
|
||||
}
|
||||
|
||||
// only need to apply the nodot start if the re starts with
|
||||
// something that could conceivably capture a dot
|
||||
var addPatternStart = false
|
||||
switch (re.charAt(0)) {
|
||||
case '[': case '.': case '(': addPatternStart = true
|
||||
}
|
||||
|
||||
// Hack to work around lack of negative lookbehind in JS
|
||||
// A pattern like: *.!(x).!(y|z) needs to ensure that a name
|
||||
// like 'a.xyz.yz' doesn't match. So, the first negative
|
||||
// lookahead, has to look ALL the way ahead, to the end of
|
||||
// the pattern.
|
||||
for (var n = negativeLists.length - 1; n > -1; n--) {
|
||||
var nl = negativeLists[n]
|
||||
|
||||
var nlBefore = re.slice(0, nl.reStart)
|
||||
var nlFirst = re.slice(nl.reStart, nl.reEnd - 8)
|
||||
var nlLast = re.slice(nl.reEnd - 8, nl.reEnd)
|
||||
var nlAfter = re.slice(nl.reEnd)
|
||||
|
||||
nlLast += nlAfter
|
||||
|
||||
// Handle nested stuff like *(*.js|!(*.json)), where open parens
|
||||
// mean that we should *not* include the ) in the bit that is considered
|
||||
// "after" the negated section.
|
||||
var openParensBefore = nlBefore.split('(').length - 1
|
||||
var cleanAfter = nlAfter
|
||||
for (i = 0; i < openParensBefore; i++) {
|
||||
cleanAfter = cleanAfter.replace(/\)[+*?]?/, '')
|
||||
}
|
||||
nlAfter = cleanAfter
|
||||
|
||||
var dollar = ''
|
||||
if (nlAfter === '' && isSub !== SUBPARSE) {
|
||||
dollar = '$'
|
||||
}
|
||||
var newRe = nlBefore + nlFirst + nlAfter + dollar + nlLast
|
||||
re = newRe
|
||||
}
|
||||
|
||||
// if the re is not "" at this point, then we need to make sure
|
||||
// it doesn't match against an empty path part.
|
||||
// Otherwise a/* will match a/, which it should not.
|
||||
if (re !== '' && hasMagic) {
|
||||
re = '(?=.)' + re
|
||||
}
|
||||
|
||||
if (addPatternStart) {
|
||||
re = patternStart + re
|
||||
}
|
||||
|
||||
// parsing just a piece of a larger pattern.
|
||||
if (isSub === SUBPARSE) {
|
||||
return [re, hasMagic]
|
||||
}
|
||||
|
||||
// skip the regexp for non-magical patterns
|
||||
// unescape anything in it, though, so that it'll be
|
||||
// an exact match against a file etc.
|
||||
if (!hasMagic) {
|
||||
return globUnescape(pattern)
|
||||
}
|
||||
|
||||
var flags = options.nocase ? 'i' : ''
|
||||
try {
|
||||
var regExp = new RegExp('^' + re + '$', flags)
|
||||
} catch (er) /* istanbul ignore next - should be impossible */ {
|
||||
// If it was an invalid regular expression, then it can't match
|
||||
// anything. This trick looks for a character after the end of
|
||||
// the string, which is of course impossible, except in multi-line
|
||||
// mode, but it's not a /m regex.
|
||||
return new RegExp('$.')
|
||||
}
|
||||
|
||||
regExp._glob = pattern
|
||||
regExp._src = re
|
||||
|
||||
return regExp
|
||||
}
|
||||
|
||||
minimatch.makeRe = function (pattern, options) {
|
||||
return new Minimatch(pattern, options || {}).makeRe()
|
||||
}
|
||||
|
||||
Minimatch.prototype.makeRe = makeRe
|
||||
function makeRe () {
|
||||
if (this.regexp || this.regexp === false) return this.regexp
|
||||
|
||||
// at this point, this.set is a 2d array of partial
|
||||
// pattern strings, or "**".
|
||||
//
|
||||
// It's better to use .match(). This function shouldn't
|
||||
// be used, really, but it's pretty convenient sometimes,
|
||||
// when you just want to work with a regex.
|
||||
var set = this.set
|
||||
|
||||
if (!set.length) {
|
||||
this.regexp = false
|
||||
return this.regexp
|
||||
}
|
||||
var options = this.options
|
||||
|
||||
var twoStar = options.noglobstar ? star
|
||||
: options.dot ? twoStarDot
|
||||
: twoStarNoDot
|
||||
var flags = options.nocase ? 'i' : ''
|
||||
|
||||
var re = set.map(function (pattern) {
|
||||
return pattern.map(function (p) {
|
||||
return (p === GLOBSTAR) ? twoStar
|
||||
: (typeof p === 'string') ? regExpEscape(p)
|
||||
: p._src
|
||||
}).join('\\\/')
|
||||
}).join('|')
|
||||
|
||||
// must match entire pattern
|
||||
// ending in a * or ** will make it less strict.
|
||||
re = '^(?:' + re + ')$'
|
||||
|
||||
// can match anything, as long as it's not this.
|
||||
if (this.negate) re = '^(?!' + re + ').*$'
|
||||
|
||||
try {
|
||||
this.regexp = new RegExp(re, flags)
|
||||
} catch (ex) /* istanbul ignore next - should be impossible */ {
|
||||
this.regexp = false
|
||||
}
|
||||
return this.regexp
|
||||
}
|
||||
|
||||
minimatch.match = function (list, pattern, options) {
|
||||
options = options || {}
|
||||
var mm = new Minimatch(pattern, options)
|
||||
list = list.filter(function (f) {
|
||||
return mm.match(f)
|
||||
})
|
||||
if (mm.options.nonull && !list.length) {
|
||||
list.push(pattern)
|
||||
}
|
||||
return list
|
||||
}
|
||||
|
||||
Minimatch.prototype.match = function match (f, partial) {
|
||||
if (typeof partial === 'undefined') partial = this.partial
|
||||
this.debug('match', f, this.pattern)
|
||||
// short-circuit in the case of busted things.
|
||||
// comments, etc.
|
||||
if (this.comment) return false
|
||||
if (this.empty) return f === ''
|
||||
|
||||
if (f === '/' && partial) return true
|
||||
|
||||
var options = this.options
|
||||
|
||||
// windows: need to use /, not \
|
||||
if (path.sep !== '/') {
|
||||
f = f.split(path.sep).join('/')
|
||||
}
|
||||
|
||||
// treat the test path as a set of pathparts.
|
||||
f = f.split(slashSplit)
|
||||
this.debug(this.pattern, 'split', f)
|
||||
|
||||
// just ONE of the pattern sets in this.set needs to match
|
||||
// in order for it to be valid. If negating, then just one
|
||||
// match means that we have failed.
|
||||
// Either way, return on the first hit.
|
||||
|
||||
var set = this.set
|
||||
this.debug(this.pattern, 'set', set)
|
||||
|
||||
// Find the basename of the path by looking for the last non-empty segment
|
||||
var filename
|
||||
var i
|
||||
for (i = f.length - 1; i >= 0; i--) {
|
||||
filename = f[i]
|
||||
if (filename) break
|
||||
}
|
||||
|
||||
for (i = 0; i < set.length; i++) {
|
||||
var pattern = set[i]
|
||||
var file = f
|
||||
if (options.matchBase && pattern.length === 1) {
|
||||
file = [filename]
|
||||
}
|
||||
var hit = this.matchOne(file, pattern, partial)
|
||||
if (hit) {
|
||||
if (options.flipNegate) return true
|
||||
return !this.negate
|
||||
}
|
||||
}
|
||||
|
||||
// didn't get any hits. this is success if it's a negative
|
||||
// pattern, failure otherwise.
|
||||
if (options.flipNegate) return false
|
||||
return this.negate
|
||||
}
|
||||
|
||||
// set partial to true to test if, for example,
|
||||
// "/a/b" matches the start of "/*/b/*/d"
|
||||
// Partial means, if you run out of file before you run
|
||||
// out of pattern, then that's fine, as long as all
|
||||
// the parts match.
|
||||
Minimatch.prototype.matchOne = function (file, pattern, partial) {
|
||||
var options = this.options
|
||||
|
||||
this.debug('matchOne',
|
||||
{ 'this': this, file: file, pattern: pattern })
|
||||
|
||||
this.debug('matchOne', file.length, pattern.length)
|
||||
|
||||
for (var fi = 0,
|
||||
pi = 0,
|
||||
fl = file.length,
|
||||
pl = pattern.length
|
||||
; (fi < fl) && (pi < pl)
|
||||
; fi++, pi++) {
|
||||
this.debug('matchOne loop')
|
||||
var p = pattern[pi]
|
||||
var f = file[fi]
|
||||
|
||||
this.debug(pattern, p, f)
|
||||
|
||||
// should be impossible.
|
||||
// some invalid regexp stuff in the set.
|
||||
/* istanbul ignore if */
|
||||
if (p === false) return false
|
||||
|
||||
if (p === GLOBSTAR) {
|
||||
this.debug('GLOBSTAR', [pattern, p, f])
|
||||
|
||||
// "**"
|
||||
// a/**/b/**/c would match the following:
|
||||
// a/b/x/y/z/c
|
||||
// a/x/y/z/b/c
|
||||
// a/b/x/b/x/c
|
||||
// a/b/c
|
||||
// To do this, take the rest of the pattern after
|
||||
// the **, and see if it would match the file remainder.
|
||||
// If so, return success.
|
||||
// If not, the ** "swallows" a segment, and try again.
|
||||
// This is recursively awful.
|
||||
//
|
||||
// a/**/b/**/c matching a/b/x/y/z/c
|
||||
// - a matches a
|
||||
// - doublestar
|
||||
// - matchOne(b/x/y/z/c, b/**/c)
|
||||
// - b matches b
|
||||
// - doublestar
|
||||
// - matchOne(x/y/z/c, c) -> no
|
||||
// - matchOne(y/z/c, c) -> no
|
||||
// - matchOne(z/c, c) -> no
|
||||
// - matchOne(c, c) yes, hit
|
||||
var fr = fi
|
||||
var pr = pi + 1
|
||||
if (pr === pl) {
|
||||
this.debug('** at the end')
|
||||
// a ** at the end will just swallow the rest.
|
||||
// We have found a match.
|
||||
// however, it will not swallow /.x, unless
|
||||
// options.dot is set.
|
||||
// . and .. are *never* matched by **, for explosively
|
||||
// exponential reasons.
|
||||
for (; fi < fl; fi++) {
|
||||
if (file[fi] === '.' || file[fi] === '..' ||
|
||||
(!options.dot && file[fi].charAt(0) === '.')) return false
|
||||
}
|
||||
return true
|
||||
}
|
||||
|
||||
// ok, let's see if we can swallow whatever we can.
|
||||
while (fr < fl) {
|
||||
var swallowee = file[fr]
|
||||
|
||||
this.debug('\nglobstar while', file, fr, pattern, pr, swallowee)
|
||||
|
||||
// XXX remove this slice. Just pass the start index.
|
||||
if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) {
|
||||
this.debug('globstar found match!', fr, fl, swallowee)
|
||||
// found a match.
|
||||
return true
|
||||
} else {
|
||||
// can't swallow "." or ".." ever.
|
||||
// can only swallow ".foo" when explicitly asked.
|
||||
if (swallowee === '.' || swallowee === '..' ||
|
||||
(!options.dot && swallowee.charAt(0) === '.')) {
|
||||
this.debug('dot detected!', file, fr, pattern, pr)
|
||||
break
|
||||
}
|
||||
|
||||
// ** swallows a segment, and continue.
|
||||
this.debug('globstar swallow a segment, and continue')
|
||||
fr++
|
||||
}
|
||||
}
|
||||
|
||||
// no match was found.
|
||||
// However, in partial mode, we can't say this is necessarily over.
|
||||
// If there's more *pattern* left, then
|
||||
/* istanbul ignore if */
|
||||
if (partial) {
|
||||
// ran out of file
|
||||
this.debug('\n>>> no match, partial?', file, fr, pattern, pr)
|
||||
if (fr === fl) return true
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// something other than **
|
||||
// non-magic patterns just have to match exactly
|
||||
// patterns with magic have been turned into regexps.
|
||||
var hit
|
||||
if (typeof p === 'string') {
|
||||
hit = f === p
|
||||
this.debug('string match', p, f, hit)
|
||||
} else {
|
||||
hit = f.match(p)
|
||||
this.debug('pattern match', p, f, hit)
|
||||
}
|
||||
|
||||
if (!hit) return false
|
||||
}
|
||||
|
||||
// Note: ending in / means that we'll get a final ""
|
||||
// at the end of the pattern. This can only match a
|
||||
// corresponding "" at the end of the file.
|
||||
// If the file ends in /, then it can only match a
|
||||
// a pattern that ends in /, unless the pattern just
|
||||
// doesn't have any more for it. But, a/b/ should *not*
|
||||
// match "a/b/*", even though "" matches against the
|
||||
// [^/]*? pattern, except in partial mode, where it might
|
||||
// simply not be reached yet.
|
||||
// However, a/b/ should still satisfy a/*
|
||||
|
||||
// now either we fell off the end of the pattern, or we're done.
|
||||
if (fi === fl && pi === pl) {
|
||||
// ran out of pattern and filename at the same time.
|
||||
// an exact hit!
|
||||
return true
|
||||
} else if (fi === fl) {
|
||||
// ran out of file, but still had pattern left.
|
||||
// this is ok if we're doing the match as part of
|
||||
// a glob fs traversal.
|
||||
return partial
|
||||
} else /* istanbul ignore else */ if (pi === pl) {
|
||||
// ran out of pattern, still have file left.
|
||||
// this is only acceptable if we're on the very last
|
||||
// empty segment of a file with a trailing slash.
|
||||
// a/* should match a/b/
|
||||
return (fi === fl - 1) && (file[fi] === '')
|
||||
}
|
||||
|
||||
// should be unreachable.
|
||||
/* istanbul ignore next */
|
||||
throw new Error('wtf?')
|
||||
}
|
||||
|
||||
// replace stuff like \* with *
|
||||
function globUnescape (s) {
|
||||
return s.replace(/\\(.)/g, '$1')
|
||||
}
|
||||
|
||||
function regExpEscape (s) {
|
||||
return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&')
|
||||
}
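The `matchOne` comments above trace how `GLOBSTAR` works: try the rest of the pattern against the file remainder, and on failure let `**` "swallow" one segment and retry. A minimal standalone sketch of that recursion (illustrative only, not the library's implementation: `GLOBSTAR` here is a local symbol and the other pattern parts are plain literal segments rather than compiled regexps):

```javascript
// Sketch of the "** swallows segments" recursion from matchOne above.
const GLOBSTAR = Symbol('globstar')

function matchSegments (file, pattern) {
  if (pattern.length === 0) return file.length === 0
  const [p, ...restP] = pattern
  if (p === GLOBSTAR) {
    // a ** at the end swallows everything that's left
    if (restP.length === 0) return true
    // otherwise, try the remaining pattern at every possible offset,
    // i.e. let ** swallow 0, 1, 2, ... segments
    for (let i = 0; i <= file.length; i++) {
      if (matchSegments(file.slice(i), restP)) return true
    }
    return false
  }
  // literal segment: must match exactly, then recurse
  if (file.length === 0) return false
  return file[0] === p && matchSegments(file.slice(1), restP)
}

// a/**/b/**/c vs a/b/x/y/z/c, as in the worked trace in the comments
console.log(matchSegments(['a', 'b', 'x', 'y', 'z', 'c'],
  ['a', GLOBSTAR, 'b', GLOBSTAR, 'c'])) // true
```

The real `matchOne` adds the dot-file rules (`**` never swallows `.`, `..`, or, without `options.dot`, any `.foo` segment) and partial-match handling on top of this same recursion.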
33
desktop-operator/node_modules/@electron/asar/node_modules/minimatch/package.json
generated
vendored
Normal file
@@ -0,0 +1,33 @@
{
  "author": "Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me)",
  "name": "minimatch",
  "description": "a glob matcher in javascript",
  "version": "3.1.2",
  "publishConfig": {
    "tag": "v3-legacy"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/minimatch.git"
  },
  "main": "minimatch.js",
  "scripts": {
    "test": "tap",
    "preversion": "npm test",
    "postversion": "npm publish",
    "postpublish": "git push origin --all; git push origin --tags"
  },
  "engines": {
    "node": "*"
  },
  "dependencies": {
    "brace-expansion": "^1.1.7"
  },
  "devDependencies": {
    "tap": "^15.1.6"
  },
  "license": "ISC",
  "files": [
    "minimatch.js"
  ]
}
60
desktop-operator/node_modules/@electron/asar/package.json
generated
vendored
Normal file
@@ -0,0 +1,60 @@
{
  "name": "@electron/asar",
  "description": "Creating Electron app packages",
  "version": "3.4.1",
  "main": "./lib/asar.js",
  "types": "./lib/asar.d.ts",
  "bin": {
    "asar": "./bin/asar.js"
  },
  "files": [
    "bin",
    "lib"
  ],
  "engines": {
    "node": ">=10.12.0"
  },
  "license": "MIT",
  "homepage": "https://github.com/electron/asar",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/electron/asar.git"
  },
  "bugs": {
    "url": "https://github.com/electron/asar/issues"
  },
  "publishConfig": {
    "provenance": true
  },
  "scripts": {
    "build": "tsc",
    "mocha": "xvfb-maybe electron-mocha && mocha",
    "mocha:update": "mocha --update",
    "mocha:watch": "mocha --watch",
    "test": "yarn lint && yarn mocha",
    "lint": "yarn prettier:check",
    "prettier": "prettier \"src/**/*.ts\" \"test/**/*.js\" \"*.js\"",
    "prettier:check": "yarn prettier --check",
    "prettier:write": "yarn prettier --write",
    "prepare": "tsc"
  },
  "dependencies": {
    "commander": "^5.0.0",
    "glob": "^7.1.6",
    "minimatch": "^3.0.4"
  },
  "devDependencies": {
    "@types/minimatch": "^3.0.5",
    "@types/node": "^12.0.0",
    "chai": "^4.5.0",
    "electron": "^22.0.0",
    "electron-mocha": "^13.0.1",
    "lodash": "^4.17.15",
    "mocha": "^10.1.0",
    "mocha-chai-jest-snapshot": "^1.1.6",
    "prettier": "^3.3.3",
    "rimraf": "^3.0.2",
    "typescript": "^5.5.4",
    "xvfb-maybe": "^0.2.1"
  }
}
21
desktop-operator/node_modules/@electron/get/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) Contributors to the Electron project

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
138
desktop-operator/node_modules/@electron/get/README.md
generated
vendored
Normal file
@@ -0,0 +1,138 @@
# @electron/get
|
||||
|
||||
> Download Electron release artifacts
|
||||
|
||||
[](https://circleci.com/gh/electron/get)
|
||||
[](https://npm.im/@electron/get)
|
||||
|
||||
## Usage
|
||||
|
||||
### Simple: Downloading an Electron Binary ZIP
|
||||
|
||||
```typescript
|
||||
import { download } from '@electron/get';
|
||||
|
||||
// NB: Use this syntax within an async function, Node does not have support for
|
||||
// top-level await as of Node 12.
|
||||
const zipFilePath = await download('4.0.4');
|
||||
```
|
||||
|
||||
### Advanced: Downloading a macOS Electron Symbol File

```typescript
import { downloadArtifact } from '@electron/get';

// NB: Use this syntax within an async function, Node does not have support for
// top-level await as of Node 12.
const zipFilePath = await downloadArtifact({
  version: '4.0.4',
  platform: 'darwin',
  artifactName: 'electron',
  artifactSuffix: 'symbols',
  arch: 'x64',
});
```

### Specifying a mirror

To specify another location to download Electron assets from, the following options are
available:

* `mirrorOptions` Object
  * `mirror` String (optional) - The base URL of the mirror to download from.
  * `nightlyMirror` String (optional) - The Electron nightly-specific mirror URL.
  * `customDir` String (optional) - The name of the directory to download from, often scoped by version number.
  * `customFilename` String (optional) - The name of the asset to download.
  * `resolveAssetURL` Function (optional) - A function allowing customization of the url used to download the asset.

Anatomy of a download URL, in terms of `mirrorOptions`:

```
https://github.com/electron/electron/releases/download/v4.0.4/electron-v4.0.4-linux-x64.zip
|                                                     ||     ||                           |
------------------------------------------------------ ------ ----------------------------
              mirror / nightlyMirror                 customDir        customFilename
```

Example:

```typescript
import { download } from '@electron/get';

const zipFilePath = await download('4.0.4', {
  mirrorOptions: {
    mirror: 'https://mirror.example.com/electron/',
    customDir: 'custom',
    customFilename: 'unofficial-electron-linux.zip'
  }
});
// Will download from https://mirror.example.com/electron/custom/unofficial-electron-linux.zip

const nightlyZipFilePath = await download('8.0.0-nightly.20190901', {
  mirrorOptions: {
    nightlyMirror: 'https://nightly.example.com/',
    customDir: 'nightlies',
    customFilename: 'nightly-linux.zip'
  }
});
// Will download from https://nightly.example.com/nightlies/nightly-linux.zip
```

`customDir` can have the placeholder `{{ version }}`, which will be replaced by the version
specified (without the leading `v`). For example (note that `platform` and `arch` are
top-level download options, not part of `mirrorOptions`):

```javascript
const zipFilePath = await download('4.0.4', {
  mirrorOptions: {
    mirror: 'https://mirror.example.com/electron/',
    customDir: 'version-{{ version }}'
  },
  platform: 'linux',
  arch: 'x64'
});
// Will download from https://mirror.example.com/electron/version-4.0.4/electron-v4.0.4-linux-x64.zip
```
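The placeholder substitution can be sketched in isolation. `expandCustomDir` below is a hypothetical name for illustration only; the real logic lives inside `@electron/get`:

```javascript
// Hypothetical re-implementation of the `{{ version }}` substitution described
// above; not part of the @electron/get public API.
function expandCustomDir(customDir, version) {
  // The leading `v`, if any, is stripped from the version before substitution.
  return customDir.replace(/{{\s*version\s*}}/g, version.replace(/^v/, ''));
}

console.log(expandCustomDir('version-{{ version }}', 'v4.0.4')); // version-4.0.4
```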

#### Using environment variables for mirror options

Mirror options can also be specified via the following environment variables:

* `ELECTRON_CUSTOM_DIR` - Specifies the custom directory to download from.
* `ELECTRON_CUSTOM_FILENAME` - Specifies the custom file name to download.
* `ELECTRON_MIRROR` - Specifies the URL of the server to download from if the version is not a nightly version.
* `ELECTRON_NIGHTLY_MIRROR` - Specifies the URL of the server to download from if the version is a nightly version.
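For example, the first mirror example above could instead be expressed entirely through the environment (the mirror URL is an example value; any mirror that follows the documented layout works):

```shell
# Example values only - substitute your own mirror.
export ELECTRON_MIRROR="https://mirror.example.com/electron/"
export ELECTRON_CUSTOM_DIR="custom"
export ELECTRON_CUSTOM_FILENAME="unofficial-electron-linux.zip"
```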

### Overriding the version downloaded

The version downloaded can be overridden by setting the `ELECTRON_CUSTOM_VERSION` environment variable.
Setting this environment variable will override the version passed in to `download` or `downloadArtifact`.

## How It Works

This module downloads Electron to a known place on your system and caches it
so that future requests for that asset can be returned instantly. The cache
locations are:

* Linux: `$XDG_CACHE_HOME` or `~/.cache/electron/`
* macOS: `~/Library/Caches/electron/`
* Windows: `%LOCALAPPDATA%/electron/Cache` or `~/AppData/Local/electron/Cache/`

By default, the module uses [`got`](https://github.com/sindresorhus/got) as the
downloader. As a result, you can use the same [options](https://github.com/sindresorhus/got#options)
via `downloadOptions`.

### Progress Bar

By default, a progress bar is shown when downloading an artifact for more than 30 seconds. To
disable, set the `ELECTRON_GET_NO_PROGRESS` environment variable to any non-empty value, or set
`quiet` to `true` in `downloadOptions`. If you need to monitor progress yourself via the API, set
`getProgressCallback` in `downloadOptions`, which has the same function signature as `got`'s
[`downloadProgress` event callback](https://github.com/sindresorhus/got#ondownloadprogress-progress).

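The callback wiring can be illustrated with a stub; `fakeDownload` below is a stand-in invented for this sketch (not part of `@electron/get`), and the progress object's `{ percent, transferred, total }` shape follows `got`'s `downloadProgress` event:

```javascript
// Stub downloader that emits three progress events through getProgressCallback,
// the same way got's downloadProgress events would reach it.
async function fakeDownload({ getProgressCallback }) {
  for (const percent of [0, 0.5, 1]) {
    await getProgressCallback({ percent, transferred: percent * 100, total: 100 });
  }
}

const seen = [];
fakeDownload({ getProgressCallback: async (p) => { seen.push(p.percent); } })
  .then(() => console.log(seen)); // [ 0, 0.5, 1 ]
```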
### Proxies

Downstream packages should utilize the `initializeProxy` function to add HTTP(S) proxy support. If
the environment variable `ELECTRON_GET_USE_PROXY` is set, it is called automatically.
8
desktop-operator/node_modules/@electron/get/dist/cjs/Cache.d.ts
generated
vendored
Normal file
@@ -0,0 +1,8 @@
export declare class Cache {
    private cacheRoot;
    constructor(cacheRoot?: string);
    static getCacheDirectory(downloadUrl: string): string;
    getCachePath(downloadUrl: string, fileName: string): string;
    getPathForFileInCache(url: string, fileName: string): Promise<string | null>;
    putFileInCache(url: string, currentPath: string, fileName: string): Promise<string>;
}
60
desktop-operator/node_modules/@electron/get/dist/cjs/Cache.js
generated
vendored
Normal file
@@ -0,0 +1,60 @@
"use strict";
var __rest = (this && this.__rest) || function (s, e) {
    var t = {};
    for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)
        t[p] = s[p];
    if (s != null && typeof Object.getOwnPropertySymbols === "function")
        for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {
            if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))
                t[p[i]] = s[p[i]];
        }
    return t;
};
Object.defineProperty(exports, "__esModule", { value: true });
const debug_1 = require("debug");
const env_paths_1 = require("env-paths");
const fs = require("fs-extra");
const path = require("path");
const url = require("url");
const crypto = require("crypto");
const d = debug_1.default('@electron/get:cache');
const defaultCacheRoot = env_paths_1.default('electron', {
    suffix: '',
}).cache;
class Cache {
    constructor(cacheRoot = defaultCacheRoot) {
        this.cacheRoot = cacheRoot;
    }
    static getCacheDirectory(downloadUrl) {
        const parsedDownloadUrl = url.parse(downloadUrl);
        // eslint-disable-next-line @typescript-eslint/no-unused-vars
        const { search, hash, pathname } = parsedDownloadUrl, rest = __rest(parsedDownloadUrl, ["search", "hash", "pathname"]);
        const strippedUrl = url.format(Object.assign(Object.assign({}, rest), { pathname: path.dirname(pathname || 'electron') }));
        return crypto
            .createHash('sha256')
            .update(strippedUrl)
            .digest('hex');
    }
    getCachePath(downloadUrl, fileName) {
        return path.resolve(this.cacheRoot, Cache.getCacheDirectory(downloadUrl), fileName);
    }
    async getPathForFileInCache(url, fileName) {
        const cachePath = this.getCachePath(url, fileName);
        if (await fs.pathExists(cachePath)) {
            return cachePath;
        }
        return null;
    }
    async putFileInCache(url, currentPath, fileName) {
        const cachePath = this.getCachePath(url, fileName);
        d(`Moving ${currentPath} to ${cachePath}`);
        if (await fs.pathExists(cachePath)) {
            d('* Replacing existing file');
            await fs.remove(cachePath);
        }
        await fs.move(currentPath, cachePath);
        return cachePath;
    }
}
exports.Cache = Cache;
//# sourceMappingURL=Cache.js.map
1
desktop-operator/node_modules/@electron/get/dist/cjs/Cache.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Cache.js","sourceRoot":"","sources":["../../src/Cache.ts"],"names":[],"mappings":";;;;;;;;;;;;;AAAA,iCAA0B;AAC1B,yCAAiC;AACjC,+BAA+B;AAC/B,6BAA6B;AAC7B,2BAA2B;AAC3B,iCAAiC;AAEjC,MAAM,CAAC,GAAG,eAAK,CAAC,qBAAqB,CAAC,CAAC;AAEvC,MAAM,gBAAgB,GAAG,mBAAQ,CAAC,UAAU,EAAE;IAC5C,MAAM,EAAE,EAAE;CACX,CAAC,CAAC,KAAK,CAAC;AAET,MAAa,KAAK;IAChB,YAAoB,YAAY,gBAAgB;QAA5B,cAAS,GAAT,SAAS,CAAmB;IAAG,CAAC;IAE7C,MAAM,CAAC,iBAAiB,CAAC,WAAmB;QACjD,MAAM,iBAAiB,GAAG,GAAG,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC;QACjD,6DAA6D;QAC7D,MAAM,EAAE,MAAM,EAAE,IAAI,EAAE,QAAQ,KAAc,iBAAiB,EAA7B,gEAA6B,CAAC;QAC9D,MAAM,WAAW,GAAG,GAAG,CAAC,MAAM,iCAAM,IAAI,KAAE,QAAQ,EAAE,IAAI,CAAC,OAAO,CAAC,QAAQ,IAAI,UAAU,CAAC,IAAG,CAAC;QAE5F,OAAO,MAAM;aACV,UAAU,CAAC,QAAQ,CAAC;aACpB,MAAM,CAAC,WAAW,CAAC;aACnB,MAAM,CAAC,KAAK,CAAC,CAAC;IACnB,CAAC;IAEM,YAAY,CAAC,WAAmB,EAAE,QAAgB;QACvD,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,SAAS,EAAE,KAAK,CAAC,iBAAiB,CAAC,WAAW,CAAC,EAAE,QAAQ,CAAC,CAAC;IACtF,CAAC;IAEM,KAAK,CAAC,qBAAqB,CAAC,GAAW,EAAE,QAAgB;QAC9D,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QACnD,IAAI,MAAM,EAAE,CAAC,UAAU,CAAC,SAAS,CAAC,EAAE;YAClC,OAAO,SAAS,CAAC;SAClB;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAEM,KAAK,CAAC,cAAc,CAAC,GAAW,EAAE,WAAmB,EAAE,QAAgB;QAC5E,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QACnD,CAAC,CAAC,UAAU,WAAW,OAAO,SAAS,EAAE,CAAC,CAAC;QAC3C,IAAI,MAAM,EAAE,CAAC,UAAU,CAAC,SAAS,CAAC,EAAE;YAClC,CAAC,CAAC,2BAA2B,CAAC,CAAC;YAC/B,MAAM,EAAE,CAAC,MAAM,CAAC,SAAS,CAAC,CAAC;SAC5B;QAED,MAAM,EAAE,CAAC,IAAI,CAAC,WAAW,EAAE,SAAS,CAAC,CAAC;QAEtC,OAAO,SAAS,CAAC;IACnB,CAAC;CACF;AAxCD,sBAwCC"}
3
desktop-operator/node_modules/@electron/get/dist/cjs/Downloader.d.ts
generated
vendored
Normal file
@@ -0,0 +1,3 @@
export interface Downloader<T> {
    download(url: string, targetFilePath: string, options: T): Promise<void>;
}
3
desktop-operator/node_modules/@electron/get/dist/cjs/Downloader.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=Downloader.js.map
1
desktop-operator/node_modules/@electron/get/dist/cjs/Downloader.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Downloader.js","sourceRoot":"","sources":["../../src/Downloader.ts"],"names":[],"mappings":""}
21
desktop-operator/node_modules/@electron/get/dist/cjs/GotDownloader.d.ts
generated
vendored
Normal file
@@ -0,0 +1,21 @@
import { Progress as GotProgress, Options as GotOptions } from 'got';
import { Downloader } from './Downloader';
/**
 * See [`got#options`](https://github.com/sindresorhus/got#options) for possible keys/values.
 */
export declare type GotDownloaderOptions = (GotOptions & {
    isStream?: true;
}) & {
    /**
     * if defined, triggers every time `got`'s `downloadProgress` event callback is triggered.
     */
    getProgressCallback?: (progress: GotProgress) => Promise<void>;
    /**
     * if `true`, disables the console progress bar (setting the `ELECTRON_GET_NO_PROGRESS`
     * environment variable to a non-empty value also does this).
     */
    quiet?: boolean;
};
export declare class GotDownloader implements Downloader<GotDownloaderOptions> {
    download(url: string, targetFilePath: string, options?: GotDownloaderOptions): Promise<void>;
}
Some files were not shown because too many files have changed in this diff.