Compare commits
106 Commits
Commit SHA1s:

72642351d0, 51a01ea03f, d9a78ea837, 5b221d5bd5, c36549ca69, 7e79bef8da, e3b4a4ba19, 30d216c7ca, d80483adc2, 1cda391dfe, 375ec36781, 4ad1c05444, c88eeb0b12, c6261aba6a, a010b967b9, af6547f254, ba235b0b93, 1bfde96e46, e1f9b4b6d3, 95cf4fbed8, 9ddc7d31bb, 83626017b9, b9372f0ef0, db97a7df14, 575fca3806, a1c8f42435, a3c2680fec, 12dade0240, 2a528a126c, 8839080069, 8f66d75eb3, 56ee681aec, 6db03f05a9, 068da94e2a, 4b824b2d9f, 284c5e7aa6, 036cd3e066, 479c7a3f3f, 0404d870ad, 93a53763e0, c20d743286, ba938f64c5, af00d69e5c, bc47da504c, 5a24c891c0, 1103df98c1, 74920e2e2f, 75775f2798, fad0f1060b, 0ca359e509, 1d0ee31001, 20a0a59670, 9a00304a93, 55b00bf884, e85f12977f, 940346e2f4, 1854e6bb17, 26b2ef0abb, 9cceaacd14, 1ed13f7f88, 729aa30253, b8bbc9c32f, a263e3eb2c, 10bae4f98b, b02aef2af9, 56c0b633c8, 4e8e8eba66, d5638b922d, dc695c9a04, 52909258ca, e9b9801ac1, 86a358d568, 97c5bfaa7d, 8d0c110415, a131f4a11b, 335873a7f6, 7446e07a8c, 693f7b482a, 1d4a13466f, 17844d4c28, 55d0e3141c, a967eb1080, 21ff749cf3, 18862bb8e0, 27833615b7, 00fae5cadd, 4fcbd5c4f7, bb8fd0646a, 1218adf5f2, 818bf40a9c, 254612a49b, 92101e249a, a18ab484cc, 7af9d67770, d63afcce89, 15d0969cd9, 5574b50d20, 662c903bf3, 545043e1d6, 8f10ff8f96, 62f3bd94de, 253b1868ec, c4aefb6175, 956cad0da4, 83d8df84bf, 0c058fa162
.gitignore (vendored), 9 lines changed:

```diff
@@ -28,3 +28,12 @@ coverage/
 npm-debug.log*
 yarn-debug.log*
 yarn-error.log*
+
+# Forgejo deployment runtime files
+deploy/forgejo/.env
+deploy/forgejo/forgejo/
+deploy/forgejo/postgres/
+deploy/forgejo/caddy/data/
+deploy/forgejo/caddy/config/
+deploy/forgejo/caddy/logs/
+deploy/forgejo/backups/
```
CLAUDE.md (new file), 23 lines:

```diff
@@ -0,0 +1,23 @@
+## Release + Update Source (Important)
+
+- Primary platform is `https://git.24-music.de`
+- Default repo: `Administrator/real-debrid-downloader`
+- No longer release primarily via Codeberg/GitHub
+
+## Releasing
+
+1. Set the token:
+   - PowerShell: `$env:GITEA_TOKEN="<token>"`
+2. Run the release:
+   - `npm run release:gitea -- <version> [notes]`
+
+The script:
+- bumps `package.json`
+- builds the Windows artifacts
+- pushes `main` + tag
+- creates the release on `git.24-music.de`
+- uploads the assets
+
+## Auto-Update
+
+- The updater currently uses `git.24-music.de` as the default source
```
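The CLAUDE.md notes above say the release script creates the release on `git.24-music.de`. Gitea/Forgejo expose releases under `/api/v1/repos/{owner}/{repo}/releases`; a minimal sketch of building that endpoint from the values CLAUDE.md documents (the helper name is hypothetical, not taken from the actual script):

```javascript
// Sketch only: construct the Gitea/Forgejo release API endpoint.
// Host, owner, and repo come from CLAUDE.md; releaseApiUrl is a
// hypothetical helper, not part of the repository's code.
function releaseApiUrl(base, owner, repo) {
  return `${base}/api/v1/repos/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}/releases`;
}

const url = releaseApiUrl(
  "https://git.24-music.de",
  "Administrator",
  "real-debrid-downloader"
);
```

A POST to that URL with an `Authorization: token $GITEA_TOKEN` header would create the release, which matches the token setup step above.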
README.md, 33 lines changed:

````diff
@@ -1,6 +1,6 @@
 # Multi Debrid Downloader
 
-Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** with fast queue management, automatic extraction, and robust error handling.
+Desktop downloader with fast queue management, automatic extraction, and robust error handling.
 
 
 
@@ -65,18 +65,18 @@ Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** w
 - Minimize-to-tray with tray menu controls.
 - Speed limits globally or per download.
 - Bandwidth schedules for time-based speed profiles.
-- Built-in auto-updater via Codeberg Releases.
+- Built-in auto-updater via `git.24-music.de` Releases.
 - Long path support (>260 characters) on Windows.
 
 ## Installation
 
 ### Option A: prebuilt releases (recommended)
 
-1. Download a release from the Codeberg Releases page.
+1. Download a release from the `git.24-music.de` Releases page.
 2. Run the installer or portable build.
 3. Add your debrid tokens in Settings.
 
-Releases: `https://codeberg.org/Sucukdeluxe/real-debrid-downloader/releases`
+Releases: `https://git.24-music.de/Administrator/real-debrid-downloader/releases`
 
 ### Option B: build from source
 
@@ -103,21 +103,34 @@ npm run dev
 | `npm test` | Runs Vitest unit tests |
 | `npm run self-check` | Runs integrated end-to-end self-checks |
 | `npm run release:win` | Creates Windows installer and portable build |
-| `npm run release:codeberg -- <version> [notes]` | One-command version bump + build + tag + Codeberg release upload |
+| `npm run release:gitea -- <version> [notes]` | One-command version bump + build + tag + release upload to `git.24-music.de` |
+| `npm run release:codeberg -- <version> [notes]` | Legacy path for old Codeberg workflow |
 
-### One-command Codeberg release
+### One-command git.24-music release
 
 ```bash
-npm run release:codeberg -- 1.4.42 "- Maintenance update"
+npm run release:gitea -- 1.6.31 "- Maintenance update"
 ```
 
 This command will:
 
 1. Bump `package.json` version.
 2. Build setup/portable artifacts (`npm run release:win`).
-3. Commit and push `main` to your Codeberg remote.
+3. Commit and push `main` to your `git.24-music.de` remote.
 4. Create and push tag `v<version>`.
-5. Create/update the Codeberg release and upload required assets.
+5. Create/update the Gitea release and upload required assets.
 
+Required once before release:
+
+```bash
+git remote add gitea https://git.24-music.de/<user>/<repo>.git
+```
+
+PowerShell token setup:
+
+```powershell
+$env:GITEA_TOKEN="<dein-token>"
+```
+
 ## Typical workflow
 
@@ -154,7 +167,7 @@ The app stores runtime files in Electron's `userData` directory, including:
 
 ## Changelog
 
-Release history is available on [Codeberg Releases](https://codeberg.org/Sucukdeluxe/real-debrid-downloader/releases).
+Release history is available on [git.24-music.de Releases](https://git.24-music.de/Administrator/real-debrid-downloader/releases).
 
 ## License
 
````
Deleted file, 75 lines:

```diff
@@ -1,75 +0,0 @@
-import fs from "node:fs";
-import path from "node:path";
-import { spawnSync } from "node:child_process";
-
-const credResult = spawnSync("git", ["credential", "fill"], {
-  input: "protocol=https\nhost=codeberg.org\n\n",
-  encoding: "utf8",
-  stdio: ["pipe", "pipe", "pipe"]
-});
-const creds = new Map();
-for (const line of credResult.stdout.split(/\r?\n/)) {
-  if (line.includes("=")) {
-    const [k, v] = line.split("=", 2);
-    creds.set(k, v);
-  }
-}
-const auth = "Basic " + Buffer.from(creds.get("username") + ":" + creds.get("password")).toString("base64");
-const owner = "Sucukdeluxe";
-const repo = "real-debrid-downloader";
-const tag = "v1.5.35";
-const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
-
-async function main() {
-  await fetch(baseApi, {
-    method: "PATCH",
-    headers: { Authorization: auth, "Content-Type": "application/json" },
-    body: JSON.stringify({ has_releases: true })
-  });
-
-  const createRes = await fetch(`${baseApi}/releases`, {
-    method: "POST",
-    headers: { Authorization: auth, "Content-Type": "application/json", Accept: "application/json" },
-    body: JSON.stringify({
-      tag_name: tag,
-      target_commitish: "main",
-      name: tag,
-      body: "- Fix: Fortschritt zeigt jetzt kombinierten Wert (Download + Entpacken)\n- Fix: Pausieren zeigt nicht mehr 'Warte auf Daten'\n- Pixel-perfekte Dual-Layer Progress-Bar Texte (clip-path)",
-      draft: false,
-      prerelease: false
-    })
-  });
-  const release = await createRes.json();
-  if (!createRes.ok) {
-    console.error("Create failed:", JSON.stringify(release));
-    process.exit(1);
-  }
-  console.log("Release created:", release.id);
-
-  const files = [
-    "Real-Debrid-Downloader Setup 1.5.35.exe",
-    "Real-Debrid-Downloader 1.5.35.exe",
-    "latest.yml",
-    "Real-Debrid-Downloader Setup 1.5.35.exe.blockmap"
-  ];
-  for (const f of files) {
-    const filePath = path.join("release", f);
-    const data = fs.readFileSync(filePath);
-    const uploadUrl = `${baseApi}/releases/${release.id}/assets?name=${encodeURIComponent(f)}`;
-    const res = await fetch(uploadUrl, {
-      method: "POST",
-      headers: { Authorization: auth, "Content-Type": "application/octet-stream" },
-      body: data
-    });
-    if (res.ok) {
-      console.log("Uploaded:", f);
-    } else if (res.status === 409 || res.status === 422) {
-      console.log("Skipped existing:", f);
-    } else {
-      console.error("Upload failed for", f, ":", res.status);
-    }
-  }
-  console.log(`Done! https://codeberg.org/${owner}/${repo}/releases/tag/${tag}`);
-}
-
-main().catch(e => { console.error(e.message); process.exit(1); });
```
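The deleted script above resolved its token by parsing `git credential fill` output, which is a series of `key=value` lines. That parsing step can be sketched in isolation like this (the sample input is illustrative, not real credentials):

```javascript
// Sketch of the key=value parsing the removed script performed on
// `git credential fill` output. The sample values below are made up.
function parseCredentialOutput(stdout) {
  const creds = new Map();
  for (const line of stdout.split(/\r?\n/)) {
    const eq = line.indexOf("=");
    if (eq > 0) {
      creds.set(line.slice(0, eq), line.slice(eq + 1));
    }
  }
  return creds;
}

function basicAuthHeader(creds) {
  // Same "Basic <base64(user:pass)>" construction as the deleted script.
  const token = `${creds.get("username")}:${creds.get("password")}`;
  return "Basic " + Buffer.from(token).toString("base64");
}

const sample = "protocol=https\nhost=codeberg.org\nusername=alice\npassword=s3cret\n";
const creds = parseCredentialOutput(sample);
const auth = basicAuthHeader(creds);
```

Using `indexOf("=")` rather than `split("=", 2)` avoids truncating values that themselves contain `=`, which base64-like passwords can.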
```diff
@@ -25,11 +25,11 @@ AppPublisher=Sucukdeluxe
 DefaultDirName={autopf}\{#MyAppName}
 DefaultGroupName={#MyAppName}
 OutputDir={#MyOutputDir}
-OutputBaseFilename=Real-Debrid-Downloader-Setup-{#MyAppVersion}
+OutputBaseFilename=Real-Debrid-Downloader Setup {#MyAppVersion}
 Compression=lzma
 SolidCompression=yes
 WizardStyle=modern
-PrivilegesRequired=admin
+PrivilegesRequired=lowest
 ArchitecturesInstallIn64BitMode=x64compatible
 UninstallDisplayIcon={app}\{#MyAppExeName}
 SetupIconFile={#MyIconFile}
@@ -39,8 +39,8 @@ Name: "german"; MessagesFile: "compiler:Languages\German.isl"
 Name: "english"; MessagesFile: "compiler:Default.isl"
 
 [Files]
-Source: "{#MySourceDir}\*"; DestDir: "{app}"; Flags: ignoreversion recursesubdirs createallsubdirs
-Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"; Flags: ignoreversion
+Source: "{#MySourceDir}\*"; DestDir: "{app}"; Flags: recursesubdirs createallsubdirs
+Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"
 
 [Icons]
 Name: "{group}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; IconFilename: "{app}\app_icon.ico"
```
```diff
@@ -1,7 +1,7 @@
 {
   "name": "real-debrid-downloader",
-  "version": "1.5.74",
-  "description": "Real-Debrid Downloader Desktop (Electron + React + TypeScript)",
+  "version": "1.6.55",
+  "description": "Desktop downloader",
   "main": "build/main/main/main.js",
   "author": "Sucukdeluxe",
   "license": "MIT",
@@ -17,7 +17,8 @@
     "test": "vitest run",
     "self-check": "tsx tests/self-check.ts",
     "release:win": "npm run build && electron-builder --publish never --win nsis portable",
-    "release:codeberg": "node scripts/release_codeberg.mjs"
+    "release:gitea": "node scripts/release_gitea.mjs",
+    "release:forgejo": "node scripts/release_gitea.mjs"
   },
   "dependencies": {
     "adm-zip": "^0.5.16",
```
Binary file not shown. (×11)
@ -3,7 +3,9 @@ package com.sucukdeluxe.extractor;
|
||||
import net.lingala.zip4j.ZipFile;
|
||||
import net.lingala.zip4j.exception.ZipException;
|
||||
import net.lingala.zip4j.model.FileHeader;
|
||||
import net.sf.sevenzipjbinding.ExtractAskMode;
|
||||
import net.sf.sevenzipjbinding.ExtractOperationResult;
|
||||
import net.sf.sevenzipjbinding.IArchiveExtractCallback;
|
||||
import net.sf.sevenzipjbinding.IArchiveOpenCallback;
|
||||
import net.sf.sevenzipjbinding.IArchiveOpenVolumeCallback;
|
||||
import net.sf.sevenzipjbinding.IInArchive;
|
||||
@ -26,6 +28,7 @@ import java.io.InputStream;
|
||||
import java.io.OutputStream;
|
||||
import java.io.RandomAccessFile;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.nio.file.Files;
|
||||
import java.util.ArrayList;
|
||||
import java.util.Base64;
|
||||
import java.util.HashMap;
|
||||
@ -42,12 +45,18 @@ public final class JBindExtractorMain {
|
||||
private static final Pattern NUMBERED_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.zip\\.\\d{3}$");
|
||||
private static final Pattern OLD_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.z\\d{2,3}$");
|
||||
private static final Pattern SEVEN_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.7z\\.001$");
|
||||
private static final Pattern DIGIT_SUFFIX_RE = Pattern.compile("\\d{2,3}");
|
||||
private static final Pattern WINDOWS_SPECIAL_CHARS_RE = Pattern.compile("[:<>*?\"\\|]");
|
||||
private static volatile boolean sevenZipInitialized = false;
|
||||
|
||||
private JBindExtractorMain() {
|
||||
}
|
||||
|
||||
public static void main(String[] args) {
|
||||
if (args.length == 1 && "--daemon".equals(args[0])) {
|
||||
runDaemon();
|
||||
return;
|
||||
}
|
||||
int exit = 1;
|
||||
try {
|
||||
ExtractionRequest request = parseArgs(args);
|
||||
@ -62,6 +71,127 @@ public final class JBindExtractorMain {
|
||||
System.exit(exit);
|
||||
}
|
||||
|
||||
private static void runDaemon() {
|
||||
System.out.println("RD_DAEMON_READY");
|
||||
System.out.flush();
|
||||
java.io.BufferedReader reader = new java.io.BufferedReader(
|
||||
new java.io.InputStreamReader(System.in, StandardCharsets.UTF_8));
|
||||
try {
|
||||
String line;
|
||||
while ((line = reader.readLine()) != null) {
|
||||
line = line.trim();
|
||||
if (line.isEmpty()) {
|
||||
continue;
|
||||
}
|
||||
int exitCode = 1;
|
||||
try {
|
||||
ExtractionRequest request = parseDaemonRequest(line);
|
||||
exitCode = runExtraction(request);
|
||||
} catch (IllegalArgumentException error) {
|
||||
emitError("Argumentfehler: " + safeMessage(error));
|
||||
exitCode = 2;
|
||||
} catch (Throwable error) {
|
||||
emitError(safeMessage(error));
|
||||
exitCode = 1;
|
||||
}
|
||||
System.out.println("RD_REQUEST_DONE " + exitCode);
|
||||
System.out.flush();
|
||||
}
|
||||
} catch (IOException ignored) {
|
||||
// stdin closed — parent process exited
|
||||
}
|
||||
}
|
||||
|
||||
private static ExtractionRequest parseDaemonRequest(String jsonLine) {
|
||||
// Minimal JSON parsing without external dependencies.
|
||||
// Expected format: {"archive":"...","target":"...","conflict":"...","backend":"...","passwords":["...","..."]}
|
||||
ExtractionRequest request = new ExtractionRequest();
|
||||
request.archiveFile = new File(extractJsonString(jsonLine, "archive"));
|
||||
request.targetDir = new File(extractJsonString(jsonLine, "target"));
|
||||
String conflict = extractJsonString(jsonLine, "conflict");
|
||||
if (conflict.length() > 0) {
|
||||
request.conflictMode = ConflictMode.fromValue(conflict);
|
||||
}
|
||||
String backend = extractJsonString(jsonLine, "backend");
|
||||
if (backend.length() > 0) {
|
||||
request.backend = Backend.fromValue(backend);
|
||||
}
|
||||
// Parse passwords array
|
||||
int pwStart = jsonLine.indexOf("\"passwords\"");
|
||||
if (pwStart >= 0) {
|
||||
int arrStart = jsonLine.indexOf('[', pwStart);
|
||||
int arrEnd = jsonLine.indexOf(']', arrStart);
|
||||
if (arrStart >= 0 && arrEnd > arrStart) {
|
||||
String arrContent = jsonLine.substring(arrStart + 1, arrEnd);
|
||||
int idx = 0;
|
||||
while (idx < arrContent.length()) {
|
||||
int qStart = arrContent.indexOf('"', idx);
|
||||
if (qStart < 0) break;
|
||||
int qEnd = findClosingQuote(arrContent, qStart + 1);
|
||||
if (qEnd < 0) break;
|
||||
request.passwords.add(unescapeJsonString(arrContent.substring(qStart + 1, qEnd)));
|
||||
idx = qEnd + 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
if (request.archiveFile == null || !request.archiveFile.exists() || !request.archiveFile.isFile()) {
|
||||
throw new IllegalArgumentException("Archiv nicht gefunden: " +
|
||||
(request.archiveFile == null ? "null" : request.archiveFile.getAbsolutePath()));
|
||||
}
|
||||
if (request.targetDir == null) {
|
||||
throw new IllegalArgumentException("--target fehlt");
|
||||
}
|
||||
return request;
|
||||
}
|
||||
|
||||
private static String extractJsonString(String json, String key) {
|
||||
String search = "\"" + key + "\"";
|
||||
int keyIdx = json.indexOf(search);
|
||||
if (keyIdx < 0) return "";
|
||||
int colonIdx = json.indexOf(':', keyIdx + search.length());
|
||||
if (colonIdx < 0) return "";
|
||||
int qStart = json.indexOf('"', colonIdx + 1);
|
||||
if (qStart < 0) return "";
|
||||
int qEnd = findClosingQuote(json, qStart + 1);
|
||||
if (qEnd < 0) return "";
|
||||
return unescapeJsonString(json.substring(qStart + 1, qEnd));
|
||||
}
|
||||
|
||||
private static int findClosingQuote(String s, int from) {
|
||||
for (int i = from; i < s.length(); i++) {
|
||||
char c = s.charAt(i);
|
||||
if (c == '\\') {
|
||||
i++; // skip escaped character
|
||||
continue;
|
||||
}
|
||||
if (c == '"') return i;
|
||||
}
|
||||
return -1;
|
||||
}
|
||||
|
||||
private static String unescapeJsonString(String s) {
|
||||
if (s.indexOf('\\') < 0) return s;
|
||||
StringBuilder sb = new StringBuilder(s.length());
|
||||
for (int i = 0; i < s.length(); i++) {
|
||||
char c = s.charAt(i);
|
||||
if (c == '\\' && i + 1 < s.length()) {
|
||||
char next = s.charAt(i + 1);
|
||||
switch (next) {
|
||||
case '"': sb.append('"'); i++; break;
|
||||
case '\\': sb.append('\\'); i++; break;
|
||||
case '/': sb.append('/'); i++; break;
|
||||
case 'n': sb.append('\n'); i++; break;
|
||||
case 'r': sb.append('\r'); i++; break;
|
||||
case 't': sb.append('\t'); i++; break;
|
||||
default: sb.append(c); break;
|
||||
}
|
||||
} else {
|
||||
sb.append(c);
|
||||
}
|
||||
}
|
||||
return sb.toString();
|
||||
}
|
||||
|
||||
private static int runExtraction(ExtractionRequest request) throws Exception {
|
||||
List<String> passwords = normalizePasswords(request.passwords);
|
||||
Exception lastError = null;
|
||||
@ -152,30 +282,35 @@ public final class JBindExtractorMain {
|
||||
}
|
||||
|
||||
ensureDirectory(output.getParentFile());
|
||||
rejectSymlink(output);
|
||||
long[] remaining = new long[] { itemUnits };
|
||||
boolean extractionSuccess = false;
|
||||
try {
|
||||
InputStream in = zipFile.getInputStream(header);
|
||||
OutputStream out = new FileOutputStream(output);
|
||||
try {
|
||||
byte[] buffer = new byte[BUFFER_SIZE];
|
||||
while (true) {
|
||||
int read = in.read(buffer);
|
||||
if (read < 0) {
|
||||
break;
|
||||
OutputStream out = new FileOutputStream(output);
|
||||
try {
|
||||
byte[] buffer = new byte[BUFFER_SIZE];
|
||||
while (true) {
|
||||
int read = in.read(buffer);
|
||||
if (read < 0) {
|
||||
break;
|
||||
}
|
||||
if (read == 0) {
|
||||
continue;
|
||||
}
|
||||
out.write(buffer, 0, read);
|
||||
long accounted = Math.min(remaining[0], (long) read);
|
||||
remaining[0] -= accounted;
|
||||
progress.advance(accounted);
|
||||
}
|
||||
if (read == 0) {
|
||||
continue;
|
||||
} finally {
|
||||
try {
|
||||
out.close();
|
||||
} catch (Throwable ignored) {
|
||||
}
|
||||
out.write(buffer, 0, read);
|
||||
long accounted = Math.min(remaining[0], (long) read);
|
||||
remaining[0] -= accounted;
|
||||
progress.advance(accounted);
|
||||
}
|
||||
} finally {
|
||||
try {
|
||||
out.close();
|
||||
} catch (Throwable ignored) {
|
||||
}
|
||||
try {
|
||||
in.close();
|
||||
} catch (Throwable ignored) {
|
||||
@ -188,11 +323,19 @@ public final class JBindExtractorMain {
|
||||
if (modified > 0) {
|
||||
output.setLastModified(modified);
|
||||
}
|
||||
extractionSuccess = true;
|
||||
} catch (ZipException error) {
|
||||
if (isWrongPassword(error, encrypted)) {
|
||||
throw new WrongPasswordException(error);
|
||||
}
|
||||
throw error;
|
||||
} finally {
|
||||
if (!extractionSuccess && output.exists()) {
|
||||
try {
|
||||
output.delete();
|
||||
} catch (Throwable ignored) {
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@ -219,98 +362,99 @@ public final class JBindExtractorMain {
|
||||
try {
|
||||
context = openSevenZipArchive(request.archiveFile, password);
|
||||
IInArchive archive = context.archive;
|
||||
ISimpleInArchive simple = archive.getSimpleInterface();
|
||||
ISimpleInArchiveItem[] items = simple.getArchiveItems();
|
||||
int itemCount = archive.getNumberOfItems();
|
||||
if (itemCount <= 0) {
|
||||
throw new IOException("Archiv enthalt keine Eintrage oder konnte nicht gelesen werden: " + request.archiveFile.getAbsolutePath());
|
||||
}
|
||||
|
||||
// Pre-scan: collect file indices, sizes, output paths, and detect encryption
|
||||
long totalUnits = 0;
|
||||
boolean encrypted = false;
|
||||
for (ISimpleInArchiveItem item : items) {
|
||||
if (item == null || item.isFolder()) {
|
||||
continue;
|
||||
}
|
||||
try {
|
||||
encrypted = encrypted || item.isEncrypted();
|
||||
} catch (Throwable ignored) {
|
||||
// ignore encrypted flag read issues
|
||||
}
|
||||
totalUnits += safeSize(item.getSize());
|
||||
}
|
||||
ProgressTracker progress = new ProgressTracker(totalUnits);
|
||||
progress.emitStart();
|
||||
|
||||
List<Integer> fileIndices = new ArrayList<Integer>();
|
||||
List<File> outputFiles = new ArrayList<File>();
|
||||
List<Long> fileSizes = new ArrayList<Long>();
|
||||
Set<String> reserved = new HashSet<String>();
|
||||
for (ISimpleInArchiveItem item : items) {
|
||||
if (item == null) {
|
||||
continue;
|
||||
}
|
||||
|
||||
String entryName = normalizeEntryName(item.getPath(), "item-" + item.getItemIndex());
|
||||
if (item.isFolder()) {
|
||||
for (int i = 0; i < itemCount; i++) {
|
||||
Boolean isFolder = (Boolean) archive.getProperty(i, PropID.IS_FOLDER);
|
||||
String entryPath = (String) archive.getProperty(i, PropID.PATH);
|
||||
String entryName = normalizeEntryName(entryPath, "item-" + i);
|
||||
|
||||
if (Boolean.TRUE.equals(isFolder)) {
|
||||
File dir = resolveDirectory(request.targetDir, entryName);
|
||||
ensureDirectory(dir);
|
||||
reserved.add(pathKey(dir));
|
||||
continue;
|
||||
}
|
||||
|
||||
long itemUnits = safeSize(item.getSize());
|
||||
File output = resolveOutputFile(request.targetDir, entryName, request.conflictMode, reserved);
|
||||
if (output == null) {
|
||||
progress.advance(itemUnits);
|
||||
continue;
|
||||
}
|
||||
|
||||
ensureDirectory(output.getParentFile());
|
||||
final FileOutputStream out = new FileOutputStream(output);
|
||||
final long[] remaining = new long[] { itemUnits };
|
||||
try {
|
||||
ExtractOperationResult result = item.extractSlow(new ISequentialOutStream() {
|
||||
@Override
|
||||
public int write(byte[] data) throws SevenZipException {
|
||||
if (data == null || data.length == 0) {
|
||||
return 0;
|
||||
}
|
||||
try {
|
||||
out.write(data);
|
||||
} catch (IOException error) {
|
||||
throw new SevenZipException("Fehler beim Schreiben: " + error.getMessage(), error);
|
||||
}
|
||||
long accounted = Math.min(remaining[0], (long) data.length);
|
||||
remaining[0] -= accounted;
|
||||
progress.advance(accounted);
|
||||
return data.length;
|
||||
}
|
||||
}, password == null ? "" : password);
|
||||
|
||||
if (remaining[0] > 0) {
|
||||
progress.advance(remaining[0]);
|
||||
}
|
||||
|
||||
if (result != ExtractOperationResult.OK) {
|
||||
if (isPasswordFailure(result, encrypted)) {
|
||||
throw new WrongPasswordException(new IOException("Falsches Passwort"));
|
||||
}
|
||||
throw new IOException("7z-Fehler: " + result.name());
|
||||
}
|
||||
} catch (SevenZipException error) {
|
||||
if (looksLikeWrongPassword(error, encrypted)) {
|
||||
throw new WrongPasswordException(error);
|
||||
}
|
||||
throw error;
|
||||
} finally {
|
||||
try {
|
||||
out.close();
|
||||
} catch (Throwable ignored) {
|
||||
}
|
||||
}
|
||||
|
||||
try {
|
||||
java.util.Date modified = item.getLastWriteTime();
|
||||
if (modified != null) {
|
||||
output.setLastModified(modified.getTime());
|
||||
}
|
||||
Boolean isEncrypted = (Boolean) archive.getProperty(i, PropID.ENCRYPTED);
|
||||
encrypted = encrypted || Boolean.TRUE.equals(isEncrypted);
|
||||
} catch (Throwable ignored) {
|
||||
// best effort
|
||||
// ignore encrypted flag read issues
|
||||
}
|
||||
|
||||
Long rawSize = (Long) archive.getProperty(i, PropID.SIZE);
|
||||
long itemSize = safeSize(rawSize);
|
||||
totalUnits += itemSize;
|
||||
|
||||
File output = resolveOutputFile(request.targetDir, entryName, request.conflictMode, reserved);
|
||||
fileIndices.add(i);
|
||||
outputFiles.add(output); // null if skipped
|
||||
fileSizes.add(itemSize);
|
||||
}
|
||||
|
||||
if (fileIndices.isEmpty()) {
|
||||
// All items are folders or skipped
|
||||
ProgressTracker progress = new ProgressTracker(1);
|
||||
progress.emitStart();
|
||||
progress.emitDone();
|
||||
return;
|
||||
}
|
||||
|
||||
ProgressTracker progress = new ProgressTracker(totalUnits);
|
||||
progress.emitStart();
|
||||
|
||||
// Build index array for bulk extract
|
||||
int[] indices = new int[fileIndices.size()];
|
||||
for (int i = 0; i < fileIndices.size(); i++) {
|
||||
indices[i] = fileIndices.get(i);
|
||||
}
|
||||
|
||||
// Map from archive index to our position in fileIndices/outputFiles
|
||||
Map<Integer, Integer> indexToPos = new HashMap<Integer, Integer>();
|
||||
for (int i = 0; i < fileIndices.size(); i++) {
|
||||
indexToPos.put(fileIndices.get(i), i);
|
||||
}
|
||||
|
||||
// Bulk extraction state
|
||||
final boolean encryptedFinal = encrypted;
|
||||
final String effectivePassword = password == null ? "" : password;
|
||||
final File[] currentOutput = new File[1];
|
||||
final FileOutputStream[] currentStream = new FileOutputStream[1];
|
||||
final boolean[] currentSuccess = new boolean[1];
|
||||
final long[] currentRemaining = new long[1];
|
||||
final Throwable[] firstError = new Throwable[1];
|
||||
final int[] currentPos = new int[] { -1 };
|
||||
|
||||
try {
|
||||
archive.extract(indices, false, new BulkExtractCallback(
|
||||
archive, indexToPos, fileIndices, outputFiles, fileSizes,
|
||||
progress, encryptedFinal, effectivePassword, currentOutput,
|
||||
currentStream, currentSuccess, currentRemaining, currentPos, firstError
|
||||
));
|
||||
} catch (SevenZipException error) {
|
||||
if (looksLikeWrongPassword(error, encryptedFinal)) {
|
||||
throw new WrongPasswordException(error);
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
|
||||
if (firstError[0] != null) {
|
||||
if (firstError[0] instanceof WrongPasswordException) {
|
||||
throw (WrongPasswordException) firstError[0];
|
||||
}
|
||||
throw (Exception) firstError[0];
|
||||
}
|
||||
|
||||
progress.emitDone();
|
||||
@ -328,14 +472,31 @@ public final class JBindExtractorMain {
|
||||
|
||||
if (SEVEN_ZIP_SPLIT_RE.matcher(nameLower).matches()) {
|
||||
VolumedArchiveInStream volumed = new VolumedArchiveInStream(archiveFile.getName(), callback);
|
||||
IInArchive archive = SevenZip.openInArchive(null, volumed, callback);
|
||||
return new SevenZipArchiveContext(archive, null, volumed, callback);
|
||||
try {
|
||||
IInArchive archive = SevenZip.openInArchive(null, volumed, callback);
|
||||
return new SevenZipArchiveContext(archive, null, volumed, callback);
|
||||
} catch (Exception error) {
|
||||
callback.close();
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
RandomAccessFile raf = new RandomAccessFile(archiveFile, "r");
|
||||
RandomAccessFileInStream stream = new RandomAccessFileInStream(raf);
|
||||
IInArchive archive = SevenZip.openInArchive(null, stream, callback);
|
||||
return new SevenZipArchiveContext(archive, stream, null, callback);
|
||||
try {
|
||||
IInArchive archive = SevenZip.openInArchive(null, stream, callback);
|
||||
return new SevenZipArchiveContext(archive, stream, null, callback);
|
||||
} catch (Exception error) {
|
||||
try {
|
||||
stream.close();
|
||||
} catch (Throwable ignored) {
|
||||
}
|
||||
try {
|
||||
raf.close();
|
||||
} catch (Throwable ignored) {
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
private static boolean isWrongPassword(ZipException error, boolean encrypted) {
|
||||
@ -396,7 +557,7 @@ public final class JBindExtractorMain {
|
||||
}
|
||||
if (siblingName.startsWith(prefix) && siblingName.length() >= prefix.length() + 2) {
|
||||
String suffix = siblingName.substring(prefix.length());
|
||||
if (suffix.matches("\\d{2,3}")) {
|
||||
if (DIGIT_SUFFIX_RE.matcher(suffix).matches()) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
@ -480,6 +641,12 @@ public final class JBindExtractorMain {
|
||||
}
|
||||
if (normalized.matches("^[a-zA-Z]:.*")) {
|
||||
normalized = normalized.substring(2);
|
||||
while (normalized.startsWith("/")) {
|
||||
normalized = normalized.substring(1);
|
||||
}
|
||||
while (normalized.startsWith("\\")) {
|
||||
normalized = normalized.substring(1);
|
||||
}
|
||||
}
|
||||
File targetCanonical = targetDir.getCanonicalFile();
|
||||
File output = new File(targetCanonical, normalized);
|
||||
@ -488,7 +655,8 @@ public final class JBindExtractorMain {
|
||||
String outputPath = outputCanonical.getPath();
|
||||
String targetPathNorm = isWindows() ? targetPath.toLowerCase(Locale.ROOT) : targetPath;
|
||||
String outputPathNorm = isWindows() ? outputPath.toLowerCase(Locale.ROOT) : outputPath;
|
||||
if (!outputPathNorm.equals(targetPathNorm) && !outputPathNorm.startsWith(targetPathNorm + File.separator)) {
|
||||
String targetPrefix = targetPathNorm.endsWith(File.separator) ? targetPathNorm : targetPathNorm + File.separator;
|
||||
if (!outputPathNorm.equals(targetPathNorm) && !outputPathNorm.startsWith(targetPrefix)) {
|
||||
throw new IOException("Path Traversal blockiert: " + entryName);
|
||||
}
|
||||
return outputCanonical;
|
||||
@ -506,20 +674,50 @@ public final class JBindExtractorMain {
|
||||
if (entry.length() == 0) {
|
||||
return fallback;
|
||||
}
|
||||
// Sanitize Windows special characters from each path segment
|
||||
String[] segments = entry.split("/", -1);
|
||||
StringBuilder sanitized = new StringBuilder();
|
||||
for (int i = 0; i < segments.length; i++) {
|
||||
if (i > 0) {
|
||||
sanitized.append('/');
|
||||
}
|
||||
sanitized.append(WINDOWS_SPECIAL_CHARS_RE.matcher(segments[i]).replaceAll("_"));
|
||||
}
|
||||
entry = sanitized.toString();
|
||||
if (entry.length() == 0) {
|
||||
return fallback;
|
||||
}
|
||||
return entry;
|
||||
}
|
||||
|
||||
private static long safeSize(Long value) {
|
||||
if (value == null) {
|
||||
return 1;
|
||||
return 0;
|
||||
}
|
||||
long size = value.longValue();
|
||||
if (size <= 0) {
|
||||
return 1;
|
||||
return 0;
|
||||
}
|
||||
return size;
|
||||
}

+   private static void rejectSymlink(File file) throws IOException {
+       if (file == null) {
+           return;
+       }
+       if (Files.isSymbolicLink(file.toPath())) {
+           throw new IOException("Zieldatei ist ein Symlink, Schreiben verweigert: " + file.getAbsolutePath());
+       }
+       // Also check parent directories for symlinks
+       File parent = file.getParentFile();
+       while (parent != null) {
+           if (Files.isSymbolicLink(parent.toPath())) {
+               throw new IOException("Elternverzeichnis ist ein Symlink, Schreiben verweigert: " + parent.getAbsolutePath());
+           }
+           parent = parent.getParentFile();
+       }
+   }
+
    private static void ensureDirectory(File dir) throws IOException {
        if (dir == null) {
            return;

@@ -681,6 +879,176 @@ public final class JBindExtractorMain {
        private final List<String> passwords = new ArrayList<String>();
    }

+   /**
+    * Bulk extraction callback that implements both IArchiveExtractCallback and
+    * ICryptoGetTextPassword. Using the bulk IInArchive.extract() API instead of
+    * per-item extractSlow() is critical for performance — solid RAR archives
+    * otherwise re-decode from the beginning for every single item.
+    */
+   private static final class BulkExtractCallback implements IArchiveExtractCallback, ICryptoGetTextPassword {
+       private final IInArchive archive;
+       private final Map<Integer, Integer> indexToPos;
+       private final List<Integer> fileIndices;
+       private final List<File> outputFiles;
+       private final List<Long> fileSizes;
+       private final ProgressTracker progress;
+       private final boolean encrypted;
+       private final String password;
+       private final File[] currentOutput;
+       private final FileOutputStream[] currentStream;
+       private final boolean[] currentSuccess;
+       private final long[] currentRemaining;
+       private final int[] currentPos;
+       private final Throwable[] firstError;
+
+       BulkExtractCallback(IInArchive archive, Map<Integer, Integer> indexToPos,
+               List<Integer> fileIndices, List<File> outputFiles, List<Long> fileSizes,
+               ProgressTracker progress, boolean encrypted, String password,
+               File[] currentOutput, FileOutputStream[] currentStream,
+               boolean[] currentSuccess, long[] currentRemaining, int[] currentPos,
+               Throwable[] firstError) {
+           this.archive = archive;
+           this.indexToPos = indexToPos;
+           this.fileIndices = fileIndices;
+           this.outputFiles = outputFiles;
+           this.fileSizes = fileSizes;
+           this.progress = progress;
+           this.encrypted = encrypted;
+           this.password = password;
+           this.currentOutput = currentOutput;
+           this.currentStream = currentStream;
+           this.currentSuccess = currentSuccess;
+           this.currentRemaining = currentRemaining;
+           this.currentPos = currentPos;
+           this.firstError = firstError;
+       }
+
+       @Override
+       public String cryptoGetTextPassword() {
+           return password;
+       }
+
+       @Override
+       public void setTotal(long total) {
+           // 7z reports total compressed bytes; we track uncompressed via ProgressTracker
+       }
+
+       @Override
+       public void setCompleted(long complete) {
+           // Not used — we track per-write progress
+       }
+
+       @Override
+       public ISequentialOutStream getStream(int index, ExtractAskMode extractAskMode) throws SevenZipException {
+           closeCurrentStream();
+
+           Integer pos = indexToPos.get(index);
+           if (pos == null) {
+               return null;
+           }
+           currentPos[0] = pos;
+           currentOutput[0] = outputFiles.get(pos);
+           currentSuccess[0] = false;
+           currentRemaining[0] = fileSizes.get(pos);
+
+           if (extractAskMode != ExtractAskMode.EXTRACT) {
+               currentOutput[0] = null;
+               return null;
+           }
+
+           if (currentOutput[0] == null) {
+               progress.advance(currentRemaining[0]);
+               return null;
+           }
+
+           try {
+               ensureDirectory(currentOutput[0].getParentFile());
+               rejectSymlink(currentOutput[0]);
+               currentStream[0] = new FileOutputStream(currentOutput[0]);
+           } catch (IOException error) {
+               throw new SevenZipException("Fehler beim Erstellen: " + error.getMessage(), error);
+           }
+
+           return new ISequentialOutStream() {
+               @Override
+               public int write(byte[] data) throws SevenZipException {
+                   if (data == null || data.length == 0) {
+                       return 0;
+                   }
+                   try {
+                       currentStream[0].write(data);
+                   } catch (IOException error) {
+                       throw new SevenZipException("Fehler beim Schreiben: " + error.getMessage(), error);
+                   }
+                   long accounted = Math.min(currentRemaining[0], (long) data.length);
+                   currentRemaining[0] -= accounted;
+                   progress.advance(accounted);
+                   return data.length;
+               }
+           };
+       }
+
+       @Override
+       public void prepareOperation(ExtractAskMode extractAskMode) {
+           // no-op
+       }
+
+       @Override
+       public void setOperationResult(ExtractOperationResult result) throws SevenZipException {
+           if (currentRemaining[0] > 0) {
+               progress.advance(currentRemaining[0]);
+               currentRemaining[0] = 0;
+           }
+
+           if (result == ExtractOperationResult.OK) {
+               currentSuccess[0] = true;
+               closeCurrentStream();
+               if (currentPos[0] >= 0 && currentOutput[0] != null) {
+                   try {
+                       int archiveIndex = fileIndices.get(currentPos[0]);
+                       java.util.Date modified = (java.util.Date) archive.getProperty(archiveIndex, PropID.LAST_MODIFICATION_TIME);
+                       if (modified != null) {
+                           currentOutput[0].setLastModified(modified.getTime());
+                       }
+                   } catch (Throwable ignored) {
+                       // best effort
+                   }
+               }
+           } else {
+               closeCurrentStream();
+               if (currentOutput[0] != null && currentOutput[0].exists()) {
+                   try {
+                       currentOutput[0].delete();
+                   } catch (Throwable ignored) {
+                   }
+               }
+               if (firstError[0] == null) {
+                   if (isPasswordFailure(result, encrypted)) {
+                       firstError[0] = new WrongPasswordException(new IOException("Falsches Passwort"));
+                   } else {
+                       firstError[0] = new IOException("7z-Fehler: " + result.name());
+                   }
+               }
+           }
+       }
+
+       private void closeCurrentStream() {
+           if (currentStream[0] != null) {
+               try {
+                   currentStream[0].close();
+               } catch (Throwable ignored) {
+               }
+               currentStream[0] = null;
+           }
+           if (!currentSuccess[0] && currentOutput[0] != null && currentOutput[0].exists()) {
+               try {
+                   currentOutput[0].delete();
+               } catch (Throwable ignored) {
+               }
+           }
+       }
+   }

    private static final class WrongPasswordException extends Exception {
        private static final long serialVersionUID = 1L;

@@ -828,12 +1196,11 @@ public final class JBindExtractorMain {
        if (filename == null || filename.trim().length() == 0) {
            return null;
        }
-       File direct = new File(filename);
-       if (direct.isAbsolute() && direct.exists()) {
-           return direct;
-       }
+       // Always resolve relative to the archive's parent directory.
+       // Never accept absolute paths to prevent path traversal.
+       String baseName = new File(filename).getName();
        if (archiveDir != null) {
-           File relative = new File(archiveDir, filename);
+           File relative = new File(archiveDir, baseName);
            if (relative.exists()) {
                return relative;
            }
@@ -843,13 +1210,13 @@ public final class JBindExtractorMain {
                if (!sibling.isFile()) {
                    continue;
                }
-               if (sibling.getName().equalsIgnoreCase(filename)) {
+               if (sibling.getName().equalsIgnoreCase(baseName)) {
                    return sibling;
                }
            }
        }
-       return direct.exists() ? direct : null;
+       return null;
    }

    @Override

@@ -2,8 +2,17 @@ const path = require("path");
const { rcedit } = require("rcedit");

module.exports = async function afterPack(context) {
-  const exePath = path.join(context.appOutDir, `${context.packager.appInfo.productFilename}.exe`);
+  const productFilename = context.packager?.appInfo?.productFilename;
+  if (!productFilename) {
+    console.warn("  • rcedit: skipped — productFilename not available");
+    return;
+  }
+  const exePath = path.join(context.appOutDir, `${productFilename}.exe`);
  const iconPath = path.resolve(__dirname, "..", "assets", "app_icon.ico");
  console.log(`  • rcedit: patching icon → ${exePath}`);
-  await rcedit(exePath, { icon: iconPath });
+  try {
+    await rcedit(exePath, { icon: iconPath });
+  } catch (error) {
+    console.warn(`  • rcedit: failed — ${String(error)}`);
+  }
};
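The optional-chaining guard added to the hook is the key change: the old code would throw a `TypeError` deep inside `path.join` if the packager context was incomplete. A minimal sketch of that guard, with a hypothetical context shape:

```javascript
// Hypothetical sketch of the guard: return null instead of throwing when
// any link in context.packager.appInfo.productFilename is missing.
function resolveExeName(context) {
  const productFilename = context?.packager?.appInfo?.productFilename;
  return productFilename ? `${productFilename}.exe` : null;
}
```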

@@ -31,18 +31,21 @@ async function main(): Promise<void> {
    login: settings.megaLogin,
    password: settings.megaPassword
  }));
-  const service = new DebridService(settings, {
-    megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
-  });
-  for (const link of links) {
-    try {
-      const result = await service.unrestrictLink(link);
-      console.log(`[OK] ${result.providerLabel} -> ${result.fileName}`);
-    } catch (error) {
-      console.log(`[FAIL] ${String(error)}`);
-    }
-  }
-  megaWeb.dispose();
+  try {
+    const service = new DebridService(settings, {
+      megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
+    });
+    for (const link of links) {
+      try {
+        const result = await service.unrestrictLink(link);
+        console.log(`[OK] ${result.providerLabel} -> ${result.fileName}`);
+      } catch (error) {
+        console.log(`[FAIL] ${String(error)}`);
+      }
+    }
+  } finally {
+    megaWeb.dispose();
+  }
}

-void main();
+main().catch(e => { console.error(e); process.exit(1); });

@@ -16,8 +16,8 @@ function sleep(ms) {
}

function cookieFrom(headers) {
-  const raw = headers.get("set-cookie") || "";
-  return raw.split(",").map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
+  const cookies = headers.getSetCookie();
+  return cookies.map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
}
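The switch to `headers.getSetCookie()` matters because a single `Set-Cookie` value may itself contain a comma (typically in an `Expires` date), so splitting the comma-joined header corrupts cookies. With one string per header, taking the `name=value` part before the first `;` is safe. A sketch of the join step operating on such an array:

```javascript
// Given one string per Set-Cookie header (as getSetCookie() returns),
// keep only the name=value pair of each and join into a Cookie header.
function sessionCookie(setCookieValues) {
  return setCookieValues
    .map((value) => value.split(";")[0].trim())
    .filter(Boolean)
    .join("; ");
}
```

The old comma-split would have broken the first value here: `"sid=abc; Expires=Wed, 21 Oct 2025 07:28:00 GMT".split(",")` yields two fragments, turning the date tail into a bogus cookie.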

function parseDebridCodes(html) {

@@ -47,6 +47,9 @@ async function resolveCode(cookie, code) {
  });
  const text = (await res.text()).trim();
  if (text === "reload") {
+    if (attempt % 5 === 0) {
+      console.log(`  [retry] code=${code} attempt=${attempt}/50 (waiting for server)`);
+    }
    await sleep(800);
    continue;
  }
@@ -98,7 +101,13 @@ async function main() {
    redirect: "manual"
  });

+  if (loginRes.status >= 400) {
+    throw new Error(`Login failed with HTTP ${loginRes.status}`);
+  }
  const cookie = cookieFrom(loginRes.headers);
+  if (!cookie) {
+    throw new Error("Login returned no session cookie");
+  }
  console.log("login", loginRes.status, loginRes.headers.get("location") || "");

  const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
@@ -136,4 +145,4 @@ async function main() {
  }
}

-await main();
+await main().catch((e) => { console.error(e); process.exit(1); });

@@ -66,6 +66,8 @@ async function callRealDebrid(link) {
  };
}

+// megaCookie is intentionally cached at module scope so that multiple
+// callMegaDebrid() invocations reuse the same session cookie.
async function callMegaDebrid(link) {
  if (!megaCookie) {
    const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
@@ -77,13 +79,15 @@ async function callMegaDebrid(link) {
      body: new URLSearchParams({ login: megaLogin, password: megaPassword, remember: "on" }),
      redirect: "manual"
    });
-    megaCookie = (loginRes.headers.get("set-cookie") || "")
-      .split(",")
+    if (loginRes.status >= 400) {
+      return { ok: false, error: `Mega-Web login failed with HTTP ${loginRes.status}` };
+    }
+    megaCookie = loginRes.headers.getSetCookie()
      .map((chunk) => chunk.split(";")[0].trim())
      .filter(Boolean)
      .join("; ");
    if (!megaCookie) {
-      return { ok: false, error: "Mega-Web login failed" };
+      return { ok: false, error: "Mega-Web login returned no session cookie" };
    }
  }

@@ -290,4 +294,4 @@ async function main() {
  }
}

-await main();
+await main().catch((e) => { console.error(e); process.exit(1); });

@@ -37,7 +37,8 @@ function runWithInput(command, args, input) {
    cwd: process.cwd(),
    encoding: "utf8",
    input,
-   stdio: ["pipe", "pipe", "pipe"]
+   stdio: ["pipe", "pipe", "pipe"],
+   timeout: 10000
  });
  if (result.status !== 0) {
    const stderr = String(result.stderr || "").trim();
@@ -59,37 +60,74 @@ function parseArgs(argv) {
  return { help: false, dryRun, version, notes };
}

-function parseCodebergRemote(url) {
+function parseRemoteUrl(url) {
  const raw = String(url || "").trim();
-  const httpsMatch = raw.match(/^https?:\/\/(?:www\.)?codeberg\.org\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
+  const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
  if (httpsMatch) {
-    return { owner: httpsMatch[1], repo: httpsMatch[2] };
+    return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
  }
-  const sshMatch = raw.match(/^git@codeberg\.org:([^/]+)\/([^/]+?)(?:\.git)?$/i);
+  const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
  if (sshMatch) {
-    return { owner: sshMatch[1], repo: sshMatch[2] };
+    return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
  }
-  throw new Error(`Cannot parse Codeberg remote URL: ${raw}`);
+  const sshAltMatch = raw.match(/^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
+  if (sshAltMatch) {
+    return { host: sshAltMatch[1], owner: sshAltMatch[2], repo: sshAltMatch[3] };
+  }
+  throw new Error(`Cannot parse remote URL: ${raw}`);
}
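The generalized parser now extracts a `host` capture in addition to `owner`/`repo`, covering the three common git remote forms. A compact sketch using the same three regexes as the new code, folded into one loop:

```javascript
// Host-agnostic remote parsing: https, scp-like ssh, and ssh:// forms
// (example hosts below are placeholders, not real remotes).
function parseRemote(raw) {
  const patterns = [
    /^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i,
    /^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i,
    /^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i
  ];
  for (const re of patterns) {
    const m = String(raw || "").trim().match(re);
    if (m) return { host: m[1], owner: m[2], repo: m[3] };
  }
  throw new Error(`Cannot parse remote URL: ${raw}`);
}
```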

-function getCodebergRepo() {
-  const remotes = ["codeberg", "origin"];
+function normalizeBaseUrl(url) {
+  const raw = String(url || "").trim().replace(/\/+$/, "");
+  if (!raw) {
+    return "";
+  }
+  if (!/^https?:\/\//i.test(raw)) {
+    throw new Error("GITEA_BASE_URL must start with http:// or https://");
+  }
+  return raw;
+}
+
+function getGiteaRepo() {
+  const forcedRemote = String(process.env.GITEA_REMOTE || process.env.FORGEJO_REMOTE || "").trim();
+  const remotes = forcedRemote
+    ? [forcedRemote]
+    : ["gitea", "forgejo", "origin", "github-new", "codeberg"];
+
+  const preferredBase = normalizeBaseUrl(process.env.GITEA_BASE_URL || process.env.FORGEJO_BASE_URL || "https://git.24-music.de");
+
+  const preferredProtocol = preferredBase ? new URL(preferredBase).protocol : "https:";
+
  for (const remote of remotes) {
    try {
      const remoteUrl = runCapture("git", ["remote", "get-url", remote]);
-     if (/codeberg\.org/i.test(remoteUrl)) {
-       const parsed = parseCodebergRemote(remoteUrl);
-       return { remote, ...parsed };
-     }
+     const parsed = parseRemoteUrl(remoteUrl);
+     const remoteBase = `https://${parsed.host}`.toLowerCase();
+     if (preferredBase && remoteBase !== preferredBase.toLowerCase().replace(/^http:/, "https:")) {
+       continue;
+     }
+     return { remote, ...parsed, baseUrl: `${preferredProtocol}//${parsed.host}` };
    } catch {
      // try next remote
    }
  }
-  throw new Error("No Codeberg remote found. Add one with: git remote add codeberg https://codeberg.org/<owner>/<repo>.git");
+
+  if (preferredBase) {
+    throw new Error(
+      `No remote found for ${preferredBase}. Add one with: git remote add gitea ${preferredBase}/<owner>/<repo>.git`
+    );
+  }
+
+  throw new Error("No suitable remote found. Set GITEA_REMOTE or GITEA_BASE_URL.");
}

-function getCodebergAuthHeader() {
-  const credentialText = runWithInput("git", ["credential", "fill"], "protocol=https\nhost=codeberg.org\n\n");
+function getAuthHeader(host) {
+  const explicitToken = String(process.env.GITEA_TOKEN || process.env.FORGEJO_TOKEN || "").trim();
+  if (explicitToken) {
+    return `token ${explicitToken}`;
+  }
+
+  const credentialText = runWithInput("git", ["credential", "fill"], `protocol=https\nhost=${host}\n\n`);
  const map = new Map();
  for (const line of credentialText.split(/\r?\n/)) {
    if (!line.includes("=")) {
@@ -101,7 +139,9 @@ function getCodebergAuthHeader() {
  const username = map.get("username") || "";
  const password = map.get("password") || "";
  if (!username || !password) {
-   throw new Error("Missing Codeberg credentials in git credential helper");
+   throw new Error(
+     `Missing credentials for ${host}. Set GITEA_TOKEN or store credentials for this host in git credential helper.`
+   );
  }
  const token = Buffer.from(`${username}:${password}`, "utf8").toString("base64");
  return `Basic ${token}`;
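The credential-helper fallback builds an HTTP Basic header exactly as shown above: base64 of `username:password`. A self-contained sketch of just that step:

```javascript
// Build an HTTP Basic Authorization header from git-credential output.
function basicAuthHeader(username, password) {
  const token = Buffer.from(`${username}:${password}`, "utf8").toString("base64");
  return `Basic ${token}`;
}
```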

@@ -142,7 +182,8 @@ function updatePackageVersion(rootDir, version) {
  const packagePath = path.join(rootDir, "package.json");
  const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
  if (String(packageJson.version || "") === version) {
-   throw new Error(`package.json is already at version ${version}`);
+   process.stdout.write(`package.json is already at version ${version}, skipping update.\n`);
+   return;
  }
  packageJson.version = version;
  fs.writeFileSync(packagePath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
@@ -197,8 +238,7 @@ function ensureTagMissing(tag) {
  }
}

-async function createOrGetRelease(owner, repo, tag, authHeader, notes) {
-  const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
+async function createOrGetRelease(baseApi, tag, authHeader, notes) {
  const byTag = await apiRequest("GET", `${baseApi}/releases/tags/${encodeURIComponent(tag)}`, authHeader);
  if (byTag.ok) {
    return byTag.body;
@@ -218,13 +258,34 @@ async function createOrGetRelease(owner, repo, tag, authHeader, notes) {
  return created.body;
}

-async function uploadReleaseAssets(owner, repo, releaseId, authHeader, releaseDir, files) {
-  const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
+async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, files) {
  for (const fileName of files) {
    const filePath = path.join(releaseDir, fileName);
-   const fileData = fs.readFileSync(filePath);
+   const fileSize = fs.statSync(filePath).size;
    const uploadUrl = `${baseApi}/releases/${releaseId}/assets?name=${encodeURIComponent(fileName)}`;
-   const response = await apiRequest("POST", uploadUrl, authHeader, fileData, "application/octet-stream");
+
+   // Stream large files instead of loading them entirely into memory
+   const fileStream = fs.createReadStream(filePath);
+   const response = await fetch(uploadUrl, {
+     method: "POST",
+     headers: {
+       Accept: "application/json",
+       Authorization: authHeader,
+       "Content-Type": "application/octet-stream",
+       "Content-Length": String(fileSize)
+     },
+     body: fileStream,
+     duplex: "half"
+   });
+
+   const text = await response.text();
+   let parsed;
+   try {
+     parsed = text ? JSON.parse(text) : null;
+   } catch {
+     parsed = text;
+   }
+
    if (response.ok) {
      process.stdout.write(`Uploaded: ${fileName}\n`);
      continue;
@@ -233,7 +294,7 @@ async function uploadReleaseAssets(owner, repo, releaseId, authHeader, releaseDi
      process.stdout.write(`Skipped existing asset: ${fileName}\n`);
      continue;
    }
-   throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(response.body)}`);
+   throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(parsed)}`);
  }
}
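The upload path now parses the response body tolerantly: the API normally returns JSON, but a proxy or HTML error page may not, and the error message should survive either way. A sketch of that small helper in isolation:

```javascript
// Parse a response body that is usually JSON but may be plain text:
// valid JSON is decoded, empty bodies become null, anything else is
// returned verbatim so error messages are never lost.
function parseMaybeJson(text) {
  try {
    return text ? JSON.parse(text) : null;
  } catch {
    return text;
  }
}
```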

@@ -241,46 +302,44 @@ async function main() {
  const rootDir = process.cwd();
  const args = parseArgs(process.argv);
  if (args.help) {
-   process.stdout.write("Usage: npm run release:codeberg -- <version> [release notes] [--dry-run]\n");
-   process.stdout.write("Example: npm run release:codeberg -- 1.4.42 \"- Small fixes\"\n");
+   process.stdout.write("Usage: npm run release:gitea -- <version> [release notes] [--dry-run]\n");
+   process.stdout.write("Env: GITEA_BASE_URL, GITEA_REMOTE, GITEA_TOKEN\n");
+   process.stdout.write("Compatibility envs still supported: FORGEJO_BASE_URL, FORGEJO_REMOTE, FORGEJO_TOKEN\n");
+   process.stdout.write("Example: npm run release:gitea -- 1.6.31 \"- Bugfixes\"\n");
    return;
  }

  const version = ensureVersionString(args.version);
  const tag = `v${version}`;
  const releaseNotes = args.notes || `- Release ${tag}`;
-  const { remote, owner, repo } = getCodebergRepo();
+  const repo = getGiteaRepo();

  ensureNoTrackedChanges();
  ensureTagMissing(tag);

-  if (args.dryRun) {
-   process.stdout.write(`Dry run: would release ${tag}. No changes made.\n`);
-   return;
-  }
-
  updatePackageVersion(rootDir, version);

  process.stdout.write(`Building release artifacts for ${tag}...\n`);
  run(NPM_EXECUTABLE, ["run", "release:win"]);
  const assets = ensureAssetsExist(rootDir, version);

+  if (args.dryRun) {
+   process.stdout.write(`Dry run complete. Assets exist for ${tag}.\n`);
+   return;
+  }
+
  run("git", ["add", "package.json"]);
  run("git", ["commit", "-m", `Release ${tag}`]);
-  run("git", ["push", remote, "main"]);
+  run("git", ["push", repo.remote, "main"]);
  run("git", ["tag", tag]);
-  run("git", ["push", remote, tag]);
+  run("git", ["push", repo.remote, tag]);

-  const authHeader = getCodebergAuthHeader();
-  const baseRepoApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
-  const patchReleaseEnabled = await apiRequest("PATCH", baseRepoApi, authHeader, JSON.stringify({ has_releases: true }));
-  if (!patchReleaseEnabled.ok) {
-   throw new Error(`Failed to enable releases (${patchReleaseEnabled.status}): ${JSON.stringify(patchReleaseEnabled.body)}`);
-  }
+  const authHeader = getAuthHeader(repo.host);
+  const baseApi = `${repo.baseUrl}/api/v1/repos/${repo.owner}/${repo.repo}`;
+  const release = await createOrGetRelease(baseApi, tag, authHeader, releaseNotes);
+  await uploadReleaseAssets(baseApi, release.id, authHeader, assets.releaseDir, assets.files);

-  const release = await createOrGetRelease(owner, repo, tag, authHeader, releaseNotes);
-  await uploadReleaseAssets(owner, repo, release.id, authHeader, assets.releaseDir, assets.files);
-
-  process.stdout.write(`Release published: ${release.html_url}\n`);
+  process.stdout.write(`Release published: ${release.html_url || `${repo.baseUrl}/${repo.owner}/${repo.repo}/releases/tag/${tag}`}\n`);
}

main().catch((error) => {

@@ -1,24 +0,0 @@
-import fs from "node:fs";
-import path from "node:path";
-
-const version = process.argv[2];
-if (!version) {
-  console.error("Usage: node scripts/set_version_node.mjs <version>");
-  process.exit(1);
-}
-
-const root = process.cwd();
-
-const packageJsonPath = path.join(root, "package.json");
-const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, "utf8"));
-packageJson.version = version;
-fs.writeFileSync(packageJsonPath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
-
-const constantsPath = path.join(root, "src", "main", "constants.ts");
-const constants = fs.readFileSync(constantsPath, "utf8").replace(
-  /APP_VERSION = "[^"]+"/,
-  `APP_VERSION = "${version}"`
-);
-fs.writeFileSync(constantsPath, constants, "utf8");
-
-console.log(`Set version to ${version}`);

@@ -5,6 +5,7 @@ import {
  AppSettings,
  DuplicatePolicy,
  HistoryEntry,
+  PackagePriority,
  ParsedPackageInput,
  SessionStats,
  StartConflictEntry,
@@ -19,8 +20,9 @@ import { APP_VERSION } from "./constants";
import { DownloadManager } from "./download-manager";
import { parseCollectorInput } from "./link-parser";
import { configureLogger, getLogFilePath, logger } from "./logger";
+import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "./session-log";
import { MegaWebFallback } from "./mega-web-fallback";
-import { addHistoryEntry, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
+import { addHistoryEntry, cancelPendingAsyncSaves, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeLoadedSession, normalizeLoadedSessionTransientFields, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
import { abortActiveUpdateDownload, checkGitHubUpdate, installLatestUpdate } from "./update";
import { startDebugServer, stopDebugServer } from "./debug-server";

@@ -52,6 +54,7 @@ export class AppController {

  public constructor() {
    configureLogger(this.storagePaths.baseDir);
+   initSessionLog(this.storagePaths.baseDir);
    this.settings = loadSettings(this.storagePaths);
    const session = loadSession(this.storagePaths);
    this.megaWebFallback = new MegaWebFallback(() => ({
@@ -79,8 +82,15 @@ export class AppController {
    void this.manager.getStartConflicts().then((conflicts) => {
      const hasConflicts = conflicts.length > 0;
      if (this.hasAnyProviderToken(this.settings) && !hasConflicts) {
-       this.autoResumePending = true;
-       logger.info("Auto-Resume beim Start vorgemerkt");
+       // If the onState handler is already set (renderer connected), start immediately.
+       // Otherwise mark as pending so the onState setter triggers the start.
+       if (this.onStateHandler) {
+         logger.info("Auto-Resume beim Start aktiviert (nach Konflikt-Check)");
+         void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
+       } else {
+         this.autoResumePending = true;
+         logger.info("Auto-Resume beim Start vorgemerkt");
+       }
      } else if (hasConflicts) {
        logger.info("Auto-Resume übersprungen: Start-Konflikte erkannt");
      }
@@ -95,6 +105,7 @@ export class AppController {
      || (settings.megaLogin.trim() && settings.megaPassword.trim())
      || settings.bestToken.trim()
+     || settings.allDebridToken.trim()
      || (settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim())
    );
  }

@@ -110,6 +121,9 @@ export class AppController {
      this.autoResumePending = false;
      void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
      logger.info("Auto-Resume beim Start aktiviert");
+   } else {
+     // Trigger pending extractions without starting the session
+     this.manager.triggerIdleExtractions();
    }
  }
}
@@ -156,6 +170,12 @@ export class AppController {
  }

  public async installUpdate(onProgress?: (progress: UpdateInstallProgress) => void): Promise<UpdateInstallResult> {
+   // Stop active downloads before installing. Extractions may continue briefly
+   // until prepareForShutdown() is called during app quit.
+   if (this.manager.isSessionRunning()) {
+     this.manager.stop();
+   }
+
    const cacheAgeMs = Date.now() - this.lastUpdateCheckAt;
    const cached = this.lastUpdateCheck && !this.lastUpdateCheck.error && cacheAgeMs <= 10 * 60 * 1000
      ? this.lastUpdateCheck
@@ -208,6 +228,10 @@ export class AppController {
    await this.manager.startPackages(packageIds);
  }

+  public async startItems(itemIds: string[]): Promise<void> {
+   await this.manager.startItems(itemIds);
+  }
+
  public stop(): void {
    this.manager.stop();
  }
@@ -224,6 +248,10 @@ export class AppController {
    this.manager.extractNow(packageId);
  }

+  public resetPackage(packageId: string): void {
+   this.manager.resetPackage(packageId);
+  }
+
  public cancelPackage(packageId: string): void {
    this.manager.cancelPackage(packageId);
  }
@@ -257,7 +285,14 @@ export class AppController {
  }

  public exportBackup(): string {
-   const settings = this.settings;
+   const settings = { ...this.settings };
+   const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword"];
+   for (const key of SENSITIVE_KEYS) {
+     const val = settings[key];
+     if (typeof val === "string" && val.length > 0) {
+       (settings as Record<string, unknown>)[key] = `***${val.slice(-4)}`;
+     }
+   }
    const session = this.manager.getSession();
    return JSON.stringify({ version: 1, settings, session }, null, 2);
  }
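Backup export now masks credentials instead of writing them in plain text: a masked value keeps only the last four characters, and the import side recognizes the `***` prefix and restores the current in-memory secret. A sketch of the two halves of that convention:

```javascript
// Mask a secret for export: keep only the last 4 characters.
function maskSecret(value) {
  return value.length > 0 ? `***${value.slice(-4)}` : value;
}

// Import-side check: a masked value must be replaced with the live secret,
// never written back to settings as-is.
function isMasked(value) {
  return typeof value === "string" && value.startsWith("***");
}
```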
|
||||
@ -272,20 +307,49 @@ export class AppController {
|
||||
if (!parsed || typeof parsed !== "object" || !parsed.settings || !parsed.session) {
|
||||
return { restored: false, message: "Kein gültiges Backup (settings/session fehlen)" };
|
||||
}
|
||||
const restoredSettings = normalizeSettings(parsed.settings as AppSettings);
|
||||
    const importedSettings = parsed.settings as AppSettings;
    const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword"];
    for (const key of SENSITIVE_KEYS) {
      const val = (importedSettings as Record<string, unknown>)[key];
      if (typeof val === "string" && val.startsWith("***")) {
        (importedSettings as Record<string, unknown>)[key] = (this.settings as Record<string, unknown>)[key];
      }
    }
    const restoredSettings = normalizeSettings(importedSettings);
    this.settings = restoredSettings;
    saveSettings(this.storagePaths, this.settings);
    this.manager.setSettings(this.settings);
    const restoredSession = parsed.session as ReturnType<typeof loadSession>;
    // Full stop including extraction abort — the old session is being replaced,
    // so no extraction tasks from it should keep running.
    this.manager.stop();
    this.manager.abortAllPostProcessing();
    // Cancel any deferred persist timer and queued async writes so the old
    // in-memory session does not overwrite the restored session file on disk.
    this.manager.clearPersistTimer();
    cancelPendingAsyncSaves();
    const restoredSession = normalizeLoadedSessionTransientFields(
      normalizeLoadedSession(parsed.session)
    );
    saveSession(this.storagePaths, restoredSession);
    // Prevent prepareForShutdown from overwriting the restored session file
    // with the old in-memory session when the app quits after backup restore.
    this.manager.skipShutdownPersist = true;
    // Block all persistence (including persistSoon from any IPC operations
    // the user might trigger before restarting) to protect the restored backup.
    this.manager.blockAllPersistence = true;
    return { restored: true, message: "Backup wiederhergestellt. Bitte App neustarten." };
  }

  public getSessionLogPath(): string | null {
    return getSessionLogPath();
  }

  public shutdown(): void {
    stopDebugServer();
    abortActiveUpdateDownload();
    this.manager.prepareForShutdown();
    this.megaWebFallback.dispose();
    shutdownSessionLog();
    logger.info("App beendet");
  }

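The sensitive-key handling in the restore path above (any value that still starts with the `***` mask falls back to the current in-memory secret) can be sketched in isolation. The `Settings` type, key list, and function name below are simplified stand-ins, not the project's real types:

```typescript
// Sketch of the masked-secret restore rule used above (hypothetical simplified types).
type Settings = Record<string, string>;

const SENSITIVE: string[] = ["token", "megaPassword"];

// On export, secrets are masked with "***"; on import, masked values must not
// clobber the real secrets still held in memory.
function mergeImported(current: Settings, imported: Settings): Settings {
  const merged: Settings = { ...imported };
  for (const key of SENSITIVE) {
    const val = merged[key];
    if (typeof val === "string" && val.startsWith("***")) {
      merged[key] = current[key]; // keep the live secret
    }
  }
  return merged;
}
```

A backup that carries a genuinely new secret (not masked) still wins, which is why only `***`-prefixed values are replaced.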
@@ -297,6 +361,18 @@ export class AppController {
    clearHistory(this.storagePaths);
  }

  public setPackagePriority(packageId: string, priority: PackagePriority): void {
    this.manager.setPackagePriority(packageId, priority);
  }

  public skipItems(itemIds: string[]): void {
    this.manager.skipItems(itemIds);
  }

  public resetItems(itemIds: string[]): void {
    this.manager.resetItems(itemIds);
  }

  public removeHistoryEntry(entryId: string): void {
    removeHistoryEntry(this.storagePaths, entryId);
  }

@@ -16,6 +16,12 @@ export const DLC_AES_IV = Buffer.from("9bc24cb995cb8db3", "utf8");
export const REQUEST_RETRIES = 3;
export const CHUNK_SIZE = 512 * 1024;

export const WRITE_BUFFER_SIZE = 512 * 1024; // 512 KB write buffer (JDownloader: 500 KB)
export const WRITE_FLUSH_TIMEOUT_MS = 2000; // 2s flush timeout
export const ALLOCATION_UNIT_SIZE = 4096; // 4 KB NTFS alignment
export const STREAM_HIGH_WATER_MARK = 512 * 1024; // 512 KB stream buffer — lower than before (2 MB) so backpressure triggers sooner when disk is slow
export const DISK_BUSY_THRESHOLD_MS = 300; // Show "Warte auf Festplatte" if writableLength > 0 for this long

export const SAMPLE_DIR_NAMES = new Set(["sample", "samples"]);
export const SAMPLE_VIDEO_EXTENSIONS = new Set([".mkv", ".mp4", ".avi", ".mov", ".wmv", ".m4v", ".ts", ".m2ts", ".webm"]);
export const LINK_ARTIFACT_EXTENSIONS = new Set([".url", ".webloc", ".dlc", ".rsdf", ".ccf"]);
@@ -26,10 +32,10 @@ export const RAR_SPLIT_RE = /\.r\d{2,3}$/i;

export const MAX_MANIFEST_FILE_BYTES = 5 * 1024 * 1024;
export const MAX_LINK_ARTIFACT_BYTES = 256 * 1024;
export const SPEED_WINDOW_SECONDS = 3;
export const SPEED_WINDOW_SECONDS = 1;
export const CLIPBOARD_POLL_INTERVAL_MS = 2000;

export const DEFAULT_UPDATE_REPO = "Sucukdeluxe/real-debrid-downloader";
export const DEFAULT_UPDATE_REPO = "Administrator/real-debrid-downloader";

export function defaultSettings(): AppSettings {
  const baseDir = path.join(os.homedir(), "Downloads", "RealDebrid");
@@ -39,6 +45,8 @@ export function defaultSettings(): AppSettings {
    megaPassword: "",
    bestToken: "",
    allDebridToken: "",
    ddownloadLogin: "",
    ddownloadPassword: "",
    archivePasswordList: "",
    rememberToken: true,
    providerPrimary: "realdebrid",
@@ -78,6 +86,9 @@ export function defaultSettings(): AppSettings {
    autoSkipExtracted: false,
    confirmDeleteSelection: true,
    totalDownloadedAllTime: 0,
    bandwidthSchedules: []
    bandwidthSchedules: [],
    columnOrder: ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"],
    extractCpuPriority: "high",
    autoExtractWhenStopped: true
  };
}

@@ -164,7 +164,7 @@ async function decryptDlcLocal(filePath: string): Promise<ParsedPackageInput[]>
  const dlcData = content.slice(0, -88);

  const rcUrl = DLC_SERVICE_URL.replace("{KEY}", encodeURIComponent(dlcKey));
  const rcResponse = await fetch(rcUrl, { method: "GET" });
  const rcResponse = await fetch(rcUrl, { method: "GET", signal: AbortSignal.timeout(30000) });
  if (!rcResponse.ok) {
    return [];
  }
@@ -217,7 +217,8 @@ async function tryDcryptUpload(fileContent: Buffer, fileName: string): Promise<s

  const response = await fetch(DCRYPT_UPLOAD_URL, {
    method: "POST",
    body: form
    body: form,
    signal: AbortSignal.timeout(30000)
  });
  if (response.status === 413) {
    return null;
@@ -235,7 +236,8 @@ async function tryDcryptPaste(fileContent: Buffer): Promise<string[] | null> {

  const response = await fetch(DCRYPT_PASTE_URL, {
    method: "POST",
    body: form
    body: form,
    signal: AbortSignal.timeout(30000)
  });
  if (response.status === 413) {
    return null;

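These hunks give every external `fetch` an `AbortSignal.timeout(...)` so a hung server cannot stall the decrypter forever. The pattern can be exercised without any network; this sketch (not project code) shows a timed signal cancelling a pending operation the same way it would cancel a fetch:

```typescript
// Demonstrates the timeout-signal pattern used above, without network I/O.
function waitFor(ms: number, signal: AbortSignal): Promise<string> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => resolve("done"), ms);
    // A fetch() given this signal rejects the same way when the timeout fires.
    signal.addEventListener("abort", () => {
      clearTimeout(timer);
      reject(new Error("aborted by timeout signal"));
    }, { once: true });
  });
}

async function demo(): Promise<string> {
  try {
    // The operation would take 1000 ms, but the signal fires after 50 ms.
    return await waitFor(1000, AbortSignal.timeout(50));
  } catch {
    return "timed out";
  }
}
```

`AbortSignal.timeout()` requires Node 17.3+ (or a modern browser); Electron's bundled Node satisfies that.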
@@ -15,7 +15,8 @@ const PROVIDER_LABELS: Record<DebridProvider, string> = {
  realdebrid: "Real-Debrid",
  megadebrid: "Mega-Debrid",
  bestdebrid: "BestDebrid",
  alldebrid: "AllDebrid"
  alldebrid: "AllDebrid",
  ddownload: "DDownload"
};

interface ProviderUnrestrictedLink extends UnrestrictedLink {
@@ -226,7 +227,9 @@ function isRapidgatorLink(link: string): boolean {
    return hostname === "rapidgator.net"
      || hostname.endsWith(".rapidgator.net")
      || hostname === "rg.to"
      || hostname.endsWith(".rg.to");
      || hostname.endsWith(".rg.to")
      || hostname === "rapidgator.asia"
      || hostname.endsWith(".rapidgator.asia");
  } catch {
    return false;
  }
@@ -315,7 +318,7 @@ async function runWithConcurrency<T>(items: T[], concurrency: number, worker: (i
  let index = 0;
  let firstError: unknown = null;
  const next = (): T | undefined => {
    if (index >= items.length) {
    if (firstError || index >= items.length) {
      return undefined;
    }
    const item = items[index];
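The `runWithConcurrency` change above makes `next()` stop handing out work once any worker has recorded an error, so remaining items are never started after a failure. A minimal self-contained version of that fail-fast pool (simplified relative to the real helper) looks like:

```typescript
// Fail-fast worker pool: once a worker records an error, next() returns
// undefined and the remaining items are never started.
async function runPool<T>(items: T[], concurrency: number, worker: (item: T) => Promise<void>): Promise<void> {
  let index = 0;
  let firstError: unknown = null;
  const next = (): T | undefined => {
    if (firstError || index >= items.length) return undefined; // short-circuit on error
    return items[index++];
  };
  const run = async (): Promise<void> => {
    for (let item = next(); item !== undefined; item = next()) {
      try {
        await worker(item);
      } catch (error) {
        if (!firstError) firstError = error;
      }
    }
  };
  await Promise.all(Array.from({ length: Math.max(1, concurrency) }, run));
  if (firstError) throw firstError;
}
```

Without the `firstError` check, workers already in flight would keep pulling fresh items even after one of them failed, delaying the eventual rejection.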
@@ -415,6 +418,7 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
      signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
    });
    if (!response.ok) {
      try { await response.body?.cancel(); } catch { /* drain socket */ }
      if (shouldRetryStatus(response.status) && attempt < REQUEST_RETRIES + 2) {
        await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
        continue;
@@ -430,9 +434,11 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
      && !contentType.includes("text/plain")
      && !contentType.includes("text/xml")
      && !contentType.includes("application/xml")) {
      try { await response.body?.cancel(); } catch { /* drain socket */ }
      return "";
    }
    if (!contentType && Number.isFinite(contentLength) && contentLength > RAPIDGATOR_SCAN_MAX_BYTES) {
      try { await response.body?.cancel(); } catch { /* drain socket */ }
      return "";
    }

@@ -444,7 +450,7 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
    return "";
  } catch (error) {
    const errorText = compactErrorText(error);
    if (/aborted/i.test(errorText)) {
    if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
      throw error;
    }
    if (attempt >= REQUEST_RETRIES + 2 || !isRetryableErrorText(errorText)) {
@@ -460,6 +466,138 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
  return "";
}

export interface RapidgatorCheckResult {
  online: boolean;
  fileName: string;
  fileSize: string | null;
}

const RG_FILE_ID_RE = /\/file\/([a-z0-9]{32}|\d+)/i;
const RG_FILE_NOT_FOUND_RE = />\s*404\s*File not found/i;
const RG_FILESIZE_RE = /File\s*size:\s*<strong>([^<>"]+)<\/strong>/i;

export async function checkRapidgatorOnline(
  link: string,
  signal?: AbortSignal
): Promise<RapidgatorCheckResult | null> {
  if (!isRapidgatorLink(link)) {
    return null;
  }

  const fileIdMatch = link.match(RG_FILE_ID_RE);
  if (!fileIdMatch) {
    return null;
  }
  const fileId = fileIdMatch[1];
  const headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36",
    Accept: "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.9,de;q=0.8"
  };

  // Fast path: HEAD request (no body download, much faster)
  for (let attempt = 1; attempt <= REQUEST_RETRIES + 1; attempt += 1) {
    try {
      if (signal?.aborted) throw new Error("aborted:debrid");

      const response = await fetch(link, {
        method: "HEAD",
        redirect: "follow",
        headers,
        signal: withTimeoutSignal(signal, 15000)
      });

      if (response.status === 404) {
        return { online: false, fileName: "", fileSize: null };
      }

      if (response.ok) {
        const finalUrl = response.url || link;
        if (!finalUrl.includes(fileId)) {
          return { online: false, fileName: "", fileSize: null };
        }
        // HEAD 200 + URL still contains file ID → online
        const fileName = filenameFromRapidgatorUrlPath(link);
        return { online: true, fileName, fileSize: null };
      }

      // Non-OK, non-404: retry or give up
      if (shouldRetryStatus(response.status) && attempt <= REQUEST_RETRIES) {
        await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
        continue;
      }

      // HEAD inconclusive — fall through to GET
      break;
    } catch (error) {
      const errorText = compactErrorText(error);
      if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
      if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
        break; // fall through to GET
      }
      await sleepWithSignal(retryDelay(attempt), signal);
    }
  }

  // Slow path: GET request (downloads HTML, more thorough)
  for (let attempt = 1; attempt <= REQUEST_RETRIES + 1; attempt += 1) {
    try {
      if (signal?.aborted) throw new Error("aborted:debrid");

      const response = await fetch(link, {
        method: "GET",
        redirect: "follow",
        headers,
        signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
      });

      if (response.status === 404) {
        try { await response.body?.cancel(); } catch { /* drain socket */ }
        return { online: false, fileName: "", fileSize: null };
      }

      if (!response.ok) {
        try { await response.body?.cancel(); } catch { /* drain socket */ }
        if (shouldRetryStatus(response.status) && attempt <= REQUEST_RETRIES) {
          await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
          continue;
        }
        return null;
      }

      const finalUrl = response.url || link;
      if (!finalUrl.includes(fileId)) {
        try { await response.body?.cancel(); } catch { /* drain socket */ }
        return { online: false, fileName: "", fileSize: null };
      }

      const html = await readResponseTextLimited(response, RAPIDGATOR_SCAN_MAX_BYTES, signal);

      if (RG_FILE_NOT_FOUND_RE.test(html)) {
        return { online: false, fileName: "", fileSize: null };
      }

      const fileName = extractRapidgatorFilenameFromHtml(html) || filenameFromRapidgatorUrlPath(link);
      const sizeMatch = html.match(RG_FILESIZE_RE);
      const fileSize = sizeMatch ? sizeMatch[1].trim() : null;

      return { online: true, fileName, fileSize };
    } catch (error) {
      const errorText = compactErrorText(error);
      if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
      if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
        return null;
      }
    }

    if (attempt <= REQUEST_RETRIES) {
      await sleepWithSignal(retryDelay(attempt), signal);
    }
  }

  return null;
}

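The three regexes that drive `checkRapidgatorOnline` can be checked in isolation. The sample URL and HTML fragments below are invented for illustration, but the patterns are the ones defined above:

```typescript
// Same patterns as RG_FILE_ID_RE / RG_FILE_NOT_FOUND_RE / RG_FILESIZE_RE above,
// exercised against synthetic inputs.
const FILE_ID_RE = /\/file\/([a-z0-9]{32}|\d+)/i;
const NOT_FOUND_RE = />\s*404\s*File not found/i;
const FILESIZE_RE = /File\s*size:\s*<strong>([^<>"]+)<\/strong>/i;

// A 32-char hex file id from a hypothetical link
const id = "https://rapidgator.net/file/0123456789abcdef0123456789abcdef/x.rar".match(FILE_ID_RE)?.[1];

// Synthetic HTML fragments as they might appear in a file page
const deadHtml = "<h1> 404 File not found</h1>";
const size = 'File size: <strong>1.42 GB</strong>'.match(FILESIZE_RE)?.[1];
```

Numeric file ids (`/file/12345`) match via the `\d+` alternative, which is why the final-URL check compares against whichever form was captured.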
function buildBestDebridRequests(link: string, token: string): BestDebridRequest[] {
  const linkParam = encodeURIComponent(link);
  const safeToken = String(token || "").trim();
@@ -503,7 +641,7 @@ class MegaDebridClient {
          throw new Error("Mega-Web Antwort ohne Download-Link");
        }
        if (!lastError) {
          lastError = web ? "Mega-Web Antwort ohne Download-Link" : "Mega-Web Antwort leer";
          lastError = "Mega-Web Antwort leer";
        }
        // Don't retry permanent hoster errors (dead link, file removed, etc.)
        if (/permanent ungültig|hosternotavailable|file.?not.?found|file.?unavailable|link.?is.?dead/i.test(lastError)) {
@@ -513,7 +651,7 @@ class MegaDebridClient {
        await sleepWithSignal(retryDelay(attempt), signal);
      }
    }
    throw new Error(lastError || "Mega-Web Unrestrict fehlgeschlagen");
    throw new Error(String(lastError || "Mega-Web Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
  }
}

@@ -532,7 +670,11 @@ class BestDebridClient {
      try {
        return await this.tryRequest(request, link, signal);
      } catch (error) {
        lastError = compactErrorText(error);
        const errorText = compactErrorText(error);
        if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
          throw error;
        }
        lastError = errorText;
      }
    }

@@ -597,7 +739,7 @@ class BestDebridClient {
        throw new Error("BestDebrid Antwort ohne Download-Link");
      } catch (error) {
        lastError = compactErrorText(error);
        if (signal?.aborted || /aborted/i.test(lastError)) {
        if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
          break;
        }
        if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -691,7 +833,7 @@ class AllDebridClient {
        break;
      } catch (error) {
        const errorText = compactErrorText(error);
        if (signal?.aborted || /aborted/i.test(errorText)) {
        if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
          throw error;
        }
        if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
@@ -803,7 +945,7 @@ class AllDebridClient {
        };
      } catch (error) {
        lastError = compactErrorText(error);
        if (signal?.aborted || /aborted/i.test(lastError)) {
        if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
          break;
        }
        if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -813,7 +955,197 @@ class AllDebridClient {
      }
    }

    throw new Error(lastError || "AllDebrid Unrestrict fehlgeschlagen");
    throw new Error(String(lastError || "AllDebrid Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
  }
}

const DDOWNLOAD_URL_RE = /^https?:\/\/(?:www\.)?(?:ddownload\.com|ddl\.to)\/([a-z0-9]+)/i;
const DDOWNLOAD_WEB_BASE = "https://ddownload.com";
const DDOWNLOAD_WEB_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36";

class DdownloadClient {
  private login: string;
  private password: string;
  private cookies: string = "";

  public constructor(login: string, password: string) {
    this.login = login;
    this.password = password;
  }

  private async webLogin(signal?: AbortSignal): Promise<void> {
    // Step 1: GET login page to extract form token
    const loginPageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/login.html`, {
      headers: { "User-Agent": DDOWNLOAD_WEB_UA },
      redirect: "manual",
      signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
    });
    const loginPageHtml = await loginPageRes.text();
    const tokenMatch = loginPageHtml.match(/name="token" value="([^"]+)"/);
    const pageCookies = (loginPageRes.headers.getSetCookie?.() || []).map((c: string) => c.split(";")[0]).join("; ");

    // Step 2: POST login
    const body = new URLSearchParams({
      op: "login",
      token: tokenMatch?.[1] || "",
      rand: "",
      redirect: "",
      login: this.login,
      password: this.password
    });
    const loginRes = await fetch(`${DDOWNLOAD_WEB_BASE}/`, {
      method: "POST",
      headers: {
        "User-Agent": DDOWNLOAD_WEB_UA,
        "Content-Type": "application/x-www-form-urlencoded",
        ...(pageCookies ? { Cookie: pageCookies } : {})
      },
      body: body.toString(),
      redirect: "manual",
      signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
    });

    // Drain body
    try { await loginRes.text(); } catch { /* ignore */ }

    const setCookies = loginRes.headers.getSetCookie?.() || [];
    const xfss = setCookies.find((c: string) => c.startsWith("xfss="));
    const loginCookie = setCookies.find((c: string) => c.startsWith("login="));
    if (!xfss) {
      throw new Error("DDownload Login fehlgeschlagen (kein Session-Cookie)");
    }
    this.cookies = [loginCookie, xfss].filter(Boolean).map((c: string) => c.split(";")[0]).join("; ");
  }

  public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
    const match = link.match(DDOWNLOAD_URL_RE);
    if (!match) {
      throw new Error("Kein DDownload-Link");
    }
    const fileCode = match[1];
    let lastError = "";

    for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
      try {
        if (signal?.aborted) throw new Error("aborted:debrid");

        // Login if no session yet
        if (!this.cookies) {
          await this.webLogin(signal);
        }

        // Step 1: GET file page to extract form fields
        const filePageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
          headers: {
            "User-Agent": DDOWNLOAD_WEB_UA,
            Cookie: this.cookies
          },
          redirect: "manual",
          signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
        });

        // Premium with direct downloads enabled → redirect immediately
        if (filePageRes.status >= 300 && filePageRes.status < 400) {
          const directUrl = filePageRes.headers.get("location") || "";
          try { await filePageRes.text(); } catch { /* drain */ }
          if (directUrl) {
            return {
              fileName: filenameFromUrl(directUrl) || filenameFromUrl(link),
              directUrl,
              fileSize: null,
              retriesUsed: attempt - 1,
              skipTlsVerify: true
            };
          }
        }

        const html = await filePageRes.text();

        // Check for file not found
        if (/File Not Found|file was removed|file was banned/i.test(html)) {
          throw new Error("DDownload: Datei nicht gefunden");
        }

        // Extract form fields
        const idVal = html.match(/name="id" value="([^"]+)"/)?.[1] || fileCode;
        const randVal = html.match(/name="rand" value="([^"]+)"/)?.[1] || "";
        const fileNameMatch = html.match(/class="file-info-name"[^>]*>([^<]+)</);
        const fileName = fileNameMatch?.[1]?.trim() || filenameFromUrl(link);

        // Step 2: POST download2 for premium download
        const dlBody = new URLSearchParams({
          op: "download2",
          id: idVal,
          rand: randVal,
          referer: "",
          method_premium: "1",
          adblock_detected: "0"
        });

        const dlRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
          method: "POST",
          headers: {
            "User-Agent": DDOWNLOAD_WEB_UA,
            "Content-Type": "application/x-www-form-urlencoded",
            Cookie: this.cookies,
            Referer: `${DDOWNLOAD_WEB_BASE}/${fileCode}`
          },
          body: dlBody.toString(),
          redirect: "manual",
          signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
        });

        if (dlRes.status >= 300 && dlRes.status < 400) {
          const directUrl = dlRes.headers.get("location") || "";
          try { await dlRes.text(); } catch { /* drain */ }
          if (directUrl) {
            return {
              fileName: fileName || filenameFromUrl(directUrl),
              directUrl,
              fileSize: null,
              retriesUsed: attempt - 1,
              skipTlsVerify: true
            };
          }
        }

        const dlHtml = await dlRes.text();
        // Try to find direct URL in response HTML
        const directMatch = dlHtml.match(/https?:\/\/[a-z0-9]+\.(?:dstorage\.org|ddownload\.com|ddl\.to|ucdn\.to)[^\s"'<>]+/i);
        if (directMatch) {
          return {
            fileName,
            directUrl: directMatch[0],
            fileSize: null,
            retriesUsed: attempt - 1,
            skipTlsVerify: true
          };
        }

        // Check for error messages
        const errMatch = dlHtml.match(/class="err"[^>]*>([^<]+)</i);
        if (errMatch) {
          throw new Error(`DDownload: ${errMatch[1].trim()}`);
        }

        throw new Error("DDownload: Kein Download-Link erhalten");
      } catch (error) {
        lastError = compactErrorText(error);
        if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
          break;
        }
        // Re-login on auth errors
        if (/login|session|cookie/i.test(lastError)) {
          this.cookies = "";
        }
        if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
          break;
        }
        await sleepWithSignal(retryDelay(attempt), signal);
      }
    }

    throw new Error(String(lastError || "DDownload Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
  }
}

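`webLogin` folds the login response's `Set-Cookie` headers into a single `Cookie` request header by keeping only the `name=value` part of each cookie and dropping attributes like `Path` or `HttpOnly`. That folding step in isolation, with invented sample cookies:

```typescript
// Build a Cookie request header from raw Set-Cookie values, as webLogin does:
// keep only the `name=value` part, drop attributes like Path/HttpOnly/Secure.
function buildCookieHeader(setCookies: string[]): string {
  const login = setCookies.find((c) => c.startsWith("login="));
  const xfss = setCookies.find((c) => c.startsWith("xfss="));
  return [login, xfss]
    .filter((c): c is string => Boolean(c))
    .map((c) => c.split(";")[0])
    .join("; ");
}
```

The `xfss` cookie is the session; without it the login is treated as failed, while the `login` cookie is optional and merely carried along when present.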
@@ -822,6 +1154,9 @@ export class DebridService {

  private options: DebridServiceOptions;

  private cachedDdownloadClient: DdownloadClient | null = null;
  private cachedDdownloadKey = "";

  public constructor(settings: AppSettings, options: DebridServiceOptions = {}) {
    this.settings = cloneSettings(settings);
    this.options = options;
@@ -831,6 +1166,16 @@ export class DebridService {
    this.settings = cloneSettings(next);
  }

  private getDdownloadClient(login: string, password: string): DdownloadClient {
    const key = `${login}\0${password}`;
    if (this.cachedDdownloadClient && this.cachedDdownloadKey === key) {
      return this.cachedDdownloadClient;
    }
    this.cachedDdownloadClient = new DdownloadClient(login, password);
    this.cachedDdownloadKey = key;
    return this.cachedDdownloadClient;
  }

  public async resolveFilenames(
    links: string[],
    onResolved?: (link: string, fileName: string) => void,
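`getDdownloadClient` above is a single-slot cache keyed on the credentials, joined with a NUL byte so that different `login`/`password` splits cannot produce the same key. The same pattern generically, with a hypothetical `Client` stand-in for the real class:

```typescript
// Single-slot memoization keyed on credentials, as in getDdownloadClient.
class Client {
  public constructor(public login: string, public password: string) {}
}

let cachedClient: Client | null = null;
let cachedKey = "";

function getClient(login: string, password: string): Client {
  const key = `${login}\0${password}`; // NUL separator: "a"+"bc" and "ab"+"c" stay distinct
  if (cachedClient && cachedKey === key) {
    return cachedClient; // reuse keeps the login session (cookies) alive across calls
  }
  cachedClient = new Client(login, password);
  cachedKey = key;
  return cachedClient;
}
```

Reusing the instance matters here because `DdownloadClient` holds session cookies; recreating it per link would force a fresh web login every time.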
@@ -865,7 +1210,7 @@ export class DebridService {
        }
      } catch (error) {
        const errorText = compactErrorText(error);
        if (signal?.aborted || /aborted/i.test(errorText)) {
        if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
          throw error;
        }
        // ignore and continue with host page fallback
@@ -883,6 +1228,27 @@ export class DebridService {

  public async unrestrictLink(link: string, signal?: AbortSignal, settingsSnapshot?: AppSettings): Promise<ProviderUnrestrictedLink> {
    const settings = settingsSnapshot ? cloneSettings(settingsSnapshot) : cloneSettings(this.settings);

    // DDownload is a direct file hoster, not a debrid service.
    // If the link is a ddownload.com/ddl.to URL and the account is configured,
    // use DDownload directly before trying any debrid providers.
    if (DDOWNLOAD_URL_RE.test(link) && this.isProviderConfiguredFor(settings, "ddownload")) {
      try {
        const result = await this.unrestrictViaProvider(settings, "ddownload", link, signal);
        return {
          ...result,
          provider: "ddownload",
          providerLabel: PROVIDER_LABELS["ddownload"]
        };
      } catch (error) {
        const errorText = compactErrorText(error);
        if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
          throw error;
        }
        // Fall through to normal provider chain (debrid services may also support ddownload links)
      }
    }

    const order = toProviderOrder(
      settings.providerPrimary,
      settings.providerSecondary,
@@ -910,7 +1276,11 @@ export class DebridService {
          providerLabel: PROVIDER_LABELS[primary]
        };
      } catch (error) {
        throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${compactErrorText(error)}`);
        const errorText = compactErrorText(error);
        if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
          throw error;
        }
        throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${errorText}`);
      }
    }

@@ -940,7 +1310,7 @@ export class DebridService {
        };
      } catch (error) {
        const errorText = compactErrorText(error);
        if (signal?.aborted || /aborted/i.test(errorText)) {
        if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
          throw error;
        }
        attempts.push(`${PROVIDER_LABELS[provider]}: ${compactErrorText(error)}`);
@@ -964,6 +1334,9 @@ export class DebridService {
    if (provider === "alldebrid") {
      return Boolean(settings.allDebridToken.trim());
    }
    if (provider === "ddownload") {
      return Boolean(settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim());
    }
    return Boolean(settings.bestToken.trim());
  }

@@ -977,6 +1350,9 @@ export class DebridService {
    if (provider === "alldebrid") {
      return new AllDebridClient(settings.allDebridToken).unrestrictLink(link, signal);
    }
    if (provider === "ddownload") {
      return this.getDdownloadClient(settings.ddownloadLogin, settings.ddownloadPassword).unrestrictLink(link, signal);
    }
    return new BestDebridClient(settings.bestToken).unrestrictLink(link, signal);
  }
}

@@ -261,7 +261,7 @@ export function startDebugServer(mgr: DownloadManager, baseDir: string): void {
  const port = getPort(baseDir);

  server = http.createServer(handleRequest);
  server.listen(port, "0.0.0.0", () => {
  server.listen(port, "127.0.0.1", () => {
    logger.info(`Debug-Server gestartet auf Port ${port}`);
  });
  server.on("error", (err) => {

File diff suppressed because it is too large
@@ -1,7 +1,7 @@
import fs from "node:fs";
import path from "node:path";
import os from "node:os";
import { spawn, spawnSync } from "node:child_process";
import { spawn, spawnSync, type ChildProcess } from "node:child_process";
import AdmZip from "adm-zip";
import { CleanupMode, ConflictMode } from "../shared/types";
import { logger } from "./logger";
@@ -22,7 +22,7 @@ const JVM_EXTRACTOR_REQUIRED_LIBS = [
];

// ── subst drive mapping for long paths on Windows ──
const SUBST_THRESHOLD = 100;
const SUBST_THRESHOLD = 200;
const activeSubstDrives = new Set<string>();

function findFreeSubstDrive(): string | null {
@@ -62,6 +62,26 @@ function removeSubstMapping(mapping: SubstMapping): void {
  logger.info(`subst ${mapping.drive}: entfernt`);
}

export function cleanupStaleSubstDrives(): void {
  if (process.platform !== "win32") return;
  try {
    const result = spawnSync("subst", [], { stdio: "pipe", timeout: 5000 });
    const output = String(result.stdout || "");
    for (const line of output.split("\n")) {
      const match = line.match(/^([A-Z]):\\: => (.+)/i);
      if (!match) continue;
      const drive = match[1].toUpperCase();
      const target = match[2].trim();
      if (/\\rd-extract-|\\Real-Debrid-Downloader/i.test(target)) {
        spawnSync("subst", [`${drive}:`, "/d"], { stdio: "pipe", timeout: 5000 });
        logger.info(`Stale subst ${drive}: entfernt (${target})`);
      }
    }
  } catch {
    // ignore — subst cleanup is best-effort
  }
}

let resolvedExtractorCommand: string | null = null;
let resolveFailureReason = "";
let resolveFailureAt = 0;
@@ -72,6 +92,7 @@ const EXTRACTOR_RETRY_AFTER_MS = 30_000;
const DEFAULT_ZIP_ENTRY_MEMORY_LIMIT_MB = 256;
const EXTRACTOR_PROBE_TIMEOUT_MS = 8_000;
const DEFAULT_EXTRACT_CPU_BUDGET_PERCENT = 80;
let currentExtractCpuPriority: string | undefined;

export interface ExtractOptions {
  packageDir: string;
@@ -88,6 +109,7 @@ export interface ExtractOptions {
  packageId?: string;
  hybridMode?: boolean;
  maxParallel?: number;
  extractCpuPriority?: string;
}

export interface ExtractProgressUpdate {
@@ -97,7 +119,10 @@ export interface ExtractProgressUpdate {
  archiveName: string;
  archivePercent?: number;
  elapsedMs?: number;
  phase: "extracting" | "done";
  phase: "extracting" | "done" | "preparing";
  passwordAttempt?: number;
  passwordTotal?: number;
  passwordFound?: boolean;
}

const MAX_EXTRACT_OUTPUT_BUFFER = 48 * 1024;
@@ -204,7 +229,7 @@ function archiveSortKey(filePath: string): string {
    .replace(/\.zip\.\d{3}$/i, "")
    .replace(/\.7z\.\d{3}$/i, "")
    .replace(/\.\d{3}$/i, "")
    .replace(/\.tar\.(gz|bz2|xz)$/i, "")
    .replace(/\.(?:tar\.(?:gz|bz2|xz)|tgz|tbz2|txz)$/i, "")
    .replace(/\.rar$/i, "")
    .replace(/\.zip$/i, "")
    .replace(/\.7z$/i, "")
@@ -225,7 +250,7 @@ function archiveTypeRank(filePath: string): number {
  if (/\.7z(?:\.\d{3})?$/i.test(fileName)) {
    return 3;
  }
  if (/\.tar\.(gz|bz2|xz)$/i.test(fileName)) {
  if (/\.(?:tar\.(?:gz|bz2|xz)|tgz|tbz2|txz)$/i.test(fileName)) {
    return 4;
  }
  if (/\.\d{3}$/i.test(fileName)) {
@@ -276,7 +301,7 @@ export async function findArchiveCandidates(packageDir: string): Promise<string[
    }
    return !fileNamesLower.has(`${fileName}.001`.toLowerCase());
  });
  const tarCompressed = files.filter((filePath) => /\.tar\.(gz|bz2|xz)$/i.test(filePath));
  const tarCompressed = files.filter((filePath) => /\.(?:tar\.(?:gz|bz2|xz)|tgz|tbz2|txz)$/i.test(filePath));
  // Generic .001 splits (HJSplit etc.) — exclude already-recognized .zip.001 and .7z.001
  const genericSplit = files.filter((filePath) => {
    const fileName = path.basename(filePath).toLowerCase();
@@ -419,15 +444,18 @@ async function writeExtractResumeState(packageDir: string, completedArchives: Se
        .map((name) => archiveNameKey(name))
        .sort((a, b) => a.localeCompare(b))
    };
    const tmpPath = progressPath + ".tmp";
    const tmpPath = progressPath + "." + Date.now() + "." + Math.random().toString(36).slice(2, 8) + ".tmp";
    await fs.promises.writeFile(tmpPath, JSON.stringify(payload, null, 2), "utf8");
    await fs.promises.rename(tmpPath, progressPath);
    await fs.promises.rename(tmpPath, progressPath).catch(async () => {
      // rename may fail if another writer renamed tmpPath first (parallel workers)
      await fs.promises.rm(tmpPath, { force: true }).catch(() => {});
    });
  } catch (error) {
    logger.warn(`ExtractResumeState schreiben fehlgeschlagen: ${String(error)}`);
  }
}
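The unique temp-file name introduced above exists because two parallel extract workers both writing `state.json.tmp` could rename each other's half-written file. A runnable sketch of that write-then-rename pattern, with an illustrative helper name and a relative demo directory:

```typescript
import fs from "node:fs";

// Write JSON atomically: a unique tmp name per writer, then rename over the target.
async function writeJsonAtomic(targetPath: string, payload: unknown): Promise<void> {
  const tmpPath = targetPath + "." + Date.now() + "." + Math.random().toString(36).slice(2, 8) + ".tmp";
  await fs.promises.writeFile(tmpPath, JSON.stringify(payload, null, 2), "utf8");
  await fs.promises.rename(tmpPath, targetPath).catch(async () => {
    // Another writer may have won a race on the same tmp name; discard ours.
    await fs.promises.rm(tmpPath, { force: true }).catch(() => {});
  });
}
```

Because each writer's tmp name embeds a timestamp and random suffix, concurrent writers never rename the same tmp file; the last rename simply wins on the target, which is the desired "latest state" semantics for a resume file.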
|
||||
async function clearExtractResumeState(packageDir: string, packageId?: string): Promise<void> {
|
||||
export async function clearExtractResumeState(packageDir: string, packageId?: string): Promise<void> {
|
||||
try {
|
||||
await fs.promises.rm(extractProgressFilePath(packageDir, packageId), { force: true });
|
||||
} catch {
|
||||
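The resume-state hunk swaps the fixed `.tmp` suffix for a per-writer unique name so parallel workers cannot clobber each other's temp file, and tolerates losing the rename race. A minimal sketch of that write-then-rename pattern (function name and paths hypothetical):

```typescript
import * as fs from "node:fs";

// Write JSON atomically: write to a collision-safe temp name, then rename.
// rename() is atomic on the same volume, so readers never observe a
// half-written file; a concurrent writer winning the race is tolerated.
async function writeJsonAtomic(filePath: string, payload: unknown): Promise<void> {
  const tmpPath = `${filePath}.${Date.now()}.${Math.random().toString(36).slice(2, 8)}.tmp`;
  await fs.promises.writeFile(tmpPath, JSON.stringify(payload, null, 2), "utf8");
  await fs.promises.rename(tmpPath, filePath).catch(async () => {
    // another writer renamed first — just drop our temp file
    await fs.promises.rm(tmpPath, { force: true }).catch(() => {});
  });
}
```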
@@ -461,7 +489,7 @@ export function classifyExtractionError(errorText: string): ExtractErrorCategory

 function isExtractAbortError(errorText: string): boolean {
   const text = String(errorText || "").toLowerCase();
-  return text.includes("aborted:extract") || text.includes("extract_aborted");
+  return text.includes("aborted:extract") || text.includes("extract_aborted") || text.includes("noextractor:skipped");
 }

 export function archiveFilenamePasswords(archiveName: string): string[] {
@@ -472,7 +500,7 @@ export function archiveFilenamePasswords(archiveName: string): string[] {
     .replace(/\.zip\.\d{3}$/i, "")
     .replace(/\.7z\.\d{3}$/i, "")
     .replace(/\.\d{3}$/i, "")
-    .replace(/\.tar\.(gz|bz2|xz)$/i, "")
+    .replace(/\.(?:tar\.(?:gz|bz2|xz)|tgz|tbz2|txz)$/i, "")
     .replace(/\.(rar|zip|7z|tar|gz|bz2|xz)$/i, "");
   if (!stem) return [];
   const candidates = [stem];
@@ -517,7 +545,6 @@ function winRarCandidates(): string[] {

   const installed = [
     path.join(programFiles, "WinRAR", "UnRAR.exe"),
-    path.join(programFilesX86, "WinRAR", "UnRAR.exe"),
     path.join(programFilesX86, "WinRAR", "UnRAR.exe")
   ];

@@ -563,33 +590,53 @@ function shouldUseExtractorPerformanceFlags(): boolean {
   return raw !== "0" && raw !== "false" && raw !== "off" && raw !== "no";
 }

-function extractCpuBudgetPercent(): number {
+function extractCpuBudgetFromPriority(priority?: string): number {
+  switch (priority) {
+    case "low": return 25;
+    case "middle": return 50;
+    default: return 80;
+  }
+}
+
+function extractOsPriority(priority?: string): number {
+  switch (priority) {
+    case "high": return os.constants.priority.PRIORITY_NORMAL;
+    default: return os.constants.priority.PRIORITY_BELOW_NORMAL;
+  }
+}
+
+function extractCpuBudgetPercent(priority?: string): number {
   const envValue = Number(process.env.RD_EXTRACT_CPU_BUDGET_PERCENT ?? NaN);
   if (Number.isFinite(envValue) && envValue >= 40 && envValue <= 95) {
     return Math.floor(envValue);
   }
-  return DEFAULT_EXTRACT_CPU_BUDGET_PERCENT;
+  return extractCpuBudgetFromPriority(priority);
 }

-function extractorThreadSwitch(hybridMode = false): string {
+function extractorThreadSwitch(hybridMode = false, priority?: string): string {
   if (hybridMode) {
-    // 2 threads during hybrid extraction (download + extract simultaneously).
-    // JDownloader 2 uses in-process 7-Zip-JBinding which naturally limits throughput
-    // to ~16 MB/s write. 2 UnRAR threads produce similar controlled disk load.
-    return "-mt2";
+    // Use half the CPU budget during hybrid extraction to leave headroom for
+    // concurrent downloads. Falls back to at least 2 threads.
+    const envValue = Number(process.env.RD_EXTRACT_THREADS ?? NaN);
+    if (Number.isFinite(envValue) && envValue >= 1 && envValue <= 32) {
+      return `-mt${Math.floor(envValue)}`;
+    }
+    const cpuCount = Math.max(1, os.cpus().length || 1);
+    const hybridThreads = Math.max(2, Math.min(8, Math.floor(cpuCount / 2)));
+    return `-mt${hybridThreads}`;
   }
   const envValue = Number(process.env.RD_EXTRACT_THREADS ?? NaN);
   if (Number.isFinite(envValue) && envValue >= 1 && envValue <= 32) {
     return `-mt${Math.floor(envValue)}`;
   }
   const cpuCount = Math.max(1, os.cpus().length || 1);
-  const budgetPercent = extractCpuBudgetPercent();
+  const budgetPercent = extractCpuBudgetPercent(priority);
   const budgetedThreads = Math.floor((cpuCount * budgetPercent) / 100);
   const threadCount = Math.max(1, Math.min(16, Math.max(1, budgetedThreads)));
   return `-mt${threadCount}`;
 }

-function lowerExtractProcessPriority(childPid: number | undefined): void {
+function lowerExtractProcessPriority(childPid: number | undefined, cpuPriority?: string): void {
   if (process.platform !== "win32") {
     return;
   }
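The hunk above derives the `-mtN` thread switch from a priority-mapped CPU budget: map the label to a percentage, take that share of the core count, clamp to 1..16. A simplified standalone sketch of that arithmetic (helper names shortened, hypothetical):

```typescript
// Map a priority label to a CPU budget percentage, as in the diff:
// low → 25 %, middle → 50 %, anything else → 80 %.
function budgetPercent(priority?: string): number {
  switch (priority) {
    case "low": return 25;
    case "middle": return 50;
    default: return 80;
  }
}

// Derive the UnRAR/7z "-mtN" switch: floor(cores * budget / 100),
// clamped to the 1..16 range.
function threadSwitch(cpuCount: number, priority?: string): string {
  const budgeted = Math.floor((cpuCount * budgetPercent(priority)) / 100);
  const threads = Math.max(1, Math.min(16, budgeted));
  return `-mt${threads}`;
}
```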
@@ -598,9 +645,9 @@ function lowerExtractProcessPriority(childPid: number | undefined): void {
     return;
   }
   try {
-    // IDLE_PRIORITY_CLASS: lowers CPU scheduling priority so extraction
-    // doesn't starve other processes. I/O priority stays Normal (like JDownloader 2).
-    os.setPriority(pid, os.constants.priority.PRIORITY_LOW);
+    // Sets CPU scheduling priority for the extraction process.
+    // high → NORMAL (full speed), default → BELOW_NORMAL. I/O priority stays Normal.
+    os.setPriority(pid, extractOsPriority(cpuPriority));
   } catch {
     // ignore: priority lowering is best-effort
   }
@@ -670,7 +717,7 @@ function runExtractCommand(
   let settled = false;
   let output = "";
   const child = spawn(command, args, { windowsHide: true });
-  lowerExtractProcessPriority(child.pid);
+  lowerExtractProcessPriority(child.pid, currentExtractCpuPriority);
   let timeoutId: NodeJS.Timeout | null = null;
   let timedOutByWatchdog = false;
   let abortedBySignal = false;
@@ -853,10 +900,17 @@ function resolveJvmExtractorRootCandidates(): string[] {
 }

 let cachedJvmLayout: JvmExtractorLayout | null | undefined;
+let cachedJvmLayoutNullSince = 0;
+const JVM_LAYOUT_NULL_TTL_MS = 5 * 60 * 1000;

 function resolveJvmExtractorLayout(): JvmExtractorLayout | null {
   if (cachedJvmLayout !== undefined) {
-    return cachedJvmLayout;
+    // Don't cache null permanently — retry after TTL in case Java was installed
+    if (cachedJvmLayout === null && Date.now() - cachedJvmLayoutNullSince > JVM_LAYOUT_NULL_TTL_MS) {
+      cachedJvmLayout = undefined;
+    } else {
+      return cachedJvmLayout;
+    }
   }
   const javaCandidates = resolveJavaCommandCandidates();
   const javaCommand = javaCandidates.find((candidate) => {
@@ -870,6 +924,8 @@ function resolveJvmExtractorLayout(): JvmExtractorLayout | null {
   }) || "";

   if (!javaCommand) {
     cachedJvmLayout = null;
+    cachedJvmLayoutNullSince = Date.now();
     return null;
   }

@@ -889,6 +945,7 @@ function resolveJvmExtractorLayout(): JvmExtractorLayout | null {
   }

   cachedJvmLayout = null;
+  cachedJvmLayoutNullSince = Date.now();
   return null;
 }

@@ -931,6 +988,274 @@ function parseJvmLine(
   }
 }

+// ── Persistent JVM Daemon ──
+// Keeps a single JVM process alive across multiple extraction requests,
+// eliminating the ~5s JVM boot overhead per archive.
+
+interface DaemonRequest {
+  resolve: (result: JvmExtractResult) => void;
+  onArchiveProgress?: (percent: number) => void;
+  signal?: AbortSignal;
+  timeoutMs?: number;
+  parseState: { bestPercent: number; usedPassword: string; backend: string; reportedError: string };
+}
+
+let daemonProcess: ChildProcess | null = null;
+let daemonReady = false;
+let daemonBusy = false;
+let daemonCurrentRequest: DaemonRequest | null = null;
+let daemonStdoutBuffer = "";
+let daemonStderrBuffer = "";
+let daemonOutput = "";
+let daemonTimeoutId: NodeJS.Timeout | null = null;
+let daemonAbortHandler: (() => void) | null = null;
+let daemonLayout: JvmExtractorLayout | null = null;
+
+export function shutdownDaemon(): void {
+  if (daemonProcess) {
+    try { daemonProcess.stdin?.end(); } catch { /* ignore */ }
+    try { killProcessTree(daemonProcess); } catch { /* ignore */ }
+    daemonProcess = null;
+  }
+  daemonReady = false;
+  daemonBusy = false;
+  daemonCurrentRequest = null;
+  daemonStdoutBuffer = "";
+  daemonStderrBuffer = "";
+  daemonOutput = "";
+  if (daemonTimeoutId) { clearTimeout(daemonTimeoutId); daemonTimeoutId = null; }
+  if (daemonAbortHandler) { daemonAbortHandler = null; }
+  daemonLayout = null;
+}
+
+function finishDaemonRequest(result: JvmExtractResult): void {
+  const req = daemonCurrentRequest;
+  if (!req) return;
+  daemonCurrentRequest = null;
+  daemonBusy = false;
+  daemonStdoutBuffer = "";
+  daemonStderrBuffer = "";
+  daemonOutput = "";
+  if (daemonTimeoutId) { clearTimeout(daemonTimeoutId); daemonTimeoutId = null; }
+  if (req.signal && daemonAbortHandler) {
+    req.signal.removeEventListener("abort", daemonAbortHandler);
+    daemonAbortHandler = null;
+  }
+  req.resolve(result);
+}
+
+function handleDaemonLine(line: string): void {
+  const trimmed = String(line || "").trim();
+  if (!trimmed) return;
+
+  // Check for daemon ready signal
+  if (trimmed === "RD_DAEMON_READY") {
+    daemonReady = true;
+    logger.info("JVM Daemon bereit (persistent)");
+    return;
+  }
+
+  // Check for request completion
+  if (trimmed.startsWith("RD_REQUEST_DONE ")) {
+    const code = parseInt(trimmed.slice("RD_REQUEST_DONE ".length).trim(), 10);
+    const req = daemonCurrentRequest;
+    if (!req) return;
+
+    if (code === 0) {
+      req.onArchiveProgress?.(100);
+      finishDaemonRequest({
+        ok: true, missingCommand: false, missingRuntime: false,
+        aborted: false, timedOut: false, errorText: "",
+        usedPassword: req.parseState.usedPassword, backend: req.parseState.backend
+      });
+    } else {
+      const message = cleanErrorText(req.parseState.reportedError || daemonOutput) || `Exit Code ${code}`;
+      finishDaemonRequest({
+        ok: false, missingCommand: false, missingRuntime: isJvmRuntimeMissingError(message),
+        aborted: false, timedOut: false, errorText: message,
+        usedPassword: req.parseState.usedPassword, backend: req.parseState.backend
+      });
+    }
+    return;
+  }
+
+  // Regular progress/status lines — delegate to parseJvmLine
+  if (daemonCurrentRequest) {
+    parseJvmLine(trimmed, daemonCurrentRequest.onArchiveProgress, daemonCurrentRequest.parseState);
+  }
+}
+
+function startDaemon(layout: JvmExtractorLayout): boolean {
+  if (daemonProcess && daemonReady) return true;
+  shutdownDaemon();
+
+  const jvmTmpDir = path.join(os.tmpdir(), `rd-extract-daemon-${crypto.randomUUID()}`);
+  fs.mkdirSync(jvmTmpDir, { recursive: true });
+
+  const args = [
+    "-Dfile.encoding=UTF-8",
+    `-Djava.io.tmpdir=${jvmTmpDir}`,
+    "-Xms512m",
+    "-Xmx8g",
+    "-XX:+UseSerialGC",
+    "-cp",
+    layout.classPath,
+    JVM_EXTRACTOR_MAIN_CLASS,
+    "--daemon"
+  ];
+
+  try {
+    const child = spawn(layout.javaCommand, args, {
+      windowsHide: true,
+      stdio: ["pipe", "pipe", "pipe"]
+    });
+    lowerExtractProcessPriority(child.pid, currentExtractCpuPriority);
+    daemonProcess = child;
+    daemonLayout = layout;
+
+    child.stdout!.on("data", (chunk) => {
+      const raw = String(chunk || "");
+      daemonOutput = appendLimited(daemonOutput, raw);
+      daemonStdoutBuffer += raw;
+      const lines = daemonStdoutBuffer.split(/\r?\n/);
+      daemonStdoutBuffer = lines.pop() || "";
+      for (const line of lines) {
+        handleDaemonLine(line);
+      }
+    });
+
+    child.stderr!.on("data", (chunk) => {
+      const raw = String(chunk || "");
+      daemonOutput = appendLimited(daemonOutput, raw);
+      daemonStderrBuffer += raw;
+      const lines = daemonStderrBuffer.split(/\r?\n/);
+      daemonStderrBuffer = lines.pop() || "";
+      for (const line of lines) {
+        if (daemonCurrentRequest) {
+          parseJvmLine(line, daemonCurrentRequest.onArchiveProgress, daemonCurrentRequest.parseState);
+        }
+      }
+    });
+
+    child.on("error", () => {
+      if (daemonCurrentRequest) {
+        finishDaemonRequest({
+          ok: false, missingCommand: true, missingRuntime: true,
+          aborted: false, timedOut: false, errorText: "Daemon process error",
+          usedPassword: "", backend: ""
+        });
+      }
+      shutdownDaemon();
+    });
+
+    child.on("close", () => {
+      if (daemonCurrentRequest) {
+        const req = daemonCurrentRequest;
+        finishDaemonRequest({
+          ok: false, missingCommand: false, missingRuntime: false,
+          aborted: false, timedOut: false,
+          errorText: cleanErrorText(req.parseState.reportedError || daemonOutput) || "Daemon process exited unexpectedly",
+          usedPassword: req.parseState.usedPassword, backend: req.parseState.backend
+        });
+      }
+      // Clean up tmp dir
+      fs.rm(jvmTmpDir, { recursive: true, force: true }, () => {});
+      daemonProcess = null;
+      daemonReady = false;
+      daemonBusy = false;
+      daemonLayout = null;
+    });
+
+    logger.info(`JVM Daemon gestartet (PID ${child.pid})`);
+    return true;
+  } catch (error) {
+    logger.warn(`JVM Daemon Start fehlgeschlagen: ${String(error)}`);
+    return false;
+  }
+}
+
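The daemon's stdout/stderr handlers above receive data in arbitrary chunk boundaries, so they keep a buffer of the trailing partial line and only dispatch complete lines. That pattern in isolation (name hypothetical):

```typescript
// Split a stream of arbitrary chunks into complete lines, carrying the
// trailing partial line over to the next chunk — the same buffering the
// daemon handlers use before dispatching to handleDaemonLine.
function makeLineSplitter(onLine: (line: string) => void) {
  let buffer = "";
  return (chunk: string): void => {
    buffer += chunk;
    const lines = buffer.split(/\r?\n/);
    buffer = lines.pop() || ""; // last element is the incomplete remainder
    for (const line of lines) onLine(line);
  };
}
```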
+function isDaemonAvailable(layout: JvmExtractorLayout): boolean {
+  // Start daemon if not running yet
+  if (!daemonProcess || !daemonReady) {
+    startDaemon(layout);
+  }
+  return Boolean(daemonProcess && daemonReady && !daemonBusy);
+}
+
+function sendDaemonRequest(
+  archivePath: string,
+  targetDir: string,
+  conflictMode: ConflictMode,
+  passwordCandidates: string[],
+  onArchiveProgress?: (percent: number) => void,
+  signal?: AbortSignal,
+  timeoutMs?: number
+): Promise<JvmExtractResult> {
+  return new Promise((resolve) => {
+    const mode = effectiveConflictMode(conflictMode);
+    const parseState = { bestPercent: 0, usedPassword: "", backend: "", reportedError: "" };
+
+    daemonBusy = true;
+    daemonOutput = "";
+    daemonCurrentRequest = { resolve, onArchiveProgress, signal, timeoutMs, parseState };
+
+    // Set up timeout
+    if (timeoutMs && timeoutMs > 0) {
+      daemonTimeoutId = setTimeout(() => {
+        // Timeout — kill the daemon and restart fresh for next request
+        const req = daemonCurrentRequest;
+        if (req) {
+          finishDaemonRequest({
+            ok: false, missingCommand: false, missingRuntime: false,
+            aborted: false, timedOut: true,
+            errorText: `Entpacken Timeout nach ${Math.ceil(timeoutMs / 1000)}s`,
+            usedPassword: parseState.usedPassword, backend: parseState.backend
+          });
+        }
+        shutdownDaemon();
+      }, timeoutMs);
+    }
+
+    // Set up abort handler
+    if (signal) {
+      daemonAbortHandler = () => {
+        const req = daemonCurrentRequest;
+        if (req) {
+          finishDaemonRequest({
+            ok: false, missingCommand: false, missingRuntime: false,
+            aborted: true, timedOut: false, errorText: "aborted:extract",
+            usedPassword: parseState.usedPassword, backend: parseState.backend
+          });
+        }
+        // Kill daemon on abort — cleaner than trying to interrupt mid-extraction
+        shutdownDaemon();
+      };
+      signal.addEventListener("abort", daemonAbortHandler, { once: true });
+    }
+
+    // Build and send JSON request
+    const jsonRequest = JSON.stringify({
+      archive: archivePath,
+      target: targetDir,
+      conflict: mode,
+      backend: "auto",
+      passwords: passwordCandidates
+    });
+
+    try {
+      daemonProcess!.stdin!.write(jsonRequest + "\n");
+    } catch (error) {
+      finishDaemonRequest({
+        ok: false, missingCommand: false, missingRuntime: false,
+        aborted: false, timedOut: false,
+        errorText: `Daemon stdin write failed: ${String(error)}`,
+        usedPassword: "", backend: ""
+      });
+      shutdownDaemon();
+    }
+  });
+}
+
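The daemon protocol above is line-oriented: one JSON request per line on stdin, and a one-line `RD_REQUEST_DONE <exit code>` marker on stdout when the request finishes. A standalone sketch of parsing that completion marker (function name hypothetical; mirrors the check in `handleDaemonLine`):

```typescript
// Parse the daemon's completion marker ("RD_REQUEST_DONE <exit code>").
// Returns the exit code, or null for ordinary progress/status lines.
function parseRequestDone(line: string): number | null {
  const trimmed = line.trim();
  if (!trimmed.startsWith("RD_REQUEST_DONE ")) return null;
  const code = parseInt(trimmed.slice("RD_REQUEST_DONE ".length).trim(), 10);
  return Number.isFinite(code) ? code : null;
}
```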
 function runJvmExtractCommand(
   layout: JvmExtractorLayout,
   archivePath: string,
@@ -954,6 +1279,15 @@ function runJvmExtractCommand(
     });
   }

+  // Try persistent daemon first — saves ~5s JVM boot per archive
+  if (isDaemonAvailable(layout)) {
+    logger.info(`JVM Daemon: Sende Request für ${path.basename(archivePath)}`);
+    return sendDaemonRequest(archivePath, targetDir, conflictMode, passwordCandidates, onArchiveProgress, signal, timeoutMs);
+  }
+
+  // Fallback: spawn a new JVM process (daemon busy or not available)
+  logger.info(`JVM Spawn: Neuer Prozess für ${path.basename(archivePath)}${daemonBusy ? " (Daemon busy)" : ""}`);
+
   const mode = effectiveConflictMode(conflictMode);
   // Each JVM process needs its own temp dir so parallel SevenZipJBinding
   // instances don't fight over the same native DLL file lock.
@@ -962,8 +1296,9 @@ function runJvmExtractCommand(
   const args = [
     "-Dfile.encoding=UTF-8",
     `-Djava.io.tmpdir=${jvmTmpDir}`,
-    "-Xms32m",
-    "-Xmx512m",
+    "-Xms512m",
+    "-Xmx8g",
+    "-XX:+UseSerialGC",
     "-cp",
     layout.classPath,
     JVM_EXTRACTOR_MAIN_CLASS,
@@ -992,7 +1327,7 @@ function runJvmExtractCommand(
   let stderrBuffer = "";

   const child = spawn(layout.javaCommand, args, { windowsHide: true });
-  lowerExtractProcessPriority(child.pid);
+  lowerExtractProcessPriority(child.pid, currentExtractCpuPriority);

   const flushLines = (rawChunk: string, fromStdErr = false): void => {
     if (!rawChunk) {
@@ -1171,7 +1506,7 @@ export function buildExternalExtractArgs(
   // On Windows (the target platform) this is less of a concern than on shared Unix systems.
   const pass = password ? `-p${password}` : "-p-";
   const perfArgs = usePerformanceFlags && shouldUseExtractorPerformanceFlags()
-    ? ["-idc", extractorThreadSwitch(hybridMode)]
+    ? ["-idc", extractorThreadSwitch(hybridMode, currentExtractCpuPriority)]
     : [];
   return ["x", overwrite, pass, "-y", ...perfArgs, archivePath, `${targetDir}${path.sep}`];
 }
@@ -1200,9 +1535,10 @@ async function resolveExtractorCommandInternal(): Promise<string> {
     if (isAbsoluteCommand(command) && !fs.existsSync(command)) {
       continue;
     }
-    const probeArgs = command.toLowerCase().includes("winrar") ? ["-?"] : ["?"];
+    const lower = command.toLowerCase();
+    const probeArgs = (lower.includes("winrar") || lower.includes("unrar")) ? ["-?"] : ["?"];
     const probe = await runExtractCommand(command, probeArgs, undefined, undefined, EXTRACTOR_PROBE_TIMEOUT_MS);
-    if (probe.ok) {
+    if (probe.ok || (!probe.missingCommand && !probe.timedOut)) {
       resolvedExtractorCommand = command;
       resolveFailureReason = "";
       resolveFailureAt = 0;
@@ -1242,7 +1578,8 @@ async function runExternalExtract(
   passwordCandidates: string[],
   onArchiveProgress?: (percent: number) => void,
   signal?: AbortSignal,
-  hybridMode = false
+  hybridMode = false,
+  onPasswordAttempt?: (attempt: number, total: number) => void
 ): Promise<string> {
   const timeoutMs = await computeExtractTimeoutMs(archivePath);
   const backendMode = extractorBackendMode();
@@ -1316,7 +1653,7 @@ async function runExternalExtract(

   // subst only needed for legacy UnRAR/7z (MAX_PATH limit)
   subst = createSubstMapping(targetDir);
-  const effectiveTargetDir = subst ? `${subst.drive}:` : targetDir;
+  const effectiveTargetDir = subst ? `${subst.drive}:\\` : targetDir;

   const command = await resolveExtractorCommand();
   const password = await runExternalExtractInner(
@@ -1328,7 +1665,8 @@ async function runExternalExtract(
     onArchiveProgress,
     signal,
     timeoutMs,
-    hybridMode
+    hybridMode,
+    onPasswordAttempt
   );
   const extractorName = path.basename(command).replace(/\.exe$/i, "");
   if (jvmFailureReason) {
@@ -1351,7 +1689,8 @@ async function runExternalExtractInner(
   onArchiveProgress: ((percent: number) => void) | undefined,
   signal: AbortSignal | undefined,
   timeoutMs: number,
-  hybridMode = false
+  hybridMode = false,
+  onPasswordAttempt?: (attempt: number, total: number) => void
 ): Promise<string> {
   const passwords = passwordCandidates;
   let lastError = "";
@@ -1375,6 +1714,9 @@ async function runExternalExtractInner(
     passwordAttempt += 1;
     const quotedPw = password === "" ? '""' : `"${password}"`;
     logger.info(`Legacy-Passwort-Versuch ${passwordAttempt}/${passwords.length} für ${path.basename(archivePath)}: ${quotedPw}`);
+    if (passwords.length > 1) {
+      onPasswordAttempt?.(passwordAttempt, passwords.length);
+    }
     let args = buildExternalExtractArgs(command, archivePath, targetDir, conflictMode, password, usePerformanceFlags, hybridMode);
     let result = await runExtractCommand(command, args, (chunk) => {
       const parsed = parseProgressPercent(chunk);
@@ -1558,7 +1900,8 @@ async function extractZipArchive(archivePath: string, targetDir: string, conflic
       const limitMb = Math.ceil(memoryLimitBytes / (1024 * 1024));
       throw new Error(`ZIP-Eintrag zu groß für internen Entpacker (${entryMb} MB > ${limitMb} MB)`);
     }
-    if (data.length > Math.max(uncompressedSize, compressedSize) * 20) {
+    const maxDeclaredSize = Math.max(uncompressedSize, compressedSize);
+    if (maxDeclaredSize > 0 && data.length > maxDeclaredSize * 20) {
       throw new Error(`ZIP-Eintrag verdächtig groß nach Entpacken (${entry.entryName})`);
     }
     await fs.promises.writeFile(outputPath, data);
@@ -1598,6 +1941,8 @@ export function collectArchiveCleanupTargets(sourceArchivePath: string, director
   if (multipartRar) {
     const prefix = escapeRegex(multipartRar[1]);
     addMatching(new RegExp(`^${prefix}\\.part\\d+\\.rar$`, "i"));
+    // RAR5 recovery volumes: prefix.partN.rev AND legacy prefix.rev
+    addMatching(new RegExp(`^${prefix}\\.part\\d+\\.rev$`, "i"));
     addMatching(new RegExp(`^${prefix}\\.rev$`, "i"));
     return Array.from(targets);
   }
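For one volume of a multi-part RAR set, the cleanup above builds matchers for sibling volumes plus both recovery-volume spellings. A self-contained sketch (function names hypothetical; `escapeRegex` is a local stand-in for the file's own helper):

```typescript
// Escape a filename prefix so it can be embedded literally in a RegExp.
function escapeRegex(text: string): string {
  return text.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

// Given one volume of a multi-part RAR set ("movie.part001.rar"), build
// matchers for sibling volumes (.partN.rar) and recovery volumes
// (.partN.rev and legacy .rev), as the cleanup logic does.
function multipartRarMatchers(fileName: string): RegExp[] {
  const m = fileName.match(/^(.*)\.part\d+\.rar$/i);
  if (!m) return [];
  const prefix = escapeRegex(m[1]);
  return [
    new RegExp(`^${prefix}\\.part\\d+\\.rar$`, "i"),
    new RegExp(`^${prefix}\\.part\\d+\\.rev$`, "i"),
    new RegExp(`^${prefix}\\.rev$`, "i")
  ];
}
```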
@@ -1640,6 +1985,11 @@ export function collectArchiveCleanupTargets(sourceArchivePath: string, director
     return Array.from(targets);
   }

+  // Tar compound archives (.tar.gz, .tar.bz2, .tar.xz, .tgz, .tbz2, .txz)
+  if (/\.(?:tar\.(?:gz|bz2|xz)|tgz|tbz2|txz)$/i.test(fileName)) {
+    return Array.from(targets);
+  }
+
   // Generic .NNN split files (HJSplit etc.)
   const genericSplit = fileName.match(/^(.*)\.(\d{3})$/i);
   if (genericSplit) {
@@ -1651,7 +2001,7 @@ export function collectArchiveCleanupTargets(sourceArchivePath: string, director
   return Array.from(targets);
 }

-async function cleanupArchives(sourceFiles: string[], cleanupMode: CleanupMode): Promise<number> {
+export async function cleanupArchives(sourceFiles: string[], cleanupMode: CleanupMode): Promise<number> {
   if (cleanupMode === "none") {
     return 0;
   }
@@ -1722,7 +2072,7 @@ async function cleanupArchives(sourceFiles: string[], cleanupMode: CleanupMode):
   return removed;
 }

-async function hasAnyFilesRecursive(rootDir: string): Promise<boolean> {
+export async function hasAnyFilesRecursive(rootDir: string): Promise<boolean> {
   const rootExists = await fs.promises.access(rootDir).then(() => true, () => false);
   if (!rootExists) {
     return false;
@@ -1770,7 +2120,7 @@ async function hasAnyEntries(rootDir: string): Promise<boolean> {
   }
 }

-async function removeEmptyDirectoryTree(rootDir: string): Promise<number> {
+export async function removeEmptyDirectoryTree(rootDir: string): Promise<number> {
   const rootExists = await fs.promises.access(rootDir).then(() => true, () => false);
   if (!rootExists) {
     return 0;
@@ -1815,7 +2165,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
   if (options.signal?.aborted) {
     throw new Error("aborted:extract");
   }

+  options.onProgress?.({ current: 0, total: 0, percent: 0, archiveName: "Archive scannen...", phase: "preparing" });
   const allCandidates = await findArchiveCandidates(options.packageDir);
   const candidates = options.onlyArchives
     ? allCandidates.filter((archivePath) => {
@@ -1827,6 +2177,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{

   // Disk space pre-check
   if (candidates.length > 0) {
+    options.onProgress?.({ current: 0, total: candidates.length, percent: 0, archiveName: "Speicherplatz prüfen...", phase: "preparing" });
     try {
       await fs.promises.mkdir(options.targetDir, { recursive: true });
     } catch { /* ignore */ }
@@ -1889,7 +2240,8 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
     archiveName: string,
     phase: "extracting" | "done",
     archivePercent?: number,
-    elapsedMs?: number
+    elapsedMs?: number,
+    pwInfo?: { passwordAttempt?: number; passwordTotal?: number; passwordFound?: boolean }
   ): void => {
     if (!options.onProgress) {
       return;
@@ -1909,7 +2261,8 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
         archiveName,
         archivePercent,
         elapsedMs,
-        phase
+        phase,
+        ...(pwInfo || {})
       });
     } catch (error) {
       logger.warn(`onProgress callback Fehler unterdrückt: ${cleanErrorText(String(error))}`);
@@ -1918,13 +2271,25 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{

   emitProgress(extracted, "", "extracting");

+  // Emit "done" progress for archives already completed via resume state
+  // so the caller's onProgress handler can mark their items as "Done" immediately
+  // rather than leaving them as "Entpacken - Ausstehend" until all extraction finishes.
+  for (const archivePath of candidates) {
+    if (resumeCompleted.has(archiveNameKey(path.basename(archivePath)))) {
+      emitProgress(extracted, path.basename(archivePath), "extracting", 100, 0);
+    }
+  }
+
   const maxParallel = Math.max(1, options.maxParallel || 1);
   let noExtractorEncountered = false;

   const extractSingleArchive = async (archivePath: string): Promise<void> => {
-    if (options.signal?.aborted || noExtractorEncountered) {
+    if (options.signal?.aborted) {
       throw new Error("aborted:extract");
     }
+    if (noExtractorEncountered) {
+      throw new Error("noextractor:skipped");
+    }
     const archiveName = path.basename(archivePath);
     const archiveResumeKey = archiveNameKey(archiveName);
     const archiveStartedAt = Date.now();
@@ -1946,6 +2311,10 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
       const sig = await detectArchiveSignature(archivePath);
       if (!sig) {
         logger.info(`Generische Split-Datei übersprungen (keine Archiv-Signatur): ${archiveName}`);
+        extracted += 1;
+        resumeCompleted.add(archiveResumeKey);
+        extractedArchives.add(archivePath);
+        await writeExtractResumeState(options.packageDir, resumeCompleted, options.packageId);
         clearInterval(pulseTimer);
         return;
       }
@@ -1953,7 +2322,16 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
       }

       logger.info(`Entpacke Archiv: ${path.basename(archivePath)} -> ${options.targetDir}${hybrid ? " (hybrid, reduced threads, low I/O)" : ""}`);
+      const hasManyPasswords = archivePasswordCandidates.length > 1;
+      if (hasManyPasswords) {
+        emitProgress(extracted + failed, archiveName, "extracting", 0, 0, { passwordAttempt: 0, passwordTotal: archivePasswordCandidates.length });
+      }
+      const onPwAttempt = hasManyPasswords
+        ? (attempt: number, total: number) => { emitProgress(extracted + failed, archiveName, "extracting", archivePercent, Date.now() - archiveStartedAt, { passwordAttempt: attempt, passwordTotal: total }); }
+        : undefined;
       try {
+        // Set module-level priority before each extract call (race-safe: spawn is synchronous)
+        currentExtractCpuPriority = options.extractCpuPriority;
         const ext = path.extname(archivePath).toLowerCase();
         if (ext === ".zip") {
           const preferExternal = await shouldPreferExternalZip(archivePath);
@@ -1962,7 +2340,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
           const usedPassword = await runExternalExtract(archivePath, options.targetDir, options.conflictMode, archivePasswordCandidates, (value) => {
             archivePercent = Math.max(archivePercent, value);
             emitProgress(extracted + failed, archiveName, "extracting", archivePercent, Date.now() - archiveStartedAt);
-          }, options.signal, hybrid);
+          }, options.signal, hybrid, onPwAttempt);
           passwordCandidates = prioritizePassword(passwordCandidates, usedPassword);
         } catch (error) {
           if (isNoExtractorError(String(error))) {
@@ -1983,7 +2361,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
           const usedPassword = await runExternalExtract(archivePath, options.targetDir, options.conflictMode, archivePasswordCandidates, (value) => {
             archivePercent = Math.max(archivePercent, value);
             emitProgress(extracted + failed, archiveName, "extracting", archivePercent, Date.now() - archiveStartedAt);
-          }, options.signal, hybrid);
+          }, options.signal, hybrid, onPwAttempt);
           passwordCandidates = prioritizePassword(passwordCandidates, usedPassword);
         } catch (externalError) {
           if (isNoExtractorError(String(externalError)) || isUnsupportedArchiveFormatError(String(externalError))) {
@@ -1997,7 +2375,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
           const usedPassword = await runExternalExtract(archivePath, options.targetDir, options.conflictMode, archivePasswordCandidates, (value) => {
             archivePercent = Math.max(archivePercent, value);
             emitProgress(extracted + failed, archiveName, "extracting", archivePercent, Date.now() - archiveStartedAt);
-          }, options.signal, hybrid);
+          }, options.signal, hybrid, onPwAttempt);
           passwordCandidates = prioritizePassword(passwordCandidates, usedPassword);
         }
         extracted += 1;
@@ -2006,13 +2384,17 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
         await writeExtractResumeState(options.packageDir, resumeCompleted, options.packageId);
         logger.info(`Entpacken erfolgreich: ${path.basename(archivePath)}`);
         archivePercent = 100;
-        emitProgress(extracted + failed, archiveName, "extracting", archivePercent, Date.now() - archiveStartedAt);
+        if (hasManyPasswords) {
+          emitProgress(extracted + failed, archiveName, "extracting", archivePercent, Date.now() - archiveStartedAt, { passwordFound: true });
+        } else {
+          emitProgress(extracted + failed, archiveName, "extracting", archivePercent, Date.now() - archiveStartedAt);
+        }
       } catch (error) {
-        failed += 1;
         const errorText = String(error);
         if (isExtractAbortError(errorText)) {
           throw error;
         }
+        failed += 1;
         lastError = errorText;
         const errorCategory = classifyExtractionError(errorText);
         logger.error(`Entpack-Fehler ${path.basename(archivePath)} [${errorCategory}]: ${errorText}`);
@ -2030,34 +2412,73 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
|
||||
if (options.signal?.aborted || noExtractorEncountered) break;
|
||||
await extractSingleArchive(archivePath);
|
||||
}
|
||||
} else {
|
||||
// Parallel extraction pool: N workers pull from a shared queue
|
||||
const queue = [...pendingCandidates];
|
||||
let nextIdx = 0;
|
||||
let abortError: Error | null = null;
|
||||
|
||||
const worker = async (): Promise<void> => {
|
||||
while (nextIdx < queue.length && !abortError && !noExtractorEncountered) {
|
||||
if (options.signal?.aborted) break;
|
||||
const idx = nextIdx;
|
||||
nextIdx += 1;
|
||||
try {
|
||||
await extractSingleArchive(queue[idx]);
|
||||
} catch (error) {
|
||||
if (isExtractAbortError(String(error))) {
|
||||
abortError = error instanceof Error ? error : new Error(String(error));
|
||||
break;
|
||||
}
|
||||
// Non-abort errors are already handled inside extractSingleArchive
|
||||
}
|
||||
// Count remaining archives as failed when no extractor was found
|
||||
if (noExtractorEncountered) {
|
||||
const remaining = candidates.length - (extracted + failed);
|
||||
if (remaining > 0) {
|
||||
failed += remaining;
|
||||
emitProgress(candidates.length, "", "extracting", 0, 0);
|
||||
}
|
||||
};
|
||||
}
|
||||
} else {
|
||||
// Password discovery: extract first archive serially to find the correct password,
|
||||
// then run remaining archives in parallel with the promoted password order.
|
||||
let parallelQueue = pendingCandidates;
|
||||
if (passwordCandidates.length > 1 && pendingCandidates.length > 1) {
|
||||
logger.info(`Passwort-Discovery: Extrahiere erstes Archiv seriell (${passwordCandidates.length} Passwort-Kandidaten)...`);
|
||||
const first = pendingCandidates[0];
|
||||
try {
|
||||
await extractSingleArchive(first);
|
||||
} catch (err) {
|
||||
const errText = String(err);
|
||||
if (/aborted:extract/i.test(errText)) throw err;
|
||||
// noextractor:skipped — handled by noExtractorEncountered flag below
|
||||
}
|
||||
parallelQueue = pendingCandidates.slice(1);
|
||||
if (parallelQueue.length > 0) {
|
||||
logger.info(`Passwort-Discovery abgeschlossen, starte parallele Extraktion für ${parallelQueue.length} verbleibende Archive`);
|
||||
}
|
||||
}
|
||||
|
||||
const workerCount = Math.min(maxParallel, pendingCandidates.length);
|
||||
logger.info(`Parallele Extraktion: ${workerCount} gleichzeitige Worker für ${pendingCandidates.length} Archive`);
|
||||
await Promise.all(Array.from({ length: workerCount }, () => worker()));
|
||||
if (parallelQueue.length > 0 && !options.signal?.aborted && !noExtractorEncountered) {
|
||||
// Parallel extraction pool: N workers pull from a shared queue
|
||||
const queue = [...parallelQueue];
|
||||
let nextIdx = 0;
|
||||
let abortError: Error | null = null;
|
||||
|
||||
const worker = async (): Promise<void> => {
|
||||
while (nextIdx < queue.length && !abortError && !noExtractorEncountered) {
|
||||
if (options.signal?.aborted) break;
|
||||
const idx = nextIdx;
|
||||
nextIdx += 1;
|
||||
try {
|
||||
await extractSingleArchive(queue[idx]);
|
||||
} catch (error) {
|
||||
const errText = String(error);
|
||||
if (errText.includes("noextractor:skipped")) {
|
||||
break; // handled by noExtractorEncountered flag after the pool
|
||||
}
|
||||
if (isExtractAbortError(errText)) {
|
||||
abortError = error instanceof Error ? error : new Error(errText);
|
||||
break;
|
||||
}
|
||||
// Non-abort errors are already handled inside extractSingleArchive
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
const workerCount = Math.min(maxParallel, parallelQueue.length);
|
||||
logger.info(`Parallele Extraktion: ${workerCount} gleichzeitige Worker für ${parallelQueue.length} Archive`);
|
||||
// Snapshot passwordCandidates before parallel extraction to avoid concurrent mutation.
|
||||
// Each worker reads the same promoted order from the serial password-discovery pass.
|
||||
const frozenPasswords = [...passwordCandidates];
|
||||
await Promise.all(Array.from({ length: workerCount }, () => worker()));
|
||||
// Restore passwordCandidates from frozen snapshot (parallel mutations are discarded).
|
||||
passwordCandidates = frozenPasswords;
|
||||
|
||||
if (abortError) throw new Error("aborted:extract");
|
||||
}
|
||||
|
||||
if (abortError) throw new Error("aborted:extract");
|
||||
if (noExtractorEncountered) {
|
||||
const remaining = candidates.length - (extracted + failed);
|
||||
if (remaining > 0) {
|
||||
@ -2085,7 +2506,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
|
||||
for (const nestedArchive of nestedCandidates) {
|
||||
if (options.signal?.aborted) throw new Error("aborted:extract");
|
||||
const nestedName = path.basename(nestedArchive);
|
||||
const nestedKey = archiveNameKey(nestedName);
|
||||
const nestedKey = archiveNameKey(`nested:${nestedName}`);
|
||||
if (resumeCompleted.has(nestedKey)) {
|
||||
logger.info(`Nested-Extraction übersprungen (bereits entpackt): ${nestedName}`);
|
||||
continue;
|
||||
@ -2119,10 +2540,8 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
|
||||
resumeCompleted.add(nestedKey);
|
||||
await writeExtractResumeState(options.packageDir, resumeCompleted, options.packageId);
|
||||
logger.info(`Nested-Entpacken erfolgreich: ${nestedName}`);
|
||||
if (options.cleanupMode === "delete") {
|
||||
for (const part of collectArchiveCleanupTargets(nestedArchive)) {
|
||||
try { await fs.promises.unlink(part); } catch { /* ignore */ }
|
||||
}
|
||||
if (options.cleanupMode !== "none") {
|
||||
await cleanupArchives([nestedArchive], options.cleanupMode);
|
||||
}
|
||||
} catch (nestedErr) {
|
||||
const errText = String(nestedErr);
|
||||
@ -2150,7 +2569,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
|
||||
}
|
||||
|
||||
if (extracted > 0) {
|
||||
const hasOutputAfter = await hasAnyEntries(options.targetDir);
|
||||
const hasOutputAfter = await hasAnyFilesRecursive(options.targetDir);
|
||||
const hadResumeProgress = resumeCompletedAtStart > 0;
|
||||
if (!hasOutputAfter && conflictMode !== "skip" && !hadResumeProgress) {
|
||||
lastError = "Keine entpackten Dateien erkannt";
|
||||
|
||||
@@ -8,12 +8,19 @@ const LOG_BUFFER_LIMIT_CHARS = 1_000_000;
const LOG_MAX_FILE_BYTES = 10 * 1024 * 1024;
const rotateCheckAtByFile = new Map<string, number>();

type LogListener = (line: string) => void;
let logListener: LogListener | null = null;

let pendingLines: string[] = [];
let pendingChars = 0;
let flushTimer: NodeJS.Timeout | null = null;
let flushInFlight = false;
let exitHookAttached = false;

export function setLogListener(listener: LogListener | null): void {
logListener = listener;
}

export function configureLogger(baseDir: string): void {
logFilePath = path.join(baseDir, "rd_downloader.log");
const cwdLogPath = path.resolve(process.cwd(), "rd_downloader.log");
@@ -188,6 +195,10 @@ function write(level: "INFO" | "WARN" | "ERROR", message: string): void {
pendingLines.push(line);
pendingChars += line.length;

if (logListener) {
try { logListener(line); } catch { /* ignore */ }
}

while (pendingChars > LOG_BUFFER_LIMIT_CHARS && pendingLines.length > 1) {
const removed = pendingLines.shift();
if (!removed) {

@@ -7,6 +7,7 @@ import { IPC_CHANNELS } from "../shared/ipc";
import { getLogFilePath, logger } from "./logger";
import { APP_NAME } from "./constants";
import { extractHttpLinksFromText } from "./utils";
import { cleanupStaleSubstDrives, shutdownDaemon } from "./extractor";

/* ── IPC validation helpers ────────────────────────────────────── */
function validateString(value: unknown, name: string): string {
@@ -50,6 +51,7 @@ process.on("unhandledRejection", (reason) => {
let mainWindow: BrowserWindow | null = null;
let tray: Tray | null = null;
let clipboardTimer: ReturnType<typeof setInterval> | null = null;
let updateQuitTimer: ReturnType<typeof setTimeout> | null = null;
let lastClipboardText = "";
const controller = new AppController();
const CLIPBOARD_MAX_TEXT_CHARS = 50_000;
@@ -80,7 +82,7 @@ function createWindow(): BrowserWindow {
responseHeaders: {
...details.responseHeaders,
"Content-Security-Policy": [
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu"
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu https://git.24-music.de https://ddownload.com https://ddl.to"
]
}
});
@@ -129,7 +131,7 @@ function createTray(): void {
const contextMenu = Menu.buildFromTemplate([
{ label: "Anzeigen", click: () => { mainWindow?.show(); mainWindow?.focus(); } },
{ type: "separator" },
{ label: "Start", click: () => { controller.start(); } },
{ label: "Start", click: () => { void controller.start().catch((err) => logger.warn(`Tray Start Fehler: ${String(err)}`)); } },
{ label: "Stop", click: () => { controller.stop(); } },
{ type: "separator" },
{ label: "Beenden", click: () => { app.quit(); } }
@@ -187,7 +189,12 @@ function startClipboardWatcher(): void {
}
lastClipboardText = normalizeClipboardText(clipboard.readText());
clipboardTimer = setInterval(() => {
const text = normalizeClipboardText(clipboard.readText());
let text: string;
try {
text = normalizeClipboardText(clipboard.readText());
} catch {
return;
}
if (text === lastClipboardText || !text.trim()) {
return;
}
@@ -236,9 +243,9 @@ function registerIpcHandlers(): void {
mainWindow.webContents.send(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, progress);
});
if (result.started) {
setTimeout(() => {
updateQuitTimer = setTimeout(() => {
app.quit();
}, 800);
}, 2500);
}
return result;
});
@@ -289,8 +296,12 @@ function registerIpcHandlers(): void {
ipcMain.handle(IPC_CHANNELS.CLEAR_ALL, () => controller.clearAll());
ipcMain.handle(IPC_CHANNELS.START, () => controller.start());
ipcMain.handle(IPC_CHANNELS.START_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
if (!Array.isArray(packageIds)) throw new Error("packageIds muss ein Array sein");
return controller.startPackages(packageIds);
validateStringArray(packageIds ?? [], "packageIds");
return controller.startPackages(packageIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.START_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.startItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.STOP, () => controller.stop());
ipcMain.handle(IPC_CHANNELS.TOGGLE_PAUSE, () => controller.togglePause());
@@ -326,13 +337,45 @@ function registerIpcHandlers(): void {
validateString(packageId, "packageId");
return controller.extractNow(packageId);
});
ipcMain.handle(IPC_CHANNELS.RESET_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.resetPackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.SET_PACKAGE_PRIORITY, (_event: IpcMainInvokeEvent, packageId: string, priority: string) => {
validateString(packageId, "packageId");
validateString(priority, "priority");
if (priority !== "high" && priority !== "normal" && priority !== "low") {
throw new Error("priority muss 'high', 'normal' oder 'low' sein");
}
return controller.setPackagePriority(packageId, priority);
});
ipcMain.handle(IPC_CHANNELS.SKIP_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.skipItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.RESET_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.resetItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.GET_HISTORY, () => controller.getHistory());
ipcMain.handle(IPC_CHANNELS.CLEAR_HISTORY, () => controller.clearHistory());
ipcMain.handle(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, (_event: IpcMainInvokeEvent, entryId: string) => {
validateString(entryId, "entryId");
return controller.removeHistoryEntry(entryId);
});
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, () => controller.exportQueue());
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, async () => {
const options = {
defaultPath: `rd-queue-export.json`,
filters: [{ name: "Queue Export", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportQueue();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.IMPORT_QUEUE, (_event: IpcMainInvokeEvent, json: string) => {
validateString(json, "json");
const bytes = Buffer.byteLength(json, "utf8");
@@ -396,6 +439,13 @@ function registerIpcHandlers(): void {
await shell.openPath(logPath);
});

ipcMain.handle(IPC_CHANNELS.OPEN_SESSION_LOG, async () => {
const logPath = controller.getSessionLogPath();
if (logPath) {
await shell.openPath(logPath);
}
});

ipcMain.handle(IPC_CHANNELS.IMPORT_BACKUP, async () => {
const options = {
properties: ["openFile"] as Array<"openFile">,
@@ -409,6 +459,11 @@ function registerIpcHandlers(): void {
return { restored: false, message: "Abgebrochen" };
}
const filePath = result.filePaths[0];
const stat = await fs.promises.stat(filePath);
const BACKUP_MAX_BYTES = 50 * 1024 * 1024;
if (stat.size > BACKUP_MAX_BYTES) {
return { restored: false, message: `Backup-Datei zu groß (max 50 MB, Datei hat ${(stat.size / 1024 / 1024).toFixed(1)} MB)` };
}
const json = await fs.promises.readFile(filePath, "utf8");
return controller.importBackup(json);
});
@@ -432,6 +487,7 @@ app.on("second-instance", () => {
});

app.whenReady().then(() => {
cleanupStaleSubstDrives();
registerIpcHandlers();
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
@@ -444,6 +500,9 @@ app.whenReady().then(() => {
bindMainWindowLifecycle(mainWindow);
}
});
}).catch((error) => {
console.error("App startup failed:", error);
app.quit();
});

app.on("window-all-closed", () => {
@@ -453,8 +512,10 @@ app.on("window-all-closed", () => {
});

app.on("before-quit", () => {
if (updateQuitTimer) { clearTimeout(updateQuitTimer); updateQuitTimer = null; }
stopClipboardWatcher();
destroyTray();
shutdownDaemon();
try {
controller.shutdown();
} catch (error) {

@@ -228,22 +228,23 @@ export class MegaWebFallback {
}

public async unrestrict(link: string, signal?: AbortSignal): Promise<UnrestrictedLink | null> {
const overallSignal = withTimeoutSignal(signal, 180000);
return this.runExclusive(async () => {
throwIfAborted(signal);
throwIfAborted(overallSignal);
const creds = this.getCredentials();
if (!creds.login.trim() || !creds.password.trim()) {
return null;
}

if (!this.cookie || Date.now() - this.cookieSetAt > 20 * 60 * 1000) {
await this.login(creds.login, creds.password, signal);
await this.login(creds.login, creds.password, overallSignal);
}

const generated = await this.generate(link, signal);
const generated = await this.generate(link, overallSignal);
if (!generated) {
this.cookie = "";
await this.login(creds.login, creds.password, signal);
const retry = await this.generate(link, signal);
await this.login(creds.login, creds.password, overallSignal);
const retry = await this.generate(link, overallSignal);
if (!retry) {
return null;
}
@@ -261,7 +262,7 @@ export class MegaWebFallback {
fileSize: null,
retriesUsed: 0
};
}, signal);
}, overallSignal);
}

public invalidateSession(): void {

@@ -8,6 +8,7 @@ export interface UnrestrictedLink {
directUrl: string;
fileSize: number | null;
retriesUsed: number;
skipTlsVerify?: boolean;
}

function shouldRetryStatus(status: number): boolean {
@@ -62,7 +63,8 @@ function isRetryableErrorText(text: string): boolean {
|| lower.includes("aborted")
|| lower.includes("econnreset")
|| lower.includes("enotfound")
|| lower.includes("etimedout");
|| lower.includes("etimedout")
|| lower.includes("html statt json");
}

function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
@@ -77,6 +79,11 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
await sleep(ms);
return;
}
// Check before entering the Promise constructor to avoid a race where the timer
// resolves before the aborted check runs (especially when ms=0).
if (signal.aborted) {
throw new Error("aborted");
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
@@ -93,10 +100,6 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
reject(new Error("aborted"));
};

if (signal.aborted) {
onAbort();
return;
}
signal.addEventListener("abort", onAbort, { once: true });
});
}
@@ -165,6 +168,15 @@ export class RealDebridClient {
if (!directUrl) {
throw new Error("Unrestrict ohne Download-URL");
}
try {
const parsedUrl = new URL(directUrl);
if (parsedUrl.protocol !== "https:" && parsedUrl.protocol !== "http:") {
throw new Error(`Ungültiges Download-URL-Protokoll (${parsedUrl.protocol})`);
}
} catch (urlError) {
if (urlError instanceof Error && urlError.message.includes("Protokoll")) throw urlError;
throw new Error("Real-Debrid Antwort enthält keine gültige Download-URL");
}

const fileName = String(payload.filename || "download.bin").trim() || "download.bin";
const fileSizeRaw = Number(payload.filesize ?? NaN);
@@ -176,7 +188,7 @@ export class RealDebridClient {
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || /aborted/i.test(lastError)) {
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -186,6 +198,6 @@ export class RealDebridClient {
}
}

throw new Error(lastError || "Unrestrict fehlgeschlagen");
throw new Error(String(lastError || "Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}

128	src/main/session-log.ts	Normal file
@@ -0,0 +1,128 @@
import fs from "node:fs";
import path from "node:path";
import { setLogListener } from "./logger";

const SESSION_LOG_FLUSH_INTERVAL_MS = 200;

let sessionLogPath: string | null = null;
let sessionLogsDir: string | null = null;
let pendingLines: string[] = [];
let flushTimer: NodeJS.Timeout | null = null;

function formatTimestamp(): string {
const now = new Date();
const y = now.getFullYear();
const mo = String(now.getMonth() + 1).padStart(2, "0");
const d = String(now.getDate()).padStart(2, "0");
const h = String(now.getHours()).padStart(2, "0");
const mi = String(now.getMinutes()).padStart(2, "0");
const s = String(now.getSeconds()).padStart(2, "0");
return `${y}-${mo}-${d}_${h}-${mi}-${s}`;
}

function flushPending(): void {
if (pendingLines.length === 0 || !sessionLogPath) {
return;
}
const chunk = pendingLines.join("");
pendingLines = [];
try {
fs.appendFileSync(sessionLogPath, chunk, "utf8");
} catch {
// ignore write errors
}
}

function scheduleFlush(): void {
if (flushTimer) {
return;
}
flushTimer = setTimeout(() => {
flushTimer = null;
flushPending();
}, SESSION_LOG_FLUSH_INTERVAL_MS);
}

function appendToSessionLog(line: string): void {
if (!sessionLogPath) {
return;
}
pendingLines.push(line);
scheduleFlush();
}

async function cleanupOldSessionLogs(dir: string, maxAgeDays: number): Promise<void> {
try {
const files = await fs.promises.readdir(dir);
const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
for (const file of files) {
if (!file.startsWith("session_") || !file.endsWith(".txt")) {
continue;
}
const filePath = path.join(dir, file);
try {
const stat = await fs.promises.stat(filePath);
if (stat.mtimeMs < cutoff) {
await fs.promises.unlink(filePath);
}
} catch {
// ignore - file may be locked
}
}
} catch {
// ignore - dir may not exist
}
}

export function initSessionLog(baseDir: string): void {
sessionLogsDir = path.join(baseDir, "session-logs");
try {
fs.mkdirSync(sessionLogsDir, { recursive: true });
} catch {
sessionLogsDir = null;
return;
}

const timestamp = formatTimestamp();
sessionLogPath = path.join(sessionLogsDir, `session_${timestamp}.txt`);

const isoTimestamp = new Date().toISOString();
try {
fs.writeFileSync(sessionLogPath, `=== Session gestartet: ${isoTimestamp} ===\n`, "utf8");
} catch {
sessionLogPath = null;
return;
}

setLogListener((line) => appendToSessionLog(line));

void cleanupOldSessionLogs(sessionLogsDir, 7);
}

export function getSessionLogPath(): string | null {
return sessionLogPath;
}

export function shutdownSessionLog(): void {
if (!sessionLogPath) {
return;
}

// Flush any pending lines
if (flushTimer) {
clearTimeout(flushTimer);
flushTimer = null;
}
flushPending();

// Write closing line
const isoTimestamp = new Date().toISOString();
try {
fs.appendFileSync(sessionLogPath, `=== Session beendet: ${isoTimestamp} ===\n`, "utf8");
} catch {
// ignore
}

setLogListener(null);
sessionLogPath = null;
}

@ -1,21 +1,24 @@
|
||||
import fs from "node:fs";
|
||||
import fsp from "node:fs/promises";
|
||||
import path from "node:path";
|
||||
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, HistoryEntry, PackageEntry, SessionState } from "../shared/types";
|
||||
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, HistoryEntry, PackageEntry, PackagePriority, SessionState } from "../shared/types";
|
||||
import { defaultSettings } from "./constants";
|
||||
import { logger } from "./logger";
|
||||
|
||||
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
|
||||
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
|
||||
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload"]);
|
||||
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload"]);
|
||||
const VALID_CLEANUP_MODES = new Set(["none", "trash", "delete"]);
|
||||
const VALID_CONFLICT_MODES = new Set(["overwrite", "skip", "rename", "ask"]);
|
||||
const VALID_FINISHED_POLICIES = new Set(["never", "immediate", "on_start", "package_done"]);
|
||||
const VALID_SPEED_MODES = new Set(["global", "per_download"]);
|
||||
const VALID_THEMES = new Set(["dark", "light"]);
|
||||
const VALID_EXTRACT_CPU_PRIORITIES = new Set(["high", "middle", "low"]);
|
||||
const VALID_PACKAGE_PRIORITIES = new Set<string>(["high", "normal", "low"]);
|
||||
const VALID_DOWNLOAD_STATUSES = new Set<DownloadStatus>([
|
||||
"queued", "validating", "downloading", "paused", "reconnect_wait", "extracting", "integrity_check", "completed", "failed", "cancelled"
|
||||
]);
|
||||
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
|
||||
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload"]);
|
||||
const VALID_ONLINE_STATUSES = new Set(["online", "offline", "checking"]);
|
||||
|
||||
function asText(value: unknown): string {
|
||||
return String(value ?? "").trim();
|
||||
@ -65,6 +68,41 @@ function normalizeAbsoluteDir(value: unknown, fallback: string): string {
|
||||
return path.resolve(text);
|
||||
}
|
||||
|
||||
const DEFAULT_COLUMN_ORDER = ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"];
|
||||
const ALL_VALID_COLUMNS = new Set([...DEFAULT_COLUMN_ORDER, "added"]);
|
||||
|
||||
function normalizeColumnOrder(raw: unknown): string[] {
|
||||
if (!Array.isArray(raw) || raw.length === 0) {
|
||||
return [...DEFAULT_COLUMN_ORDER];
|
||||
}
|
||||
const valid = ALL_VALID_COLUMNS;
|
||||
const seen = new Set<string>();
|
||||
const result: string[] = [];
|
||||
for (const col of raw) {
|
||||
if (typeof col === "string" && valid.has(col) && !seen.has(col)) {
|
||||
seen.add(col);
|
||||
result.push(col);
|
||||
}
|
||||
}
|
||||
// "name" is mandatory — ensure it's always present
|
||||
if (!seen.has("name")) {
|
||||
result.unshift("name");
|
||||
}
|
||||
return result;
|
||||
}
|
||||
|
||||
const DEPRECATED_UPDATE_REPOS = new Set([
|
||||
"sucukdeluxe/real-debrid-downloader"
|
||||
]);
|
||||
|
||||
function migrateUpdateRepo(raw: string, fallback: string): string {
|
||||
const trimmed = raw.trim();
|
||||
if (!trimmed || DEPRECATED_UPDATE_REPOS.has(trimmed.toLowerCase())) {
|
||||
return fallback;
|
||||
}
|
||||
return trimmed;
|
||||
}
|
||||
|
||||
export function normalizeSettings(settings: AppSettings): AppSettings {
|
||||
const defaults = defaultSettings();
|
||||
const normalized: AppSettings = {
|
||||
@ -73,7 +111,9 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
|
||||
megaPassword: asText(settings.megaPassword),
|
||||
bestToken: asText(settings.bestToken),
|
||||
allDebridToken: asText(settings.allDebridToken),
|
||||
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n/g, "\n"),
|
||||
ddownloadLogin: asText(settings.ddownloadLogin),
|
||||
ddownloadPassword: asText(settings.ddownloadPassword),
|
||||
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n|\r/g, "\n"),
|
||||
rememberToken: Boolean(settings.rememberToken),
|
||||
providerPrimary: settings.providerPrimary,
|
||||
providerSecondary: settings.providerSecondary,
|
||||
@ -104,7 +144,7 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
|
||||
speedLimitKbps: clampNumber(settings.speedLimitKbps, defaults.speedLimitKbps, 0, 500000),
|
||||
speedLimitMode: settings.speedLimitMode,
|
||||
autoUpdateCheck: Boolean(settings.autoUpdateCheck),
|
||||
updateRepo: asText(settings.updateRepo) || defaults.updateRepo,
|
||||
updateRepo: migrateUpdateRepo(asText(settings.updateRepo), defaults.updateRepo),
|
||||
clipboardWatch: Boolean(settings.clipboardWatch),
|
||||
minimizeToTray: Boolean(settings.minimizeToTray),
|
||||
collapseNewPackages: settings.collapseNewPackages !== undefined ? Boolean(settings.collapseNewPackages) : defaults.collapseNewPackages,
|
||||
@ -112,7 +152,10 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
|
||||
confirmDeleteSelection: settings.confirmDeleteSelection !== undefined ? Boolean(settings.confirmDeleteSelection) : defaults.confirmDeleteSelection,
|
||||
totalDownloadedAllTime: typeof settings.totalDownloadedAllTime === "number" && settings.totalDownloadedAllTime >= 0 ? settings.totalDownloadedAllTime : defaults.totalDownloadedAllTime,
|
||||
theme: VALID_THEMES.has(settings.theme) ? settings.theme : defaults.theme,
|
||||
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules)
|
||||
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules),
|
||||
columnOrder: normalizeColumnOrder(settings.columnOrder),
|
||||
extractCpuPriority: settings.extractCpuPriority,
|
||||
autoExtractWhenStopped: settings.autoExtractWhenStopped !== undefined ? Boolean(settings.autoExtractWhenStopped) : defaults.autoExtractWhenStopped
|
||||
};
|
||||
|
||||
if (!VALID_PRIMARY_PROVIDERS.has(normalized.providerPrimary)) {
|
||||
@ -142,6 +185,9 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
|
||||
if (!VALID_SPEED_MODES.has(normalized.speedLimitMode)) {
|
||||
normalized.speedLimitMode = defaults.speedLimitMode;
|
||||
}
|
||||
if (!VALID_EXTRACT_CPU_PRIORITIES.has(normalized.extractCpuPriority)) {
|
||||
normalized.extractCpuPriority = defaults.extractCpuPriority;
|
||||
}
|
||||
|
||||
return normalized;
|
||||
}
|
||||
@ -157,7 +203,8 @@ function sanitizeCredentialPersistence(settings: AppSettings): AppSettings {
|
||||
megaPassword: "",
|
||||
bestToken: "",
|
||||
allDebridToken: "",
|
||||
archivePasswordList: ""
|
||||
ddownloadLogin: "",
|
||||
ddownloadPassword: ""
|
||||
};
|
||||
}
|
||||
|
||||
@ -201,7 +248,7 @@ function readSettingsFile(filePath: string): AppSettings | null {
|
||||
}
|
||||
}
|
||||
|
||||
function normalizeLoadedSession(raw: unknown): SessionState {
export function normalizeLoadedSession(raw: unknown): SessionState {
const fallback = emptySession();
const parsed = asRecord(raw);
if (!parsed) {
@@ -227,6 +274,8 @@ function normalizeLoadedSession(raw: unknown): SessionState {
const status: DownloadStatus = VALID_DOWNLOAD_STATUSES.has(statusRaw) ? statusRaw : "queued";
const providerRaw = asText(item.provider) as DebridProvider;

const onlineStatusRaw = asText(item.onlineStatus);

itemsById[id] = {
id,
packageId,
@@ -244,6 +293,7 @@ function normalizeLoadedSession(raw: unknown): SessionState {
attempts: clampNumber(item.attempts, 0, 0, 10_000),
lastError: asText(item.lastError),
fullStatus: asText(item.fullStatus),
onlineStatus: VALID_ONLINE_STATUSES.has(onlineStatusRaw) ? onlineStatusRaw as "online" | "offline" | "checking" : undefined,
createdAt: clampNumber(item.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(item.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
@@ -274,6 +324,7 @@ function normalizeLoadedSession(raw: unknown): SessionState {
.filter((value) => value.length > 0),
cancelled: Boolean(pkg.cancelled),
enabled: pkg.enabled === undefined ? true : Boolean(pkg.enabled),
priority: VALID_PACKAGE_PRIORITIES.has(asText(pkg.priority)) ? asText(pkg.priority) as PackagePriority : "normal",
createdAt: clampNumber(pkg.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(pkg.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
@@ -304,7 +355,8 @@ function normalizeLoadedSession(raw: unknown): SessionState {
return true;
});
for (const packageId of Object.keys(packagesById)) {
if (!packageOrder.includes(packageId)) {
if (!seenOrder.has(packageId)) {
seenOrder.add(packageId);
packageOrder.push(packageId);
}
}
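The switch from `packageOrder.includes(packageId)` to a `seenOrder` set makes each membership check O(1) instead of an O(n) array scan. A standalone sketch of the pattern (the function name `completePackageOrder` is hypothetical, not from the diff):

```typescript
// Append any package id missing from the order, using a Set so each
// membership check is O(1) rather than an Array.includes scan per id.
function completePackageOrder(order: string[], packageIds: string[]): string[] {
  const seenOrder = new Set(order);
  const next = [...order];
  for (const packageId of packageIds) {
    if (!seenOrder.has(packageId)) {
      seenOrder.add(packageId);
      next.push(packageId);
    }
  }
  return next;
}

const completed = completePackageOrder(["b"], ["a", "b", "c"]);
// ids already present keep their position; missing ones are appended
```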
@@ -376,7 +428,7 @@ function sessionBackupPath(sessionFile: string): string {
return `${sessionFile}.bak`;
}

function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
export function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
// Reset transient fields that may be stale from a previous crash
const ACTIVE_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const item of Object.values(session.items)) {
@@ -388,6 +440,19 @@ function normalizeLoadedSessionTransientFields(session: SessionState): SessionSt
item.speedBps = 0;
}

// Reset package-level active statuses to queued (mirrors item reset above)
const ACTIVE_PKG_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const pkg of Object.values(session.packages)) {
if (ACTIVE_PKG_STATUSES.has(pkg.status)) {
pkg.status = "queued";
}
pkg.postProcessLabel = undefined;
}

// Clear stale session-level running/paused flags
session.running = false;
session.paused = false;

return session;
}

@@ -413,12 +478,17 @@ export function saveSettings(paths: StoragePaths, settings: AppSettings): void {
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
const tempPath = `${paths.configFile}.tmp`;
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}

let asyncSettingsSaveRunning = false;
let asyncSettingsSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let asyncSettingsSaveQueued: { paths: StoragePaths; settings: AppSettings } | null = null;

async function writeSettingsPayload(paths: StoragePaths, payload: string): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
@@ -432,6 +502,7 @@ async function writeSettingsPayload(paths: StoragePaths, payload: string): Promi
await fsp.copyFile(tempPath, paths.configFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
@@ -441,7 +512,7 @@ export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettin
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
if (asyncSettingsSaveRunning) {
asyncSettingsSaveQueued = { paths, payload };
asyncSettingsSaveQueued = { paths, settings };
return;
}
asyncSettingsSaveRunning = true;
@@ -454,7 +525,7 @@ export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettin
if (asyncSettingsSaveQueued) {
const queued = asyncSettingsSaveQueued;
asyncSettingsSaveQueued = null;
void writeSettingsPayload(queued.paths, queued.payload).catch((err) => logger.error(`Async Settings-Save (queued) fehlgeschlagen: ${String(err)}`));
void saveSettingsAsync(queued.paths, queued.settings);
}
}
}
@@ -507,6 +578,7 @@ export function loadSession(paths: StoragePaths): SessionState {
}

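The `saveSettings` change above wraps the temp-file write in a try/catch so a failed write or rename no longer leaves a stray `.tmp` behind. The pattern in isolation (the helper name `atomicWriteSync` is hypothetical, and a plain `renameSync` stands in for the project's EXDEV-aware fallback):

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Write to a temp file, then rename over the target. On any failure,
// remove the temp file so no partial state survives on disk.
function atomicWriteSync(targetPath: string, payload: string): void {
  const tempPath = `${targetPath}.tmp`;
  try {
    fs.writeFileSync(tempPath, payload, "utf8");
    fs.renameSync(tempPath, targetPath); // same directory, so no EXDEV here
  } catch (error) {
    try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
    throw error;
  }
}

const target = path.join(os.tmpdir(), `settings-demo-${process.pid}.json`);
atomicWriteSync(target, JSON.stringify({ theme: "dark" }));
const written = fs.readFileSync(target, "utf8");
const tempLeftOver = fs.existsSync(`${target}.tmp`);
fs.rmSync(target, { force: true });
```

Because the rename is atomic on the same filesystem, readers only ever see the old file or the complete new one.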
export function saveSession(paths: StoragePaths, session: SessionState): void {
syncSaveGeneration += 1;
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.sessionFile)) {
try {
@@ -517,25 +589,41 @@
}
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}

let asyncSaveRunning = false;
let asyncSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let syncSaveGeneration = 0;

async function writeSessionPayload(paths: StoragePaths, payload: string): Promise<void> {
async function writeSessionPayload(paths: StoragePaths, payload: string, generation: number): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.sessionFile, sessionBackupPath(paths.sessionFile)).catch(() => {});
const tempPath = sessionTempPath(paths.sessionFile, "async");
await fsp.writeFile(tempPath, payload, "utf8");
// If a synchronous save occurred after this async save started, discard the stale write
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
try {
await fsp.rename(tempPath, paths.sessionFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
await fsp.copyFile(tempPath, paths.sessionFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
@@ -547,8 +635,9 @@ async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Pr
return;
}
asyncSaveRunning = true;
const gen = syncSaveGeneration;
try {
await writeSessionPayload(paths, payload);
await writeSessionPayload(paths, payload, gen);
} catch (error) {
logger.error(`Async Session-Save fehlgeschlagen: ${String(error)}`);
} finally {
@@ -561,6 +650,12 @@
}
}

export function cancelPendingAsyncSaves(): void {
asyncSaveQueued = null;
asyncSettingsSaveQueued = null;
syncSaveGeneration += 1;
}

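The `syncSaveGeneration` counter introduced here lets an in-flight async save detect that a newer synchronous save already ran, and discard its own stale payload instead of clobbering the fresher file. A toy model of that race, using an in-memory "disk" instead of real files:

```typescript
// Toy model: the async writer snapshots the generation before starting;
// a sync save bumps the counter, so the slower async write is discarded.
let generation = 0;
let disk = "";

function saveSync(payload: string): void {
  generation += 1;
  disk = payload;
}

async function saveAsync(payload: string): Promise<void> {
  const gen = generation;
  await new Promise((resolve) => setTimeout(resolve, 10)); // simulated slow write
  if (gen < generation) {
    return; // a sync save happened meanwhile: drop the stale payload
  }
  disk = payload;
}

const pending = saveAsync("old-session");
saveSync("new-session"); // runs while the async save is still "writing"
await pending;
```

Without the generation check, the async write would finish last and silently revert the session to the older payload.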
export async function saveSessionAsync(paths: StoragePaths, session: SessionState): Promise<void> {
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
await saveSessionPayloadAsync(paths, payload);
@@ -586,7 +681,8 @@ function normalizeHistoryEntry(raw: unknown, index: number): HistoryEntry | null
completedAt: clampNumber(entry.completedAt, Date.now(), 0, Number.MAX_SAFE_INTEGER),
durationSeconds: clampNumber(entry.durationSeconds, 0, 0, Number.MAX_SAFE_INTEGER),
status: entry.status === "deleted" ? "deleted" : "completed",
outputDir: asText(entry.outputDir)
outputDir: asText(entry.outputDir),
urls: Array.isArray(entry.urls) ? (entry.urls as unknown[]).map(String).filter(Boolean) : undefined
};
}

@@ -616,8 +712,13 @@ export function saveHistory(paths: StoragePaths, entries: HistoryEntry[]): void
const trimmed = entries.slice(0, MAX_HISTORY_ENTRIES);
const payload = JSON.stringify(trimmed, null, 2);
const tempPath = `${paths.historyFile}.tmp`;
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}

export function addHistoryEntry(paths: StoragePaths, entry: HistoryEntry): HistoryEntry[] {

@@ -14,8 +14,32 @@ const DOWNLOAD_BODY_IDLE_TIMEOUT_MS = 45000;
const RETRIES_PER_CANDIDATE = 3;
const RETRY_DELAY_MS = 1500;
const UPDATE_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;
const UPDATE_WEB_BASE = "https://codeberg.org";
const UPDATE_API_BASE = "https://codeberg.org/api/v1";
type UpdateSource = {
name: string;
webBase: string;
apiBase: string;
};

const UPDATE_SOURCES: UpdateSource[] = [
{
name: "git24",
webBase: "https://git.24-music.de",
apiBase: "https://git.24-music.de/api/v1"
},
{
name: "codeberg",
webBase: "https://codeberg.org",
apiBase: "https://codeberg.org/api/v1"
},
{
name: "github",
webBase: "https://github.com",
apiBase: "https://api.github.com"
}
];
const PRIMARY_UPDATE_SOURCE = UPDATE_SOURCES[0];
const UPDATE_WEB_BASE = PRIMARY_UPDATE_SOURCE.webBase;
const UPDATE_API_BASE = PRIMARY_UPDATE_SOURCE.apiBase;

let activeUpdateAbortController: AbortController | null = null;

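With `UPDATE_SOURCES` now an ordered list (git24 mirror first, then Codeberg, then GitHub), an update check can walk the mirrors until one answers. A hedged sketch of such a fallback loop; the `checkWithFallback` helper is an assumption, since this hunk only shows the source list:

```typescript
type UpdateSource = { name: string; webBase: string; apiBase: string };

const UPDATE_SOURCES: UpdateSource[] = [
  { name: "git24", webBase: "https://git.24-music.de", apiBase: "https://git.24-music.de/api/v1" },
  { name: "codeberg", webBase: "https://codeberg.org", apiBase: "https://codeberg.org/api/v1" },
  { name: "github", webBase: "https://github.com", apiBase: "https://api.github.com" }
];

// Try each source in order; return the first successful result,
// rethrowing the last error if every mirror fails.
async function checkWithFallback<T>(check: (source: UpdateSource) => Promise<T>): Promise<T> {
  let lastError: unknown = new Error("no update sources configured");
  for (const source of UPDATE_SOURCES) {
    try {
      return await check(source);
    } catch (error) {
      lastError = error; // remember and fall through to the next mirror
    }
  }
  throw lastError;
}

// Simulated check: the first mirror "fails", the second answers.
const picked = await checkWithFallback(async (source) => {
  if (source.name === "git24") throw new Error("mirror down");
  return source.name;
});
```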
@@ -57,9 +81,9 @@ export function normalizeUpdateRepo(repo: string): string {

const normalizeParts = (input: string): string => {
const cleaned = input
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com):/i, "")
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com|git\.24-music\.de):/i, "")
.replace(/\.git$/i, "")
.replace(/^\/+|\/+$/g, "");
const parts = cleaned.split("/").filter(Boolean);
@@ -76,7 +100,13 @@ export function normalizeUpdateRepo(repo: string): string {
try {
const url = new URL(raw);
const host = url.hostname.toLowerCase();
if (host === "codeberg.org" || host === "www.codeberg.org" || host === "github.com" || host === "www.github.com") {
if (
host === "codeberg.org"
|| host === "www.codeberg.org"
|| host === "github.com"
|| host === "www.github.com"
|| host === "git.24-music.de"
) {
const normalized = normalizeParts(url.pathname);
if (normalized) {
return normalized;
@@ -306,6 +336,8 @@ function parseReleasePayload(payload: Record<string, unknown>, fallback: UpdateC
const releaseUrl = String(payload.html_url || fallback.releaseUrl);
const setup = pickSetupAsset(readReleaseAssets(payload));

const body = typeof payload.body === "string" ? payload.body.trim() : "";

return {
updateAvailable: isRemoteNewer(APP_VERSION, latestVersion),
currentVersion: APP_VERSION,
@@ -314,7 +346,8 @@
releaseUrl,
setupAssetUrl: setup?.browser_download_url || "",
setupAssetName: setup?.name || "",
setupAssetDigest: setup?.digest || ""
setupAssetDigest: setup?.digest || "",
releaseNotes: body || undefined
};
}

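The widened regexes now also strip the `git.24-music.de` host, so HTTPS, bare-host, and SSH remote forms all collapse to `owner/repo`. A standalone version of the `normalizeParts` steps shown in the hunk; the final return shape is an assumption, since the hunk cuts off after the split:

```typescript
// Standalone sketch of the normalizeParts cleanup chain from the hunk.
// The `owner/repo` return is assumed; the hunk ends after the split.
function normalizeParts(input: string): string {
  const cleaned = input
    .replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
    .replace(/^(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
    .replace(/^git@(?:codeberg\.org|github\.com|git\.24-music\.de):/i, "")
    .replace(/\.git$/i, "")
    .replace(/^\/+|\/+$/g, "");
  const parts = cleaned.split("/").filter(Boolean);
  return parts.length >= 2 ? `${parts[0]}/${parts[1]}` : "";
}

const fromHttps = normalizeParts("https://git.24-music.de/owner/repo.git");
const fromSsh = normalizeParts("git@codeberg.org:owner/repo.git");
```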
@@ -761,7 +794,8 @@ async function downloadFile(url: string, targetPath: string, onProgress?: Update
};

const reader = response.body.getReader();
const chunks: Buffer[] = [];
const tempPath = targetPath + ".tmp";
const writeStream = fs.createWriteStream(tempPath);

try {
resetIdleTimer();
@@ -775,27 +809,39 @@
break;
}
const buf = Buffer.from(value.buffer, value.byteOffset, value.byteLength);
chunks.push(buf);
if (!writeStream.write(buf)) {
await new Promise<void>((resolve) => writeStream.once("drain", resolve));
}
downloadedBytes += buf.byteLength;
resetIdleTimer();
emitDownloadProgress(false);
}
} catch (error) {
writeStream.destroy();
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw error;
} finally {
clearIdleTimer();
}

await new Promise<void>((resolve, reject) => {
writeStream.end(() => resolve());
writeStream.on("error", reject);
});

if (idleTimedOut) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download Body Timeout nach ${Math.ceil(idleTimeoutMs / 1000)}s`);
}

const fileBuffer = Buffer.concat(chunks);
if (totalBytes && fileBuffer.byteLength !== totalBytes) {
throw new Error(`Update Download unvollständig (${fileBuffer.byteLength} / ${totalBytes} Bytes)`);
if (totalBytes && downloadedBytes !== totalBytes) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download unvollständig (${downloadedBytes} / ${totalBytes} Bytes)`);
}

await fs.promises.writeFile(targetPath, fileBuffer);
await fs.promises.rename(tempPath, targetPath);
emitDownloadProgress(true);
logger.info(`Update-Download abgeschlossen: ${targetPath} (${fileBuffer.byteLength} Bytes)`);
logger.info(`Update-Download abgeschlossen: ${targetPath} (${downloadedBytes} Bytes)`);

return { expectedBytes: totalBytes };
}

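The rewritten `downloadFile` no longer accumulates the whole payload in memory: each chunk is streamed into a `.tmp` file, honoring `write()` backpressure via the `drain` event, and the temp file is renamed into place only after the byte count checks out. The core loop can be sketched like this, writing locally generated chunks in place of a network body:

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

const targetPath = path.join(os.tmpdir(), `download-demo-${process.pid}.bin`);
const tempPath = targetPath + ".tmp";
const writeStream = fs.createWriteStream(tempPath);

let downloadedBytes = 0;
for (let i = 0; i < 100; i += 1) {
  const buf = Buffer.alloc(64 * 1024, i); // stand-in for one body chunk
  if (!writeStream.write(buf)) {
    // write() returned false: wait for 'drain' instead of queueing
    // unbounded buffers in memory.
    await new Promise<void>((resolve) => writeStream.once("drain", resolve));
  }
  downloadedBytes += buf.byteLength;
}
await new Promise<void>((resolve, reject) => {
  writeStream.end(() => resolve());
  writeStream.on("error", reject);
});
await fs.promises.rename(tempPath, targetPath); // publish only when complete
const size = (await fs.promises.stat(targetPath)).size;
await fs.promises.rm(targetPath, { force: true });
```

Peak memory stays bounded by the stream's buffer instead of growing with the download size.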
@@ -4,6 +4,7 @@ import {
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
@@ -38,7 +39,7 @@ const api: ElectronApi = {
reorderPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REORDER_PACKAGES, packageIds),
removeItem: (itemId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_ITEM, itemId),
togglePackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PACKAGE, packageId),
exportQueue: (): Promise<string> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
exportQueue: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
importQueue: (json: string): Promise<{ addedPackages: number; addedLinks: number }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_QUEUE, json),
toggleClipboard: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_CLIPBOARD),
pickFolder: (): Promise<string | null> => ipcRenderer.invoke(IPC_CHANNELS.PICK_FOLDER),
@@ -49,11 +50,17 @@ const api: ElectronApi = {
exportBackup: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_BACKUP),
importBackup: (): Promise<{ restored: boolean; message: string }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_BACKUP),
openLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_LOG),
openSessionLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_SESSION_LOG),
retryExtraction: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RETRY_EXTRACTION, packageId),
extractNow: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.EXTRACT_NOW, packageId),
resetPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_PACKAGE, packageId),
getHistory: (): Promise<HistoryEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_HISTORY),
clearHistory: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_HISTORY),
removeHistoryEntry: (entryId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, entryId),
setPackagePriority: (packageId: string, priority: PackagePriority): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SET_PACKAGE_PRIORITY, packageId, priority),
skipItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SKIP_ITEMS, itemIds),
resetItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_ITEMS, itemIds),
startItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_ITEMS, itemIds),
onStateUpdate: (callback: (snapshot: UiSnapshot) => void): (() => void) => {
const listener = (_event: unknown, snapshot: UiSnapshot): void => callback(snapshot);
ipcRenderer.on(IPC_CHANNELS.STATE_UPDATE, listener);

src/renderer/App.tsx (1237 lines changed): file diff suppressed because it is too large.

src/renderer/package-order.ts (new file, 25 lines):
@@ -0,0 +1,25 @@
import type { PackageEntry } from "../shared/types";

export function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
const fromIndex = order.indexOf(draggedPackageId);
const toIndex = order.indexOf(targetPackageId);
if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
return order;
}
const next = [...order];
const [dragged] = next.splice(fromIndex, 1);
const insertIndex = Math.max(0, Math.min(next.length, toIndex));
next.splice(insertIndex, 0, dragged);
return next;
}

export function sortPackageOrderByName(order: string[], packages: Record<string, PackageEntry>, descending: boolean): string[] {
const sorted = [...order];
sorted.sort((a, b) => {
const nameA = (packages[a]?.name ?? "").toLowerCase();
const nameB = (packages[b]?.name ?? "").toLowerCase();
const cmp = nameA.localeCompare(nameB, undefined, { numeric: true, sensitivity: "base" });
return descending ? -cmp : cmp;
});
return sorted;
}
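Extracting these helpers into `src/renderer/package-order.ts` makes them unit-testable without rendering `App.tsx`. `reorderPackageOrderByDrop` is a pure splice; for example, dragging `a` onto `c` in `[a, b, c]` yields `[b, c, a]`, and unknown ids leave the order untouched:

```typescript
// Same function body as the new src/renderer/package-order.ts above,
// shown standalone: pure array reordering, no DOM or state involved.
function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
  const fromIndex = order.indexOf(draggedPackageId);
  const toIndex = order.indexOf(targetPackageId);
  if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
    return order; // unknown ids or no-op drop: keep the original array
  }
  const next = [...order];
  const [dragged] = next.splice(fromIndex, 1);
  const insertIndex = Math.max(0, Math.min(next.length, toIndex));
  next.splice(insertIndex, 0, dragged);
  return next;
}

const moved = reorderPackageOrderByDrop(["a", "b", "c"], "a", "c");     // ["b", "c", "a"]
const untouched = reorderPackageOrderByDrop(["a", "b"], "missing", "b"); // ["a", "b"]
```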
@@ -344,6 +344,15 @@ body,
background: rgba(244, 63, 94, 0.1);
}

.ctrl-icon-btn.ctrl-move:not(:disabled) {
color: var(--accent);
}

.ctrl-icon-btn.ctrl-move:hover:not(:disabled) {
border-color: var(--accent);
background: color-mix(in srgb, var(--accent) 10%, transparent);
}

.ctrl-icon-btn.ctrl-speed.active {
color: #f59e0b;
border-color: rgba(245, 158, 11, 0.5);
@@ -577,7 +586,7 @@ body,

.pkg-column-header {
display: grid;
grid-template-columns: 1fr 90px 170px 140px 130px 180px 100px;
/* grid-template-columns set via inline style from columnOrder */
gap: 8px;
padding: 5px 12px;
background: var(--card);
@@ -593,8 +602,10 @@ body,
.pkg-column-header .pkg-col-size,
.pkg-column-header .pkg-col-hoster,
.pkg-column-header .pkg-col-account,
.pkg-column-header .pkg-col-prio,
.pkg-column-header .pkg-col-status,
.pkg-column-header .pkg-col-speed {
.pkg-column-header .pkg-col-speed,
.pkg-column-header .pkg-col-added {
text-align: center;
}

@@ -612,7 +623,7 @@ body,

.pkg-columns {
display: grid;
grid-template-columns: 1fr 90px 170px 140px 130px 180px 100px;
/* grid-template-columns set via inline style from columnOrder */
gap: 8px;
align-items: center;
min-width: 0;
@@ -636,8 +647,10 @@ body,
.pkg-columns .pkg-col-size,
.pkg-columns .pkg-col-hoster,
.pkg-columns .pkg-col-account,
.pkg-columns .pkg-col-prio,
.pkg-columns .pkg-col-status,
.pkg-columns .pkg-col-speed {
.pkg-columns .pkg-col-speed,
.pkg-columns .pkg-col-added {
font-size: 13px;
color: var(--muted);
overflow: hidden;
@@ -860,7 +873,7 @@ body,
.status-bar {
display: flex;
flex-wrap: wrap;
gap: 16px;
gap: 8px 16px;
align-items: center;
color: var(--muted);
font-size: 12px;
@@ -871,6 +884,16 @@ body,
margin: 0 -14px -10px;
}

.footer-spacer {
flex: 1;
}

.footer-btn {
font-size: 11px;
padding: 2px 8px;
min-height: 0;
}

.settings-shell {
display: grid;
grid-template-rows: auto 1fr;
@@ -1284,7 +1307,7 @@ td {

.item-row {
display: grid;
grid-template-columns: 1fr 90px 170px 140px 130px 180px 100px;
/* grid-template-columns set via inline style from columnOrder */
gap: 8px;
align-items: center;
margin: 0 -12px;
@@ -1315,6 +1338,89 @@ td {
padding-left: 32px;
}

.link-status-dot {
display: inline-block;
width: 8px;
height: 8px;
border-radius: 50%;
margin-right: 6px;
flex-shrink: 0;
vertical-align: middle;
}
.link-status-dot.online {
background: #22c55e;
box-shadow: 0 0 4px #22c55e80;
}
.link-status-dot.offline {
background: #ef4444;
box-shadow: 0 0 4px #ef444480;
}
.link-status-dot.checking {
background: #f59e0b;
box-shadow: 0 0 4px #f59e0b80;
}

.prio-high {
color: #f59e0b !important;
font-weight: 700;
}

.prio-low {
color: #64748b !important;
}

.pkg-col-dragging {
opacity: 0.4;
}

.pkg-col-drop-target {
box-shadow: -2px 0 0 0 var(--accent);
}

.pkg-column-header .pkg-col {
cursor: grab;
}

.pkg-column-header .pkg-col.sortable {
cursor: pointer;
}

.ctx-menu-sub {
position: relative;
}

.ctx-menu-sub > .ctx-menu-item::after {
content: "";
}

.ctx-menu-sub-items {
display: none;
position: absolute;
left: 100%;
top: 0;
min-width: 120px;
background: var(--card);
border: 1px solid var(--border);
border-radius: 6px;
padding: 4px 0;
box-shadow: 0 4px 12px rgba(0,0,0,.3);
z-index: 1001;
}

.ctx-menu-sub:hover .ctx-menu-sub-items {
display: block;
}

.ctx-menu-active {
color: var(--accent) !important;
}

.ctx-menu-disabled {
opacity: 0.4;
cursor: not-allowed !important;
pointer-events: none;
}

.item-remove {
background: none;
border: none;
@@ -1533,6 +1639,7 @@ td {
border-radius: 12px;
padding: 10px 14px;
box-shadow: 0 16px 30px rgba(0, 0, 0, 0.35);
z-index: 50;
}

.ctx-menu {
@@ -1642,6 +1749,7 @@ td {
font-weight: 600;
pointer-events: none;
backdrop-filter: blur(2px);
z-index: 200;
}

.modal-backdrop {
@@ -1656,6 +1764,8 @@ td {

.modal-card {
width: min(560px, 100%);
max-height: calc(100vh - 40px);
overflow-y: auto;
border: 1px solid var(--border);
border-radius: 14px;
background: linear-gradient(180deg, color-mix(in srgb, var(--card) 98%, transparent), color-mix(in srgb, var(--surface) 98%, transparent));
@@ -1674,6 +1784,34 @@ td {
color: var(--muted);
}

.modal-details {
border: 1px solid var(--border);
border-radius: 6px;
padding: 0;
}
.modal-details summary {
padding: 6px 10px;
cursor: pointer;
font-size: 13px;
color: var(--muted);
user-select: none;
}
.modal-details summary:hover {
color: var(--text);
}
.modal-details pre {
margin: 0;
padding: 8px 10px;
border-top: 1px solid var(--border);
font-size: 12px;
line-height: 1.5;
white-space: pre-wrap;
word-break: break-word;
max-height: 260px;
overflow-y: auto;
color: var(--muted);
}

.modal-path {
font-size: 12px;
word-break: break-all;
@@ -1744,16 +1882,19 @@ td {
}

.pkg-columns,
.pkg-column-header {
grid-template-columns: 1fr;
.pkg-column-header,
.item-row {
grid-template-columns: 1fr !important;
}

.pkg-column-header .pkg-col-progress,
.pkg-column-header .pkg-col-size,
.pkg-column-header .pkg-col-hoster,
.pkg-column-header .pkg-col-account,
.pkg-column-header .pkg-col-prio,
.pkg-column-header .pkg-col-status,
.pkg-column-header .pkg-col-speed {
.pkg-column-header .pkg-col-speed,
.pkg-column-header .pkg-col-added {
display: none;
}

@@ -1761,8 +1902,10 @@ td {
.pkg-columns .pkg-col-size,
.pkg-columns .pkg-col-hoster,
.pkg-columns .pkg-col-account,
.pkg-columns .pkg-col-prio,
.pkg-columns .pkg-col-status,
.pkg-columns .pkg-col-speed {
.pkg-columns .pkg-col-speed,
.pkg-columns .pkg-col-added {
display: none;
}

@@ -33,9 +33,15 @@ export const IPC_CHANNELS = {
EXPORT_BACKUP: "app:export-backup",
IMPORT_BACKUP: "app:import-backup",
OPEN_LOG: "app:open-log",
OPEN_SESSION_LOG: "app:open-session-log",
RETRY_EXTRACTION: "queue:retry-extraction",
EXTRACT_NOW: "queue:extract-now",
RESET_PACKAGE: "queue:reset-package",
GET_HISTORY: "history:get",
CLEAR_HISTORY: "history:clear",
REMOVE_HISTORY_ENTRY: "history:remove-entry"
REMOVE_HISTORY_ENTRY: "history:remove-entry",
SET_PACKAGE_PRIORITY: "queue:set-package-priority",
SKIP_ITEMS: "queue:skip-items",
RESET_ITEMS: "queue:reset-items",
START_ITEMS: "queue:start-items"
} as const;

@@ -3,6 +3,7 @@ import type {
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
@@ -33,7 +34,7 @@ export interface ElectronApi {
reorderPackages: (packageIds: string[]) => Promise<void>;
removeItem: (itemId: string) => Promise<void>;
togglePackage: (packageId: string) => Promise<void>;
exportQueue: () => Promise<string>;
exportQueue: () => Promise<{ saved: boolean }>;
importQueue: (json: string) => Promise<{ addedPackages: number; addedLinks: number }>;
toggleClipboard: () => Promise<boolean>;
pickFolder: () => Promise<string | null>;
@@ -44,11 +45,17 @@ export interface ElectronApi {
exportBackup: () => Promise<{ saved: boolean }>;
importBackup: () => Promise<{ restored: boolean; message: string }>;
openLog: () => Promise<void>;
openSessionLog: () => Promise<void>;
retryExtraction: (packageId: string) => Promise<void>;
extractNow: (packageId: string) => Promise<void>;
resetPackage: (packageId: string) => Promise<void>;
getHistory: () => Promise<HistoryEntry[]>;
clearHistory: () => Promise<void>;
removeHistoryEntry: (entryId: string) => Promise<void>;
setPackagePriority: (packageId: string, priority: PackagePriority) => Promise<void>;
skipItems: (itemIds: string[]) => Promise<void>;
resetItems: (itemIds: string[]) => Promise<void>;
startItems: (itemIds: string[]) => Promise<void>;
onStateUpdate: (callback: (snapshot: UiSnapshot) => void) => () => void;
onClipboardDetected: (callback: (links: string[]) => void) => () => void;
onUpdateInstallProgress: (callback: (progress: UpdateInstallProgress) => void) => () => void;

@@ -14,9 +14,11 @@ export type CleanupMode = "none" | "trash" | "delete";
export type ConflictMode = "overwrite" | "skip" | "rename" | "ask";
export type SpeedMode = "global" | "per_download";
export type FinishedCleanupPolicy = "never" | "immediate" | "on_start" | "package_done";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid" | "ddownload";
export type DebridFallbackProvider = DebridProvider | "none";
export type AppTheme = "dark" | "light";
export type PackagePriority = "high" | "normal" | "low";
export type ExtractCpuPriority = "high" | "middle" | "low";

export interface BandwidthScheduleEntry {
id: string;
@@ -40,6 +42,8 @@ export interface AppSettings {
megaPassword: string;
bestToken: string;
allDebridToken: string;
ddownloadLogin: string;
ddownloadPassword: string;
archivePasswordList: string;
rememberToken: boolean;
providerPrimary: DebridProvider;
@@ -80,6 +84,9 @@ export interface AppSettings {
confirmDeleteSelection: boolean;
totalDownloadedAllTime: number;
bandwidthSchedules: BandwidthScheduleEntry[];
columnOrder: string[];
extractCpuPriority: ExtractCpuPriority;
autoExtractWhenStopped: boolean;
}

export interface DownloadItem {
@@ -101,6 +108,7 @@ export interface DownloadItem {
fullStatus: string;
createdAt: number;
updatedAt: number;
onlineStatus?: "online" | "offline" | "checking";
}

export interface PackageEntry {
@@ -112,6 +120,8 @@ export interface PackageEntry {
itemIds: string[];
cancelled: boolean;
enabled: boolean;
priority: PackagePriority;
postProcessLabel?: string;
createdAt: number;
updatedAt: number;
}
@@ -212,6 +222,7 @@ export interface UpdateCheckResult {
setupAssetUrl?: string;
setupAssetName?: string;
setupAssetDigest?: string;
releaseNotes?: string;
error?: string;
}

@@ -268,6 +279,7 @@ export interface HistoryEntry {
durationSeconds: number;
status: "completed" | "deleted";
outputDir: string;
urls?: string[];
}

export interface HistoryState {

@@ -1,5 +1,5 @@
import { describe, expect, it } from "vitest";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/App";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/package-order";

describe("reorderPackageOrderByDrop", () => {
  it("moves adjacent package down by one on drop", () => {
@@ -25,9 +25,9 @@ describe("sortPackageOrderByName", () => {
    const sorted = sortPackageOrderByName(
      ["pkg3", "pkg1", "pkg2"],
      {
        pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
        pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
        pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
        pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
        pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
        pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
      },
      false
    );
@@ -38,9 +38,9 @@ describe("sortPackageOrderByName", () => {
    const sorted = sortPackageOrderByName(
      ["pkg1", "pkg2", "pkg3"],
      {
        pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
        pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
        pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
        pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
        pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
        pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
      },
      true
    );

@@ -269,6 +269,7 @@ describe("buildAutoRenameBaseName", () => {
    const result = buildAutoRenameBaseName("Show.S99.720p-4sf", "show.s99e999.720p.mkv");
    // SCENE_EPISODE_RE allows up to 3-digit episodes and 2-digit seasons
    expect(result).not.toBeNull();
    expect(result!).toContain("S99E999");
  });

  // Real-world scene release patterns
@@ -343,6 +344,7 @@ describe("buildAutoRenameBaseName", () => {
    const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e01.mkv");
    // "mkv" should not be treated as part of the filename match
    expect(result).not.toBeNull();
    expect(result!).toContain("S01E01");
  });

  it("does not match episode-like patterns in codec strings", () => {
@@ -373,6 +375,7 @@ describe("buildAutoRenameBaseName", () => {
    // Extreme edge case - sanitizeFilename trims leading dots
    expect(result).not.toBeNull();
    expect(result!).toContain("S01E01");
    expect(result!).toContain("-4sf");
    expect(result!).not.toContain(".S01E01.S01E01"); // no duplication
  });

@@ -620,4 +623,72 @@ describe("buildAutoRenameBaseNameFromFolders", () => {
    );
    expect(result).toBe("Mammon.S01E05E06.German.1080P.Bluray.x264-SMAHD");
  });

  // Last-resort fallback: folder has season but no scene group suffix (user-renamed packages)
  it("renames when folder has season but no scene group suffix (Mystery Road case)", () => {
    const result = buildAutoRenameBaseNameFromFoldersWithOptions(
      ["Mystery Road S02"],
      "myst.road.de.dl.hdtv.7p-s02e05",
      { forceEpisodeForSeasonFolder: true }
    );
    expect(result).toBe("Mystery Road S02E05");
  });

  it("renames with season-only folder and custom name without dots", () => {
    const result = buildAutoRenameBaseNameFromFoldersWithOptions(
      ["Meine Serie S03"],
      "meine-serie-s03e10-720p",
      { forceEpisodeForSeasonFolder: true }
    );
    expect(result).toBe("Meine Serie S03E10");
  });

  it("prefers scene-group folder over season-only fallback", () => {
    const result = buildAutoRenameBaseNameFromFoldersWithOptions(
      [
        "Mystery Road S02",
        "Mystery.Road.S02.GERMAN.DL.AC3.720p.HDTV.x264-hrs"
      ],
      "myst.road.de.dl.hdtv.7p-s02e05",
      { forceEpisodeForSeasonFolder: true }
    );
    // Should use the scene-group folder (hrs), not the custom one
    expect(result).toBe("Mystery.Road.S02E05.GERMAN.DL.AC3.720p.HDTV.x264-hrs");
  });

  it("does not use season-only fallback when forceEpisodeForSeasonFolder is false", () => {
    const result = buildAutoRenameBaseNameFromFoldersWithOptions(
      ["Mystery Road S02"],
      "myst.road.de.dl.hdtv.7p-s02e05",
      { forceEpisodeForSeasonFolder: false }
    );
    expect(result).toBeNull();
  });

  it("renames Riviera S02 with single-digit episode s02e2", () => {
    const result = buildAutoRenameBaseNameFromFoldersWithOptions(
      ["Riviera.S02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP"],
      "tvp-riviera-s02e2-720p",
      { forceEpisodeForSeasonFolder: true }
    );
    expect(result).toBe("Riviera.S02E02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP");
  });

  it("renames Room 104 abbreviated source r104.de.dl.web.7p-s04e02", () => {
    const result = buildAutoRenameBaseNameFromFoldersWithOptions(
      ["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
      "r104.de.dl.web.7p-s04e02",
      { forceEpisodeForSeasonFolder: true }
    );
    expect(result).toBe("Room.104.S04E02.GERMAN.DL.720p.WEBRiP.x264-LAW");
  });

  it("renames Room 104 wayne source with episode", () => {
    const result = buildAutoRenameBaseNameFromFoldersWithOptions(
      ["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
      "room.104.s04e01.german.dl.720p.web.h264-wayne",
      { forceEpisodeForSeasonFolder: true }
    );
    expect(result).toBe("Room.104.S04E01.GERMAN.DL.720p.WEBRiP.x264-LAW");
  });
});

@@ -317,7 +317,7 @@ describe("debrid service", () => {
    const controller = new AbortController();
    const abortTimer = setTimeout(() => {
      controller.abort("test");
    }, 25);
    }, 200);

    try {
      await expect(service.unrestrictLink("https://rapidgator.net/file/abort-mega-web", controller.signal)).rejects.toThrow(/aborted/i);

@@ -36,12 +36,8 @@ afterEach(() => {
  }
});

describe("extractor jvm backend", () => {
describe.skipIf(!hasJavaRuntime() || !hasJvmExtractorRuntime())("extractor jvm backend", () => {
  it("extracts zip archives through SevenZipJBinding backend", async () => {
    if (!hasJavaRuntime() || !hasJvmExtractorRuntime()) {
      return;
    }

    process.env.RD_EXTRACT_BACKEND = "jvm";

    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
@@ -69,11 +65,112 @@ describe("extractor jvm backend", () => {
    expect(fs.existsSync(path.join(targetDir, "episode.txt"))).toBe(true);
  });

  it("respects ask/skip conflict mode in jvm backend", async () => {
    if (!hasJavaRuntime() || !hasJvmExtractorRuntime()) {
      return;
    }
  it("emits progress callbacks with archiveName and percent", async () => {
    process.env.RD_EXTRACT_BACKEND = "jvm";

    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-progress-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });

    // Create a ZIP with some content to trigger progress
    const zipPath = path.join(packageDir, "progress-test.zip");
    const zip = new AdmZip();
    zip.addFile("file1.txt", Buffer.from("Hello World ".repeat(100)));
    zip.addFile("file2.txt", Buffer.from("Another file ".repeat(100)));
    zip.writeZip(zipPath);

    const progressUpdates: Array<{
      archiveName: string;
      percent: number;
      phase: string;
      archivePercent?: number;
    }> = [];

    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none",
      conflictMode: "overwrite",
      removeLinks: false,
      removeSamples: false,
      onProgress: (update) => {
        progressUpdates.push({
          archiveName: update.archiveName,
          percent: update.percent,
          phase: update.phase,
          archivePercent: update.archivePercent,
        });
      },
    });

    expect(result.extracted).toBe(1);
    expect(result.failed).toBe(0);

    // Should have at least preparing, extracting, and done phases
    const phases = new Set(progressUpdates.map((u) => u.phase));
    expect(phases.has("preparing")).toBe(true);
    expect(phases.has("extracting")).toBe(true);

    // Extracting phase should include the archive name
    const extracting = progressUpdates.filter((u) => u.phase === "extracting" && u.archiveName === "progress-test.zip");
    expect(extracting.length).toBeGreaterThan(0);

    // Should end at 100%
    const lastExtracting = extracting[extracting.length - 1];
    expect(lastExtracting.archivePercent).toBe(100);

    // Files should exist
    expect(fs.existsSync(path.join(targetDir, "file1.txt"))).toBe(true);
    expect(fs.existsSync(path.join(targetDir, "file2.txt"))).toBe(true);
  });

  it("extracts multiple archives sequentially with progress for each", async () => {
    process.env.RD_EXTRACT_BACKEND = "jvm";

    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-multi-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });

    // Create two separate ZIP archives
    const zip1 = new AdmZip();
    zip1.addFile("episode01.txt", Buffer.from("ep1 content"));
    zip1.writeZip(path.join(packageDir, "archive1.zip"));

    const zip2 = new AdmZip();
    zip2.addFile("episode02.txt", Buffer.from("ep2 content"));
    zip2.writeZip(path.join(packageDir, "archive2.zip"));

    const archiveNames = new Set<string>();

    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none",
      conflictMode: "overwrite",
      removeLinks: false,
      removeSamples: false,
      onProgress: (update) => {
        if (update.phase === "extracting" && update.archiveName) {
          archiveNames.add(update.archiveName);
        }
      },
    });

    expect(result.extracted).toBe(2);
    expect(result.failed).toBe(0);
    // Both archive names should have appeared in progress
    expect(archiveNames.has("archive1.zip")).toBe(true);
    expect(archiveNames.has("archive2.zip")).toBe(true);
    // Both files extracted
    expect(fs.existsSync(path.join(targetDir, "episode01.txt"))).toBe(true);
    expect(fs.existsSync(path.join(targetDir, "episode02.txt"))).toBe(true);
  });

  it("respects ask/skip conflict mode in jvm backend", async () => {
    process.env.RD_EXTRACT_BACKEND = "jvm";

    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));

@@ -15,6 +15,8 @@ import {

const tempDirs: string[] = [];
const originalExtractBackend = process.env.RD_EXTRACT_BACKEND;
const originalStatfs = fs.promises.statfs;
const originalZipEntryMemoryLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;

beforeEach(() => {
  process.env.RD_EXTRACT_BACKEND = "legacy";
@@ -29,6 +31,12 @@ afterEach(() => {
  } else {
    process.env.RD_EXTRACT_BACKEND = originalExtractBackend;
  }
  (fs.promises as any).statfs = originalStatfs;
  if (originalZipEntryMemoryLimit === undefined) {
    delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
  } else {
    process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = originalZipEntryMemoryLimit;
  }
});

describe("extractor", () => {
@@ -574,7 +582,6 @@ describe("extractor", () => {
  });

  it("keeps original ZIP size guard error when external fallback is unavailable", async () => {
    const previousLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
    process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = "8";

    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
@@ -588,32 +595,20 @@ describe("extractor", () => {
    zip.addFile("large.bin", Buffer.alloc(9 * 1024 * 1024, 7));
    zip.writeZip(zipPath);

    try {
      const result = await extractPackageArchives({
        packageDir,
        targetDir,
        cleanupMode: "none",
        conflictMode: "overwrite",
        removeLinks: false,
        removeSamples: false
      });
      expect(result.extracted).toBe(0);
      expect(result.failed).toBe(1);
      expect(String(result.lastError)).toMatch(/ZIP-Eintrag.*groß/i);
    } finally {
      if (previousLimit === undefined) {
        delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
      } else {
        process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = previousLimit;
      }
    }
    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none",
      conflictMode: "overwrite",
      removeLinks: false,
      removeSamples: false
    });
    expect(result.extracted).toBe(0);
    expect(result.failed).toBe(1);
    expect(String(result.lastError)).toMatch(/ZIP-Eintrag.*groß/i);
  });

  it("matches resume-state archive names case-insensitively on Windows", async () => {
    if (process.platform !== "win32") {
      return;
    }

  it.skipIf(process.platform !== "win32")("matches resume-state archive names case-insensitively on Windows", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
@@ -650,23 +645,18 @@ describe("extractor", () => {
    zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
    zip.writeZip(path.join(packageDir, "test.zip"));

    const originalStatfs = fs.promises.statfs;
    (fs.promises as any).statfs = async () => ({ bfree: 1, bsize: 1 });

    try {
      await expect(
        extractPackageArchives({
          packageDir,
          targetDir,
          cleanupMode: "none" as any,
          conflictMode: "overwrite" as any,
          removeLinks: false,
          removeSamples: false,
        })
      ).rejects.toThrow(/Nicht genug Speicherplatz/);
    } finally {
      (fs.promises as any).statfs = originalStatfs;
    }
    await expect(
      extractPackageArchives({
        packageDir,
        targetDir,
        cleanupMode: "none" as any,
        conflictMode: "overwrite" as any,
        removeLinks: false,
        removeSamples: false,
      })
    ).rejects.toThrow(/Nicht genug Speicherplatz/);
  });

  it("proceeds when disk space is sufficient", async () => {
@@ -1002,4 +992,98 @@ describe("extractor", () => {
    expect(classifyExtractionError("something weird happened")).toBe("unknown");
  });
});

describe("password discovery", () => {
  it("extracts first archive serially before parallel pool when multiple passwords", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });

    // Create 3 zip archives
    for (const name of ["ep01.zip", "ep02.zip", "ep03.zip"]) {
      const zip = new AdmZip();
      zip.addFile(`${name}.txt`, Buffer.from(name));
      zip.writeZip(path.join(packageDir, name));
    }

    const seenOrder: string[] = [];
    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none",
      conflictMode: "overwrite",
      removeLinks: false,
      removeSamples: false,
      maxParallel: 2,
      passwordList: "pw1|pw2|pw3",
      onProgress: (update) => {
        if (update.phase !== "extracting" || !update.archiveName) return;
        if (seenOrder[seenOrder.length - 1] !== update.archiveName) {
          seenOrder.push(update.archiveName);
        }
      }
    });

    expect(result.extracted).toBe(3);
    expect(result.failed).toBe(0);
    // First archive should be ep01 (natural order, extracted serially for discovery)
    expect(seenOrder[0]).toBe("ep01.zip");
  });

  it("skips discovery when only one password candidate", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-skip-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });

    for (const name of ["a.zip", "b.zip"]) {
      const zip = new AdmZip();
      zip.addFile(`${name}.txt`, Buffer.from(name));
      zip.writeZip(path.join(packageDir, name));
    }

    // No passwordList → only empty string → length=1 → no discovery phase
    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none",
      conflictMode: "overwrite",
      removeLinks: false,
      removeSamples: false,
      maxParallel: 4
    });

    expect(result.extracted).toBe(2);
    expect(result.failed).toBe(0);
  });

  it("skips discovery when only one archive", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-one-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });

    const zip = new AdmZip();
    zip.addFile("single.txt", Buffer.from("single"));
    zip.writeZip(path.join(packageDir, "only.zip"));

    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none",
      conflictMode: "overwrite",
      removeLinks: false,
      removeSamples: false,
      maxParallel: 4,
      passwordList: "pw1|pw2|pw3"
    });

    expect(result.extracted).toBe(1);
    expect(result.failed).toBe(0);
  });
});
});

@@ -166,7 +166,7 @@ describe("mega-web-fallback", () => {
    const controller = new AbortController();
    const timer = setTimeout(() => {
      controller.abort("test");
    }, 30);
    }, 200);

    try {
      await expect(fallback.unrestrict("https://mega.debrid/link2", controller.signal)).rejects.toThrow(/aborted/i);

tests/resolve-archive-items.test.ts (188 lines, Normal file)
@@ -0,0 +1,188 @@
import { describe, expect, it } from "vitest";
import { resolveArchiveItemsFromList } from "../src/main/download-manager";

type MinimalItem = {
  targetPath?: string;
  fileName?: string;
  [key: string]: unknown;
};

function makeItems(names: string[]): MinimalItem[] {
  return names.map((name) => ({
    targetPath: `C:\\Downloads\\Package\\${name}`,
    fileName: name,
    id: name,
    status: "completed",
  }));
}

describe("resolveArchiveItemsFromList", () => {
  // ── Multipart RAR (.partN.rar) ──

  it("matches multipart .part1.rar archives", () => {
    const items = makeItems([
      "Movie.part1.rar",
      "Movie.part2.rar",
      "Movie.part3.rar",
      "Other.rar",
    ]);
    const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
    expect(result).toHaveLength(3);
    expect(result.map((i: any) => i.fileName)).toEqual([
      "Movie.part1.rar",
      "Movie.part2.rar",
      "Movie.part3.rar",
    ]);
  });

  it("matches multipart .part01.rar archives (zero-padded)", () => {
    const items = makeItems([
      "Film.part01.rar",
      "Film.part02.rar",
      "Film.part10.rar",
      "Unrelated.zip",
    ]);
    const result = resolveArchiveItemsFromList("Film.part01.rar", items as any);
    expect(result).toHaveLength(3);
  });

  // ── Old-style RAR (.rar + .r00, .r01, etc.) ──

  it("matches old-style .rar + .rNN volumes", () => {
    const items = makeItems([
      "Archive.rar",
      "Archive.r00",
      "Archive.r01",
      "Archive.r02",
      "Other.zip",
    ]);
    const result = resolveArchiveItemsFromList("Archive.rar", items as any);
    expect(result).toHaveLength(4);
  });

  // ── Single RAR ──

  it("matches a single .rar file", () => {
    const items = makeItems(["SingleFile.rar", "Other.mkv"]);
    const result = resolveArchiveItemsFromList("SingleFile.rar", items as any);
    expect(result).toHaveLength(1);
    expect((result[0] as any).fileName).toBe("SingleFile.rar");
  });

  // ── Split ZIP ──

  it("matches split .zip.NNN files", () => {
    const items = makeItems([
      "Data.zip",
      "Data.zip.001",
      "Data.zip.002",
      "Data.zip.003",
    ]);
    const result = resolveArchiveItemsFromList("Data.zip.001", items as any);
    expect(result).toHaveLength(4);
  });

  // ── Split 7z ──

  it("matches split .7z.NNN files", () => {
    const items = makeItems([
      "Backup.7z.001",
      "Backup.7z.002",
    ]);
    const result = resolveArchiveItemsFromList("Backup.7z.001", items as any);
    expect(result).toHaveLength(2);
  });

  // ── Generic .NNN splits ──

  it("matches generic .NNN split files", () => {
    const items = makeItems([
      "video.001",
      "video.002",
      "video.003",
    ]);
    const result = resolveArchiveItemsFromList("video.001", items as any);
    expect(result).toHaveLength(3);
  });

  // ── Exact filename match ──

  it("matches a single .zip by exact name", () => {
    const items = makeItems(["myarchive.zip", "other.rar"]);
    const result = resolveArchiveItemsFromList("myarchive.zip", items as any);
    expect(result).toHaveLength(1);
    expect((result[0] as any).fileName).toBe("myarchive.zip");
  });

  // ── Case insensitivity ──

  it("matches case-insensitively", () => {
    const items = makeItems([
      "MOVIE.PART1.RAR",
      "MOVIE.PART2.RAR",
    ]);
    const result = resolveArchiveItemsFromList("movie.part1.rar", items as any);
    expect(result).toHaveLength(2);
  });

  // ── Stem-based fallback ──

  it("uses stem-based fallback when exact patterns fail", () => {
    // Simulate a debrid service that renames "Movie.part1.rar" to "Movie.part1_dl.rar"
    // but the disk file is "Movie.part1.rar"
    const items = makeItems([
      "Movie.rar",
    ]);
    // The archive on disk is "Movie.part1.rar" but there's no item matching the
    // .partN pattern. The stem "movie" should match "Movie.rar" via fallback.
    const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
    // stem fallback: "movie" starts with "movie" and ends with .rar
    expect(result).toHaveLength(1);
  });

  // ── Single item fallback ──

  it("returns single archive item when no pattern matches", () => {
    const items = makeItems(["totally-different-name.rar"]);
    const result = resolveArchiveItemsFromList("Original.rar", items as any);
    // Single item in list with archive extension → return it
    expect(result).toHaveLength(1);
  });

  // ── Empty when no match ──

  it("returns empty when items have no archive extensions", () => {
    const items = makeItems(["video.mkv", "subtitle.srt"]);
    const result = resolveArchiveItemsFromList("Archive.rar", items as any);
    expect(result).toHaveLength(0);
  });

  // ── Items without targetPath ──

  it("falls back to fileName when targetPath is missing", () => {
    const items = [
      { fileName: "Movie.part1.rar", id: "1", status: "completed" },
      { fileName: "Movie.part2.rar", id: "2", status: "completed" },
    ];
    const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
    expect(result).toHaveLength(2);
  });

  // ── Multiple archives, should not cross-match ──

  it("does not cross-match different archive groups", () => {
    const items = makeItems([
      "Episode.S01E01.part1.rar",
      "Episode.S01E01.part2.rar",
      "Episode.S01E02.part1.rar",
      "Episode.S01E02.part2.rar",
    ]);
    const result1 = resolveArchiveItemsFromList("Episode.S01E01.part1.rar", items as any);
    expect(result1).toHaveLength(2);
    expect(result1.every((i: any) => i.fileName.includes("S01E01"))).toBe(true);

    const result2 = resolveArchiveItemsFromList("Episode.S01E02.part1.rar", items as any);
    expect(result2).toHaveLength(2);
    expect(result2.every((i: any) => i.fileName.includes("S01E02"))).toBe(true);
  });
});
@@ -153,7 +153,7 @@ async function main(): Promise<void> {
    createStoragePaths(path.join(tempRoot, "state-pause"))
  );
  manager2.addPackages([{ name: "pause", links: ["https://dummy/slow"] }]);
  manager2.start();
  await manager2.start();
  await new Promise((resolve) => setTimeout(resolve, 120));
  const paused = manager2.togglePause();
  assert(paused, "Pause konnte nicht aktiviert werden");

tests/session-log.test.ts (163 lines, Normal file)
@@ -0,0 +1,163 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "../src/main/session-log";
import { setLogListener } from "../src/main/logger";

const tempDirs: string[] = [];

afterEach(() => {
  // Ensure session log is shut down between tests
  shutdownSessionLog();
  // Ensure listener is cleared between tests
  setLogListener(null);
  for (const dir of tempDirs.splice(0)) {
    fs.rmSync(dir, { recursive: true, force: true });
  }
});

describe("session-log", () => {
  it("initSessionLog creates directory and file", () => {
    const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
    tempDirs.push(baseDir);

    initSessionLog(baseDir);
    const logPath = getSessionLogPath();
    expect(logPath).not.toBeNull();
    expect(fs.existsSync(logPath!)).toBe(true);
    expect(fs.existsSync(path.join(baseDir, "session-logs"))).toBe(true);
    expect(path.basename(logPath!)).toMatch(/^session_\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}\.txt$/);

    const content = fs.readFileSync(logPath!, "utf8");
    expect(content).toContain("=== Session gestartet:");

    shutdownSessionLog();
  });

  it("logger listener writes to session log", async () => {
    const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
    tempDirs.push(baseDir);

    initSessionLog(baseDir);
    const logPath = getSessionLogPath()!;

    // Simulate a log line via the listener
    const { logger } = await import("../src/main/logger");
    logger.info("Test-Nachricht für Session-Log");

    // Wait for flush (200ms interval + margin)
    await new Promise((resolve) => setTimeout(resolve, 500));

    const content = fs.readFileSync(logPath, "utf8");
    expect(content).toContain("Test-Nachricht für Session-Log");

    shutdownSessionLog();
  });

  it("shutdownSessionLog writes closing line", () => {
    const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
    tempDirs.push(baseDir);

    initSessionLog(baseDir);
    const logPath = getSessionLogPath()!;

    shutdownSessionLog();

    const content = fs.readFileSync(logPath, "utf8");
    expect(content).toContain("=== Session beendet:");
  });

  it("shutdownSessionLog removes listener", async () => {
    const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
    tempDirs.push(baseDir);

    initSessionLog(baseDir);
    const logPath = getSessionLogPath()!;

    shutdownSessionLog();

    // Log after shutdown - should NOT appear in session log
    const { logger } = await import("../src/main/logger");
    logger.info("Nach-Shutdown-Nachricht");

    await new Promise((resolve) => setTimeout(resolve, 500));

    const content = fs.readFileSync(logPath, "utf8");
    expect(content).not.toContain("Nach-Shutdown-Nachricht");
  });

  it("cleanupOldSessionLogs deletes old files", async () => {
    const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
    tempDirs.push(baseDir);

    const logsDir = path.join(baseDir, "session-logs");
    fs.mkdirSync(logsDir, { recursive: true });

    // Create a fake old session log
    const oldFile = path.join(logsDir, "session_2020-01-01_00-00-00.txt");
    fs.writeFileSync(oldFile, "old session");
    // Set mtime to 30 days ago
    const oldTime = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
    fs.utimesSync(oldFile, oldTime, oldTime);

    // Create a recent file
    const newFile = path.join(logsDir, "session_2099-01-01_00-00-00.txt");
    fs.writeFileSync(newFile, "new session");

    // initSessionLog triggers cleanup
    initSessionLog(baseDir);

    // Wait for async cleanup
    await new Promise((resolve) => setTimeout(resolve, 300));

    expect(fs.existsSync(oldFile)).toBe(false);
    expect(fs.existsSync(newFile)).toBe(true);

    shutdownSessionLog();
  });

  it("cleanupOldSessionLogs keeps recent files", async () => {
    const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
    tempDirs.push(baseDir);

    const logsDir = path.join(baseDir, "session-logs");
    fs.mkdirSync(logsDir, { recursive: true });

    // Create a file from 2 days ago (should be kept)
    const recentFile = path.join(logsDir, "session_2025-12-01_00-00-00.txt");
    fs.writeFileSync(recentFile, "recent session");
    const recentTime = new Date(Date.now() - 2 * 24 * 60 * 60 * 1000);
    fs.utimesSync(recentFile, recentTime, recentTime);

    initSessionLog(baseDir);

    await new Promise((resolve) => setTimeout(resolve, 300));

    expect(fs.existsSync(recentFile)).toBe(true);

    shutdownSessionLog();
  });

  it("multiple sessions create different files", async () => {
    const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
    tempDirs.push(baseDir);

    initSessionLog(baseDir);
    const path1 = getSessionLogPath();
    shutdownSessionLog();

    // Small delay to ensure different timestamp
    await new Promise((resolve) => setTimeout(resolve, 1100));

    initSessionLog(baseDir);
    const path2 = getSessionLogPath();
    shutdownSessionLog();

    expect(path1).not.toBeNull();
    expect(path2).not.toBeNull();
    expect(path1).not.toBe(path2);
||||
expect(fs.existsSync(path1!)).toBe(true);
|
||||
expect(fs.existsSync(path2!)).toBe(true);
|
||||
});
|
||||
});
|
||||
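Read together, these assertions pin down a small lifecycle API: `initSessionLog` creates a timestamped file under `session-logs/` and prunes stale logs, `getSessionLogPath` exposes the current file, and `shutdownSessionLog` appends a closing line and detaches the logger listener. A minimal sketch of a module that would satisfy them — the header wording, the 7-day retention window, and the synchronous writes are assumptions; per the test comments, the real module buffers log lines and flushes on a ~200ms interval:

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

let currentLogPath: string | null = null;

function initSessionLog(baseDir: string): void {
  const logsDir = path.join(baseDir, "session-logs");
  fs.mkdirSync(logsDir, { recursive: true });
  cleanupOldSessionLogs(logsDir);
  // session_YYYY-MM-DD_HH-MM-SS.txt, matching the fixture names above
  const stamp = new Date().toISOString().replace(/[:T]/g, "-").slice(0, 19);
  currentLogPath = path.join(logsDir, `session_${stamp}.txt`);
  // Header wording is an assumption; the tests only assert the closing line.
  fs.writeFileSync(currentLogPath, `=== Session gestartet: ${new Date().toISOString()} ===\n`);
}

function getSessionLogPath(): string | null {
  return currentLogPath;
}

function shutdownSessionLog(): void {
  if (currentLogPath) {
    fs.appendFileSync(currentLogPath, `=== Session beendet: ${new Date().toISOString()} ===\n`);
    currentLogPath = null; // later log calls no longer reach this file
  }
}

function cleanupOldSessionLogs(logsDir: string, maxAgeDays = 7): void {
  // Delete session logs whose mtime is older than the retention window
  // (assumed 7 days: the 30-day-old fixture is deleted, the 2-day-old kept).
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
  for (const name of fs.readdirSync(logsDir)) {
    const file = path.join(logsDir, name);
    if (fs.statSync(file).mtimeMs < cutoff) fs.unlinkSync(file);
  }
}
```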
@@ -22,7 +22,7 @@ afterEach(() => {
 
 describe("update", () => {
   it("normalizes update repo input", () => {
-    expect(normalizeUpdateRepo("")).toBe("Sucukdeluxe/real-debrid-downloader");
+    expect(normalizeUpdateRepo("")).toBe("Administrator/real-debrid-downloader");
     expect(normalizeUpdateRepo("owner/repo")).toBe("owner/repo");
     expect(normalizeUpdateRepo("https://codeberg.org/owner/repo")).toBe("owner/repo");
     expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
@@ -31,14 +31,14 @@ describe("update", () => {
     expect(normalizeUpdateRepo("git@codeberg.org:owner/repo.git")).toBe("owner/repo");
   });
 
-  it("uses normalized repo slug for Codeberg API requests", async () => {
+  it("uses normalized repo slug for API requests", async () => {
     let requestedUrl = "";
     globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
       requestedUrl = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
       return new Response(
         JSON.stringify({
           tag_name: `v${APP_VERSION}`,
-          html_url: "https://codeberg.org/owner/repo/releases/tag/v1.0.0",
+          html_url: "https://git.24-music.de/owner/repo/releases/tag/v1.0.0",
           assets: []
         }),
         {
@@ -48,8 +48,8 @@ describe("update", () => {
       );
     }) as typeof fetch;
 
-    const result = await checkGitHubUpdate("https://codeberg.org/owner/repo/releases");
-    expect(requestedUrl).toBe("https://codeberg.org/api/v1/repos/owner/repo/releases/latest");
+    const result = await checkGitHubUpdate("https://git.24-music.de/owner/repo/releases");
+    expect(requestedUrl).toBe("https://git.24-music.de/api/v1/repos/owner/repo/releases/latest");
     expect(result.currentVersion).toBe(APP_VERSION);
     expect(result.latestVersion).toBe(APP_VERSION);
     expect(result.updateAvailable).toBe(false);
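The rewritten assertions encode the Gitea/Forgejo REST convention: the latest release of a repo lives at `/api/v1/repos/<owner>/<repo>/releases/latest` on the instance's own host, so only the host changes when the project moves off Codeberg. A one-line sketch (the helper name is hypothetical, not from the real source):

```typescript
// Hypothetical helper illustrating the URL shape the test asserts:
// Gitea/Forgejo serves the latest release of a repo slug at
// /api/v1/repos/<slug>/releases/latest on the instance host.
function latestReleaseApiUrl(host: string, slug: string): string {
  return `https://${host}/api/v1/repos/${slug}/releases/latest`;
}
```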
@@ -484,14 +484,14 @@ describe("normalizeUpdateRepo extended", () => {
   });
 
   it("returns default for malformed inputs", () => {
-    expect(normalizeUpdateRepo("just-one-part")).toBe("Sucukdeluxe/real-debrid-downloader");
-    expect(normalizeUpdateRepo(" ")).toBe("Sucukdeluxe/real-debrid-downloader");
+    expect(normalizeUpdateRepo("just-one-part")).toBe("Administrator/real-debrid-downloader");
+    expect(normalizeUpdateRepo(" ")).toBe("Administrator/real-debrid-downloader");
   });
 
   it("rejects traversal-like owner or repo segments", () => {
-    expect(normalizeUpdateRepo("../owner/repo")).toBe("Sucukdeluxe/real-debrid-downloader");
-    expect(normalizeUpdateRepo("owner/../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
-    expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
+    expect(normalizeUpdateRepo("../owner/repo")).toBe("Administrator/real-debrid-downloader");
+    expect(normalizeUpdateRepo("owner/../repo")).toBe("Administrator/real-debrid-downloader");
+    expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Administrator/real-debrid-downloader");
   });
 
   it("handles www prefix", () => {
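Across these hunks the accepted input shapes and the fallback slug are fully determined by the assertions: plain `owner/repo` slugs, `https` URLs (with or without `www.` and trailing path segments), and `git@host:owner/repo.git` SSH remotes all normalize to `owner/repo`, while empty, one-part, or traversal-like input falls back to the default. A sketch inferred purely from those assertions, not from the real `normalizeUpdateRepo` source:

```typescript
// Default slug read off the "+" side of the diff above.
const DEFAULT_REPO = "Administrator/real-debrid-downloader";

// Inferred normalization rules; the real implementation may differ in detail.
function normalizeUpdateRepo(input: string): string {
  let s = input.trim();
  if (!s) return DEFAULT_REPO;
  // git@host:owner/repo(.git) -> owner/repo(.git)
  const ssh = s.match(/^git@[^:]+:(.+)$/);
  if (ssh) s = ssh[1];
  // https://(www.)host/owner/repo[/extra] -> owner/repo[/extra]
  const url = s.match(/^https?:\/\/[^/]+\/(.+)$/);
  if (url) s = url[1];
  s = s.replace(/\.git$/, "");
  const parts = s.split("/");
  if (parts.length < 2) return DEFAULT_REPO;
  const [owner, repo] = parts; // extra segments like /releases are dropped
  const valid = /^[A-Za-z0-9_.-]+$/;
  if (!valid.test(owner) || !valid.test(repo) || owner.includes("..") || repo.includes("..")) {
    return DEFAULT_REPO; // reject traversal-like segments
  }
  return `${owner}/${repo}`;
}
```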
@@ -12,5 +12,5 @@
     "isolatedModules": true,
     "types": ["node", "vite/client"]
   },
-  "include": ["src", "tests", "vite.config.ts"]
+  "include": ["src", "tests", "vite.config.mts"]
 }