Replace extraction backend with SevenZipJBinding + Zip4j JVM sidecar

- New JVM sidecar (`resources/extractor-jvm/`) using SevenZipJBinding for
  RAR/7z/TAR and Zip4j for multipart ZIP, matching the JDownloader 2 stack
- Auto/JVM/Legacy backend modes via the `RD_EXTRACT_BACKEND` env variable
- Fall back to legacy UnRAR/7z when the JVM runtime is unavailable
- Fix `isJvmRuntimeMissingError` false positives on valid extraction errors
- Cache JVM layout resolution to avoid repeated filesystem checks
- Route nested ZIP extraction through the JVM backend consistently

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Sucukdeluxe 2026-03-03 02:08:42 +01:00
parent 3ee3af03cf
commit de369f3bcd
22 changed files with 1888 additions and 9 deletions


@@ -62,7 +62,8 @@ Requirements:
 - Node.js `20+` (recommended `22+`)
 - npm
 - Windows `10/11` (for packaging and regular desktop use)
-- Optional: 7-Zip/UnRAR for specific archive formats
+- Java Runtime `8+` (for SevenZipJBinding sidecar backend)
+- Optional fallback: 7-Zip/UnRAR if you force legacy extraction mode
 ```bash
 npm install
@@ -122,7 +123,7 @@ The app stores runtime files in Electron's `userData` directory, including:
 ## Troubleshooting
 - Download does not start: verify token and selected provider in Settings.
-- Extraction fails: check archive passwords and extraction tool availability.
+- Extraction fails: check archive passwords, JVM runtime (`resources/extractor-jvm`), or force legacy mode with `RD_EXTRACT_BACKEND=legacy`.
 - Very slow downloads: check active speed limit and bandwidth schedules.
 - Unexpected interruptions: enable reconnect and fallback providers.


@@ -0,0 +1,48 @@
# Disk Space Pre-Check + Nested Extraction
## Context
Two feature gaps identified from JDownloader 2 comparison:
1. JD2 checks disk space before extraction (DiskSpaceReservation)
2. JD2 supports archives within archives (nested/deep extraction)
## Feature 1: Disk Space Pre-Check
### Approach
Before extracting, calculate total archive size (sum of all archive parts) and check free disk space on the target drive. Use 1.1x archive size as minimum requirement (scene releases are mostly video with minimal compression).
### Behavior
- Check runs once in `extractPackageArchives()` before the extraction loop
- On insufficient space: abort extraction, set status to failed with the message "Nicht genug Speicherplatz: X GB benötigt, Y GB frei" ("Not enough disk space: X GB required, Y GB free")
- User can retry after freeing space (existing retry mechanism)
- Uses `fs.statfs()` (Node 18+) or platform-specific fallback for free space
### Implementation Location
- `extractor.ts`: New `checkDiskSpace()` function
- Called at the top of `extractPackageArchives()` after finding candidates
- Calculates total size from `collectArchiveCleanupTargets()` for each candidate
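The pre-check can be sketched roughly as follows (a minimal sketch assuming Node 18+ `fs.promises.statfs`; helper names here are illustrative, not the final `extractor.ts` API):

```typescript
import { promises as fsp } from "node:fs";

const SAFETY_FACTOR = 1.1; // scene releases barely compress, so 1.1x is enough headroom

// Sum the on-disk size of every archive part; missing parts are skipped.
async function totalArchiveBytes(parts: string[]): Promise<number> {
  let total = 0;
  for (const part of parts) {
    try {
      total += (await fsp.stat(part)).size;
    } catch {
      /* part missing; ignore */
    }
  }
  return total;
}

// Throws when the target drive has less than SAFETY_FACTOR x archive size free.
async function assertEnoughDiskSpace(targetDir: string, parts: string[]): Promise<void> {
  const required = Math.ceil((await totalArchiveBytes(parts)) * SAFETY_FACTOR);
  let free: number;
  try {
    const stats = await fsp.statfs(targetDir);
    free = stats.bfree * stats.bsize;
  } catch {
    return; // statfs unavailable; skip the check rather than block extraction
  }
  if (free < required) {
    throw new Error(`Nicht genug Speicherplatz: ${required} benötigt, ${free} frei`);
  }
}
```

Failing the check throws before any extraction starts, so the existing retry path can rerun it after the user frees space.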
## Feature 2: Nested Extraction (1 Level Deep)
### Approach
After successfully extracting all archives, scan the output directory for new archive files. If found, extract them once (no further nesting check).
### Blacklist
Skip: `.iso`, `.img`, `.bin`, `.dmg` (disk images should not be auto-extracted)
### Behavior
- Runs after successful extraction of all top-level archives
- Calls `findArchiveCandidates()` on `targetDir`
- Filters out blacklisted extensions
- Extracts found archives with same options (passwords, conflict mode, etc.)
- No recursive nesting — exactly one additional pass
- Progress reported as second phase
- Cleanup of nested archives follows same cleanup mode
### Implementation Location
- `extractor.ts`: New nested extraction pass at end of `extractPackageArchives()`
- After the main extraction loop succeeds, before cleanup
- Reuses existing `runExternalExtract()` / `extractZipArchive()`
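The control flow of this pass can be sketched as follows (illustrative sketch; in the real code `findCandidates` is `findArchiveCandidates()` and `extractOne` stands in for `runExternalExtract()`/`extractZipArchive()` with the same passwords and conflict mode):

```typescript
// Disk images must never be auto-extracted in the nested pass.
const NESTED_BLACKLIST_RE = /\.(iso|img|bin|dmg)$/i;

// One additional pass over archives discovered in the output directory.
async function nestedExtractionPass(
  targetDir: string,
  findCandidates: (dir: string) => Promise<string[]>,
  extractOne: (archivePath: string) => Promise<void>
): Promise<number> {
  const candidates = (await findCandidates(targetDir)).filter(
    (p) => !NESTED_BLACKLIST_RE.test(p)
  );
  let extracted = 0;
  for (const archive of candidates) {
    await extractOne(archive); // same options as the main pass
    extracted += 1;
  }
  return extracted; // exactly one pass; results are not rescanned
}
```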
## Files
- `src/main/extractor.ts` — both features
- `tests/extractor.test.ts` — disk space check tests (mock fs.statfs)


@@ -0,0 +1,398 @@
# Disk Space Pre-Check + Nested Extraction Implementation Plan
> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
**Goal:** Add JDownloader-style disk space checking before extraction and single-level nested archive extraction.
**Architecture:** Two independent features in `extractor.ts`. Disk space check uses `fs.statfs()` to verify free space before starting. Nested extraction calls `findArchiveCandidates()` on the output directory after the main pass completes, then extracts any found archives once.
**Tech Stack:** Node.js `fs.statfs`, existing UnRAR/WinRAR extraction pipeline, vitest for tests.
---
### Task 1: Disk Space Check — Utility Function
**Files:**
- Modify: `src/main/extractor.ts` (after line 96, before `zipEntryMemoryLimitBytes`)
**Step 1: Add the `checkDiskSpaceForExtraction` function**
Add after the constants block (line 96) in `extractor.ts`:
```typescript
const DISK_SPACE_SAFETY_FACTOR = 1.1;

async function estimateArchivesTotalBytes(candidates: string[]): Promise<number> {
  let total = 0;
  for (const archivePath of candidates) {
    const parts = collectArchiveCleanupTargets(archivePath);
    for (const part of parts) {
      try {
        total += (await fs.promises.stat(part)).size;
      } catch { /* missing part, ignore */ }
    }
  }
  return total;
}

function humanSizeGB(bytes: number): string {
  if (bytes >= 1024 * 1024 * 1024) {
    return `${(bytes / (1024 * 1024 * 1024)).toFixed(1)} GB`;
  }
  return `${(bytes / (1024 * 1024)).toFixed(0)} MB`;
}

async function checkDiskSpaceForExtraction(targetDir: string, candidates: string[]): Promise<void> {
  if (candidates.length === 0) return;
  const archiveBytes = await estimateArchivesTotalBytes(candidates);
  if (archiveBytes <= 0) return;
  const requiredBytes = Math.ceil(archiveBytes * DISK_SPACE_SAFETY_FACTOR);
  let freeBytes: number;
  try {
    const stats = await fs.promises.statfs(targetDir);
    freeBytes = stats.bfree * stats.bsize;
  } catch {
    // statfs not supported or target doesn't exist yet — skip check
    return;
  }
  if (freeBytes < requiredBytes) {
    const msg = `Nicht genug Speicherplatz: ${humanSizeGB(requiredBytes)} benötigt, ${humanSizeGB(freeBytes)} frei`;
    logger.error(`Disk-Space-Check: ${msg} (target=${targetDir})`);
    throw new Error(msg);
  }
  logger.info(`Disk-Space-Check OK: ${humanSizeGB(freeBytes)} frei, ${humanSizeGB(requiredBytes)} benötigt (target=${targetDir})`);
}
```
**Step 2: Wire into `extractPackageArchives`**
In `extractPackageArchives()`, after the candidates are filtered (line ~1230, after the log line), add:
```typescript
// Disk space pre-check
try {
  await fs.promises.mkdir(options.targetDir, { recursive: true });
} catch { /* ignore */ }
await checkDiskSpaceForExtraction(options.targetDir, candidates);
```
This goes right after line 1230 (`logger.info(...)`) and before line 1231 (`if (candidates.length === 0)`).
**Step 3: Build and verify**
Run: `npm run build`
Expected: Compiles without errors.
**Step 4: Commit**
```
feat: add disk space pre-check before extraction
```
---
### Task 2: Disk Space Check — Tests
**Files:**
- Modify: `tests/extractor.test.ts`
**Step 1: Add disk space test**
Add a new `describe("disk space check")` block in the extractor test file. Since `checkDiskSpaceForExtraction` is a private function, test it indirectly via `extractPackageArchives` — create a temp dir with a tiny zip, mock `fs.promises.statfs` to return very low free space, and verify extraction fails with the right message.
```typescript
describe("disk space check", () => {
  it("aborts extraction when disk space is insufficient", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-diskspace-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });
    fs.mkdirSync(targetDir, { recursive: true });
    // Create a small zip
    const zip = new AdmZip();
    zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
    zip.writeZip(path.join(packageDir, "test.zip"));
    // Mock statfs to report almost no free space
    const originalStatfs = fs.promises.statfs;
    (fs.promises as any).statfs = async () => ({ bfree: 1, bsize: 1 });
    try {
      await expect(
        extractPackageArchives({
          packageDir,
          targetDir,
          cleanupMode: "none" as any,
          conflictMode: "overwrite" as any,
          removeLinks: false,
          removeSamples: false,
        })
      ).rejects.toThrow(/Nicht genug Speicherplatz/);
    } finally {
      (fs.promises as any).statfs = originalStatfs;
    }
  });

  it("proceeds when disk space is sufficient", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-diskspace-ok-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });
    fs.mkdirSync(targetDir, { recursive: true });
    const zip = new AdmZip();
    zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
    zip.writeZip(path.join(packageDir, "test.zip"));
    // Don't mock statfs — real disk should have enough space
    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none" as any,
      conflictMode: "overwrite" as any,
      removeLinks: false,
      removeSamples: false,
    });
    expect(result.extracted).toBe(1);
    expect(result.failed).toBe(0);
  });
});
```
**Step 2: Run tests**
Run: `npx vitest run tests/extractor.test.ts`
Expected: All tests pass including new disk space tests.
**Step 3: Commit**
```
test: add disk space pre-check tests
```
---
### Task 3: Nested Extraction — Implementation
**Files:**
- Modify: `src/main/extractor.ts` (in `extractPackageArchives`, after line ~1404 main loop ends)
**Step 1: Add the nested extraction blacklist constant**
Add near the top of the file (after ARCHIVE_SORT_COLLATOR, line 96):
```typescript
const NESTED_EXTRACT_BLACKLIST_RE = /\.(iso|img|bin|dmg)$/i;
```
**Step 2: Add nested extraction pass in `extractPackageArchives`**
After the main extraction for-loop ends (line ~1404, after the last `clearInterval(pulseTimer)`) and before the `if (extracted > 0)` block (line 1406), add:
```typescript
// ── Nested extraction: check output dir for archives produced by extraction ──
if (extracted > 0 && failed === 0 && !options.skipPostCleanup && !options.onlyArchives) {
  try {
    const nestedCandidates = (await findArchiveCandidates(options.targetDir))
      .filter((p) => !NESTED_EXTRACT_BLACKLIST_RE.test(p));
    if (nestedCandidates.length > 0) {
      logger.info(`Nested-Extraction: ${nestedCandidates.length} Archive im Output gefunden`);
      // Disk space check for nested archives too
      try {
        await checkDiskSpaceForExtraction(options.targetDir, nestedCandidates);
      } catch (spaceError) {
        logger.warn(`Nested-Extraction Disk-Space-Check fehlgeschlagen: ${String(spaceError)}`);
        // Don't fail the whole extraction, just skip nesting
        nestedCandidates.length = 0;
      }
      for (const nestedArchive of nestedCandidates) {
        if (options.signal?.aborted) {
          throw new Error("aborted:extract");
        }
        const nestedName = path.basename(nestedArchive);
        const nestedKey = archiveNameKey(nestedName);
        if (resumeCompleted.has(nestedKey)) {
          logger.info(`Nested-Extraction übersprungen (bereits entpackt): ${nestedName}`);
          continue;
        }
        const nestedStartedAt = Date.now();
        let nestedPercent = 0;
        emitProgress(extracted + failed, `nested: ${nestedName}`, "extracting", nestedPercent, 0);
        const nestedPulse = setInterval(() => {
          emitProgress(extracted + failed, `nested: ${nestedName}`, "extracting", nestedPercent, Date.now() - nestedStartedAt);
        }, 1100);
        const hybrid = Boolean(options.hybridMode);
        logger.info(`Nested-Entpacke: ${nestedName} -> ${options.targetDir}${hybrid ? " (hybrid)" : ""}`);
        try {
          const ext = path.extname(nestedArchive).toLowerCase();
          if (ext === ".zip") {
            try {
              await extractZipArchive(nestedArchive, options.targetDir, options.conflictMode, options.signal);
              nestedPercent = 100;
            } catch (zipErr) {
              if (!shouldFallbackToExternalZip(zipErr)) throw zipErr;
              const usedPw = await runExternalExtract(nestedArchive, options.targetDir, options.conflictMode, passwordCandidates, (v) => { nestedPercent = Math.max(nestedPercent, v); }, options.signal, hybrid);
              passwordCandidates = prioritizePassword(passwordCandidates, usedPw);
            }
          } else {
            const usedPw = await runExternalExtract(nestedArchive, options.targetDir, options.conflictMode, passwordCandidates, (v) => { nestedPercent = Math.max(nestedPercent, v); }, options.signal, hybrid);
            passwordCandidates = prioritizePassword(passwordCandidates, usedPw);
          }
          extracted += 1;
          resumeCompleted.add(nestedKey);
          await writeExtractResumeState(options.packageDir, resumeCompleted, options.packageId);
          logger.info(`Nested-Entpacken erfolgreich: ${nestedName}`);
          // Cleanup nested archive after successful extraction
          if (options.cleanupMode === "delete") {
            const nestedParts = collectArchiveCleanupTargets(nestedArchive);
            for (const part of nestedParts) {
              try { await fs.promises.unlink(part); } catch { /* ignore */ }
            }
          }
        } catch (nestedErr) {
          const errText = String(nestedErr);
          if (isExtractAbortError(errText)) throw new Error("aborted:extract");
          if (isNoExtractorError(errText)) {
            logger.warn(`Nested-Extraction: Kein Extractor, überspringe restliche`);
            break;
          }
          failed += 1;
          lastError = errText;
          logger.error(`Nested-Entpack-Fehler ${nestedName}: ${errText}`);
        } finally {
          clearInterval(nestedPulse);
        }
      }
    }
  } catch (nestedError) {
    const errText = String(nestedError);
    if (isExtractAbortError(errText)) throw new Error("aborted:extract");
    logger.warn(`Nested-Extraction Fehler: ${cleanErrorText(errText)}`);
  }
}
```
**Step 3: Build and verify**
Run: `npm run build`
Expected: Compiles without errors.
**Step 4: Commit**
```
feat: add single-level nested archive extraction
```
---
### Task 4: Nested Extraction — Tests
**Files:**
- Modify: `tests/extractor.test.ts`
**Step 1: Add nested extraction test**
```typescript
describe("nested extraction", () => {
  it("extracts archives found inside extracted output", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-nested-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });
    fs.mkdirSync(targetDir, { recursive: true });
    // Create inner zip with a text file
    const innerZip = new AdmZip();
    innerZip.addFile("deep.txt", Buffer.from("deep content"));
    // Create outer zip containing the inner zip
    const outerZip = new AdmZip();
    outerZip.addFile("inner.zip", innerZip.toBuffer());
    outerZip.writeZip(path.join(packageDir, "outer.zip"));
    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none" as any,
      conflictMode: "overwrite" as any,
      removeLinks: false,
      removeSamples: false,
    });
    // outer.zip extracted (1) + inner.zip extracted (1) = 2
    expect(result.extracted).toBe(2);
    expect(result.failed).toBe(0);
    // deep.txt should exist in the target
    expect(fs.existsSync(path.join(targetDir, "deep.txt"))).toBe(true);
  });

  it("does not extract blacklisted extensions like .iso", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-nested-bl-"));
    tempDirs.push(root);
    const packageDir = path.join(root, "pkg");
    const targetDir = path.join(root, "out");
    fs.mkdirSync(packageDir, { recursive: true });
    fs.mkdirSync(targetDir, { recursive: true });
    // Create a zip that contains a file named "disc.iso"
    // (not a real archive, but tests the blacklist filter path)
    const zip = new AdmZip();
    zip.addFile("disc.iso", Buffer.alloc(64, 0));
    zip.addFile("readme.txt", Buffer.from("hello"));
    zip.writeZip(path.join(packageDir, "package.zip"));
    const result = await extractPackageArchives({
      packageDir,
      targetDir,
      cleanupMode: "none" as any,
      conflictMode: "overwrite" as any,
      removeLinks: false,
      removeSamples: false,
    });
    expect(result.extracted).toBe(1); // Only outer zip, no nested
    expect(fs.existsSync(path.join(targetDir, "disc.iso"))).toBe(true);
    expect(fs.existsSync(path.join(targetDir, "readme.txt"))).toBe(true);
  });
});
```
**Step 2: Run all extractor tests**
Run: `npx vitest run tests/extractor.test.ts`
Expected: All tests pass.
**Step 3: Commit**
```
test: add nested extraction tests
```
---
### Task 5: Full Build + Test Verification
**Step 1: Build**
Run: `npm run build`
Expected: Clean compile.
**Step 2: Run all fast tests**
Run: `npx vitest run tests/extractor.test.ts tests/utils.test.ts tests/storage.test.ts tests/integrity.test.ts tests/cleanup.test.ts tests/debrid.test.ts tests/auto-rename.test.ts`
Expected: All pass (except pre-existing cleanup.test.ts failures).
**Step 3: Final commit with version bump if releasing**
```
feat: disk space pre-check + nested extraction (JD2 parity)
```


@@ -1,6 +1,6 @@
 {
   "name": "real-debrid-downloader",
-  "version": "1.5.42",
+  "version": "1.5.43",
   "description": "Real-Debrid Downloader Desktop (Electron + React + TypeScript)",
   "main": "build/main/main/main.js",
   "author": "Sucukdeluxe",
@@ -53,6 +53,7 @@
   "files": [
     "build/main/**/*",
     "build/renderer/**/*",
+    "resources/extractor-jvm/**/*",
     "package.json"
   ],
   "win": {


@@ -0,0 +1,22 @@
# JVM extractor runtime
This directory contains the Java sidecar runtime used by `src/main/extractor.ts`.
## Included backends
- `sevenzipjbinding` for the primary extraction path (RAR/7z/ZIP and others)
- `zip4j` for ZIP multipart handling (JD-style split ZIP behavior)
## Layout
- `classes/` compiled `JBindExtractorMain` classes
- `lib/` runtime jars required by the sidecar
- `src/` Java source for the sidecar
## Rebuild notes
The checked-in classes are Java 8 compatible and built from:
`resources/extractor-jvm/src/com/sucukdeluxe/extractor/JBindExtractorMain.java`
If you need to rebuild, compile against the jars in `lib/` with a Java 8-compatible compiler.
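For reference, the filename conventions that route an archive between the two backends can be sketched in TypeScript (patterns mirror the sidecar's `NUMBERED_ZIP_SPLIT_RE`/`OLD_ZIP_SPLIT_RE`/`SEVEN_ZIP_SPLIT_RE`; the helper name is illustrative):

```typescript
// Split-archive naming patterns the sidecar recognizes.
const NUMBERED_ZIP_SPLIT_RE = /\.zip\.\d{3}$/i; // archive.zip.001
const OLD_ZIP_SPLIT_RE = /\.z\d{2,3}$/i;        // archive.z01 / archive.z001
const SEVEN_ZIP_SPLIT_RE = /\.7z\.001$/i;       // archive.7z.001

// Split ZIP parts go to Zip4j; split 7z volumes go to SevenZipJBinding.
function splitArchiveBackend(name: string): "zip4j" | "sevenzip" | null {
  if (NUMBERED_ZIP_SPLIT_RE.test(name) || OLD_ZIP_SPLIT_RE.test(name)) return "zip4j";
  if (SEVEN_ZIP_SPLIT_RE.test(name)) return "sevenzip";
  return null;
}
```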


@@ -0,0 +1,12 @@
Bundled JVM extractor dependencies:
1) sevenzipjbinding (16.02-2.01)
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding-all-platforms
- Upstream: https://sevenzipjbind.sourceforge.net/
2) zip4j (2.11.5)
- Maven artifact: net.lingala.zip4j:zip4j
- Upstream: https://github.com/srikanth-lingala/zip4j
Please review upstream licenses and notices before redistribution.

Binary file not shown.

Binary file not shown.


@@ -0,0 +1,856 @@
package com.sucukdeluxe.extractor;
import net.lingala.zip4j.ZipFile;
import net.lingala.zip4j.exception.ZipException;
import net.lingala.zip4j.model.FileHeader;
import net.sf.sevenzipjbinding.ExtractOperationResult;
import net.sf.sevenzipjbinding.IArchiveOpenCallback;
import net.sf.sevenzipjbinding.IArchiveOpenVolumeCallback;
import net.sf.sevenzipjbinding.IInArchive;
import net.sf.sevenzipjbinding.IInStream;
import net.sf.sevenzipjbinding.ISequentialOutStream;
import net.sf.sevenzipjbinding.ICryptoGetTextPassword;
import net.sf.sevenzipjbinding.PropID;
import net.sf.sevenzipjbinding.SevenZip;
import net.sf.sevenzipjbinding.SevenZipException;
import net.sf.sevenzipjbinding.impl.RandomAccessFileInStream;
import net.sf.sevenzipjbinding.impl.VolumedArchiveInStream;
import net.sf.sevenzipjbinding.simple.ISimpleInArchive;
import net.sf.sevenzipjbinding.simple.ISimpleInArchiveItem;
import java.io.Closeable;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Base64;
import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.regex.Pattern;
public final class JBindExtractorMain {
private static final int BUFFER_SIZE = 64 * 1024;
private static final Pattern NUMBERED_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.zip\\.\\d{3}$");
private static final Pattern OLD_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.z\\d{2,3}$");
private static final Pattern SEVEN_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.7z\\.001$");
private JBindExtractorMain() {
}
public static void main(String[] args) {
int exit = 1;
try {
ExtractionRequest request = parseArgs(args);
exit = runExtraction(request);
} catch (IllegalArgumentException error) {
emitError("Argumentfehler: " + safeMessage(error));
exit = 2;
} catch (Throwable error) {
emitError(safeMessage(error));
exit = 1;
}
System.exit(exit);
}
private static int runExtraction(ExtractionRequest request) throws Exception {
List<String> passwords = normalizePasswords(request.passwords);
Exception lastError = null;
boolean hadWrongPassword = false;
for (String password : passwords) {
try {
extractSingle(request, password);
emitPassword(password);
emitDone();
return 0;
} catch (WrongPasswordException wrongPassword) {
hadWrongPassword = true;
lastError = wrongPassword;
} catch (Exception error) {
lastError = error;
break;
}
}
if (hadWrongPassword && (lastError instanceof WrongPasswordException)) {
emitError("Falsches Archiv-Passwort");
return 1;
}
if (lastError != null) {
throw lastError;
}
emitError("Entpacken fehlgeschlagen");
return 1;
}
private static void extractSingle(ExtractionRequest request, String password) throws Exception {
Backend backend = request.backend;
if (backend == Backend.AUTO) {
backend = shouldUseZip4j(request.archiveFile) ? Backend.ZIP4J : Backend.SEVENZIPJBIND;
}
emitBackend(backend);
if (backend == Backend.ZIP4J) {
extractWithZip4j(request, password);
return;
}
extractWithSevenZip(request, password);
}
private static void extractWithZip4j(ExtractionRequest request, String password) throws Exception {
ZipFile zipFile = new ZipFile(request.archiveFile);
try {
if (password != null && password.length() > 0) {
zipFile.setPassword(password.toCharArray());
}
List<FileHeader> fileHeaders = zipFile.getFileHeaders();
if (fileHeaders == null) {
fileHeaders = new ArrayList<FileHeader>();
}
long totalUnits = 0;
boolean encrypted = false;
for (FileHeader header : fileHeaders) {
if (header == null || header.isDirectory()) {
continue;
}
encrypted = encrypted || header.isEncrypted();
totalUnits += safeSize(header.getUncompressedSize());
}
ProgressTracker progress = new ProgressTracker(totalUnits);
progress.emitStart();
Set<String> reserved = new HashSet<String>();
for (FileHeader header : fileHeaders) {
if (header == null) {
continue;
}
String entryName = normalizeEntryName(header.getFileName(), "file");
if (header.isDirectory()) {
File dir = resolveDirectory(request.targetDir, entryName);
ensureDirectory(dir);
reserved.add(pathKey(dir));
continue;
}
long itemUnits = safeSize(header.getUncompressedSize());
File output = resolveOutputFile(request.targetDir, entryName, request.conflictMode, reserved);
if (output == null) {
progress.advance(itemUnits);
continue;
}
ensureDirectory(output.getParentFile());
long[] remaining = new long[] { itemUnits };
try {
InputStream in = zipFile.getInputStream(header);
OutputStream out = new FileOutputStream(output);
try {
byte[] buffer = new byte[BUFFER_SIZE];
while (true) {
int read = in.read(buffer);
if (read < 0) {
break;
}
if (read == 0) {
continue;
}
out.write(buffer, 0, read);
long accounted = Math.min(remaining[0], (long) read);
remaining[0] -= accounted;
progress.advance(accounted);
}
} finally {
try {
out.close();
} catch (Throwable ignored) {
}
try {
in.close();
} catch (Throwable ignored) {
}
}
if (remaining[0] > 0) {
progress.advance(remaining[0]);
}
long modified = header.getLastModifiedTimeEpoch();
if (modified > 0) {
output.setLastModified(modified);
}
} catch (ZipException error) {
if (isWrongPassword(error, encrypted)) {
throw new WrongPasswordException(error);
}
throw error;
}
}
progress.emitDone();
} finally {
try {
zipFile.close();
} catch (Throwable ignored) {
}
}
}
private static void extractWithSevenZip(ExtractionRequest request, String password) throws Exception {
SevenZipArchiveContext context = null;
try {
context = openSevenZipArchive(request.archiveFile, password);
IInArchive archive = context.archive;
ISimpleInArchive simple = archive.getSimpleInterface();
ISimpleInArchiveItem[] items = simple.getArchiveItems();
long totalUnits = 0;
boolean encrypted = false;
for (ISimpleInArchiveItem item : items) {
if (item == null || item.isFolder()) {
continue;
}
try {
encrypted = encrypted || item.isEncrypted();
} catch (Throwable ignored) {
// ignore encrypted flag read issues
}
totalUnits += safeSize(item.getSize());
}
ProgressTracker progress = new ProgressTracker(totalUnits);
progress.emitStart();
Set<String> reserved = new HashSet<String>();
for (ISimpleInArchiveItem item : items) {
if (item == null) {
continue;
}
String entryName = normalizeEntryName(item.getPath(), "item-" + item.getItemIndex());
if (item.isFolder()) {
File dir = resolveDirectory(request.targetDir, entryName);
ensureDirectory(dir);
reserved.add(pathKey(dir));
continue;
}
long itemUnits = safeSize(item.getSize());
File output = resolveOutputFile(request.targetDir, entryName, request.conflictMode, reserved);
if (output == null) {
progress.advance(itemUnits);
continue;
}
ensureDirectory(output.getParentFile());
final FileOutputStream out = new FileOutputStream(output);
final long[] remaining = new long[] { itemUnits };
try {
ExtractOperationResult result = item.extractSlow(new ISequentialOutStream() {
@Override
public int write(byte[] data) throws SevenZipException {
if (data == null || data.length == 0) {
return 0;
}
try {
out.write(data);
} catch (IOException error) {
throw new SevenZipException("Fehler beim Schreiben: " + error.getMessage(), error);
}
long accounted = Math.min(remaining[0], (long) data.length);
remaining[0] -= accounted;
progress.advance(accounted);
return data.length;
}
});
if (remaining[0] > 0) {
progress.advance(remaining[0]);
}
if (result != ExtractOperationResult.OK) {
if (isPasswordFailure(result, encrypted)) {
throw new WrongPasswordException(new IOException("Falsches Passwort"));
}
throw new IOException("7z-Fehler: " + result.name());
}
} catch (SevenZipException error) {
if (looksLikeWrongPassword(error, encrypted)) {
throw new WrongPasswordException(error);
}
throw error;
} finally {
try {
out.close();
} catch (Throwable ignored) {
}
}
try {
java.util.Date modified = item.getLastWriteTime();
if (modified != null) {
output.setLastModified(modified.getTime());
}
} catch (Throwable ignored) {
// best effort
}
}
progress.emitDone();
} finally {
if (context != null) {
context.close();
}
}
}
private static SevenZipArchiveContext openSevenZipArchive(File archiveFile, String password) throws Exception {
String nameLower = archiveFile.getName().toLowerCase(Locale.ROOT);
String effectivePassword = password == null ? "" : password;
SevenZipVolumeCallback callback = new SevenZipVolumeCallback(archiveFile, effectivePassword);
if (SEVEN_ZIP_SPLIT_RE.matcher(nameLower).matches()) {
VolumedArchiveInStream volumed = new VolumedArchiveInStream(archiveFile.getName(), callback);
IInArchive archive = SevenZip.openInArchive(null, volumed, callback);
return new SevenZipArchiveContext(archive, null, volumed, callback);
}
RandomAccessFile raf = new RandomAccessFile(archiveFile, "r");
RandomAccessFileInStream stream = new RandomAccessFileInStream(raf);
IInArchive archive = SevenZip.openInArchive(null, stream, callback);
return new SevenZipArchiveContext(archive, stream, null, callback);
}
private static boolean isWrongPassword(ZipException error, boolean encrypted) {
if (error == null) {
return false;
}
if (error.getType() == ZipException.Type.WRONG_PASSWORD) {
return true;
}
String text = safeMessage(error).toLowerCase(Locale.ROOT);
if (text.contains("wrong password") || text.contains("falsches passwort")) {
return true;
}
return encrypted && (text.contains("checksum") || text.contains("crc") || text.contains("password"));
}
private static boolean isPasswordFailure(ExtractOperationResult result, boolean encrypted) {
if (!encrypted || result == null) {
return false;
}
return result == ExtractOperationResult.CRCERROR || result == ExtractOperationResult.DATAERROR;
}
private static boolean looksLikeWrongPassword(Throwable error, boolean encrypted) {
if (error == null) {
return false;
}
String text = safeMessage(error).toLowerCase(Locale.ROOT);
if (text.contains("wrong password") || text.contains("falsches passwort")) {
return true;
}
return encrypted && (text.contains("crc") || text.contains("data error") || text.contains("checksum"));
}
private static boolean shouldUseZip4j(File archiveFile) {
String name = archiveFile.getName().toLowerCase(Locale.ROOT);
if (NUMBERED_ZIP_SPLIT_RE.matcher(name).matches()) {
return true;
}
if (OLD_ZIP_SPLIT_RE.matcher(name).matches()) {
return true;
}
if (name.endsWith(".zip")) {
File parent = archiveFile.getParentFile();
if (parent == null || !parent.exists()) {
return false;
}
String stem = archiveFile.getName().substring(0, archiveFile.getName().length() - 4);
File[] siblings = parent.listFiles();
if (siblings == null) {
return false;
}
String prefix = (stem + ".z").toLowerCase(Locale.ROOT);
for (File sibling : siblings) {
String siblingName = sibling.getName().toLowerCase(Locale.ROOT);
if (!sibling.isFile()) {
continue;
}
if (siblingName.startsWith(prefix) && siblingName.length() >= prefix.length() + 2) {
String suffix = siblingName.substring(prefix.length());
if (suffix.matches("\\d{2,3}")) {
return true;
}
}
}
}
return false;
}
private static File resolveDirectory(File targetDir, String entryName) throws IOException {
File directory = secureResolve(targetDir, entryName);
return directory;
}
private static File resolveOutputFile(File targetDir, String entryName, ConflictMode conflictMode, Set<String> reserved) throws IOException {
File base = secureResolve(targetDir, entryName);
String key = pathKey(base);
boolean exists = base.exists() || reserved.contains(key);
if (!exists) {
reserved.add(key);
return base;
}
if (conflictMode == ConflictMode.SKIP) {
return null;
}
if (conflictMode == ConflictMode.OVERWRITE) {
if (base.exists()) {
deleteRecursively(base);
}
reserved.add(key);
return base;
}
File parent = base.getParentFile();
String fileName = base.getName();
int dot = fileName.lastIndexOf('.');
String stem = dot > 0 ? fileName.substring(0, dot) : fileName;
String ext = dot > 0 ? fileName.substring(dot) : "";
int counter = 1;
while (counter <= 10000) {
String candidateName = stem + " (" + counter + ")" + ext;
File candidate = new File(parent, candidateName);
String candidateKey = pathKey(candidate);
if (!candidate.exists() && !reserved.contains(candidateKey)) {
reserved.add(candidateKey);
return candidate;
}
counter += 1;
}
throw new IOException("Rename-Limit erreicht für " + entryName);
}
private static void deleteRecursively(File file) throws IOException {
if (file == null || !file.exists()) {
return;
}
if (file.isDirectory()) {
File[] children = file.listFiles();
if (children != null) {
for (File child : children) {
deleteRecursively(child);
}
}
}
if (!file.delete()) {
throw new IOException("Konnte Datei nicht überschreiben: " + file.getAbsolutePath());
}
}
private static File secureResolve(File targetDir, String entryName) throws IOException {
String normalized = normalizeEntryName(entryName, "file");
while (normalized.startsWith("/")) {
normalized = normalized.substring(1);
}
while (normalized.startsWith("\\")) {
normalized = normalized.substring(1);
}
if (normalized.matches("^[a-zA-Z]:.*")) {
normalized = normalized.substring(2);
}
File targetCanonical = targetDir.getCanonicalFile();
File output = new File(targetCanonical, normalized);
File outputCanonical = output.getCanonicalFile();
String targetPath = targetCanonical.getPath();
String outputPath = outputCanonical.getPath();
String targetPathNorm = isWindows() ? targetPath.toLowerCase(Locale.ROOT) : targetPath;
String outputPathNorm = isWindows() ? outputPath.toLowerCase(Locale.ROOT) : outputPath;
if (!outputPathNorm.equals(targetPathNorm) && !outputPathNorm.startsWith(targetPathNorm + File.separator)) {
throw new IOException("Path Traversal blockiert: " + entryName);
}
return outputCanonical;
}
private static String normalizeEntryName(String value, String fallback) {
String entry = value == null ? "" : value.trim();
if (entry.length() == 0) {
return fallback;
}
entry = entry.replace('\\', '/');
while (entry.startsWith("./")) {
entry = entry.substring(2);
}
if (entry.length() == 0) {
return fallback;
}
return entry;
}
private static long safeSize(Long value) {
if (value == null) {
return 1;
}
long size = value.longValue();
if (size <= 0) {
return 1;
}
return size;
}
private static void ensureDirectory(File dir) throws IOException {
if (dir == null) {
return;
}
if (dir.exists()) {
if (!dir.isDirectory()) {
throw new IOException("Pfad ist keine Directory: " + dir.getAbsolutePath());
}
return;
}
if (!dir.mkdirs() && !dir.isDirectory()) {
throw new IOException("Verzeichnis konnte nicht erstellt werden: " + dir.getAbsolutePath());
}
}
private static String pathKey(File file) {
String value = file.getAbsolutePath();
if (isWindows()) {
value = value.toLowerCase(Locale.ROOT);
}
return value;
}
private static boolean isWindows() {
String osName = System.getProperty("os.name", "").toLowerCase(Locale.ROOT);
return osName.contains("win");
}
private static List<String> normalizePasswords(List<String> input) {
LinkedHashSet<String> deduped = new LinkedHashSet<String>();
deduped.add("");
if (input != null) {
for (String value : input) {
deduped.add(value == null ? "" : value);
}
}
return new ArrayList<String>(deduped);
}
private static ExtractionRequest parseArgs(String[] args) {
ExtractionRequest request = new ExtractionRequest();
int index = 0;
while (index < args.length) {
String key = args[index];
if ("--archive".equals(key)) {
request.archiveFile = new File(readNext(args, ++index, key));
} else if ("--target".equals(key)) {
request.targetDir = new File(readNext(args, ++index, key));
} else if ("--conflict".equals(key)) {
request.conflictMode = ConflictMode.fromValue(readNext(args, ++index, key));
} else if ("--backend".equals(key)) {
request.backend = Backend.fromValue(readNext(args, ++index, key));
} else if ("--password".equals(key)) {
request.passwords.add(readNext(args, ++index, key));
} else {
throw new IllegalArgumentException("Unbekanntes Argument: " + key);
}
index += 1;
}
if (request.archiveFile == null) {
throw new IllegalArgumentException("--archive fehlt");
}
if (request.targetDir == null) {
throw new IllegalArgumentException("--target fehlt");
}
if (!request.archiveFile.exists() || !request.archiveFile.isFile()) {
throw new IllegalArgumentException("Archiv nicht gefunden: " + request.archiveFile.getAbsolutePath());
}
return request;
}
private static String readNext(String[] args, int index, String key) {
if (index >= args.length) {
throw new IllegalArgumentException("Wert fehlt fuer " + key);
}
return args[index];
}
private static String safeMessage(Throwable error) {
if (error == null) {
return "Unbekannter Fehler";
}
String message = error.getMessage();
if (message == null || message.trim().length() == 0) {
message = error.toString();
}
return message.replace('\n', ' ').replace('\r', ' ').trim();
}
private static void emitBackend(Backend backend) {
System.out.println("RD_BACKEND " + backend.value);
}
private static void emitPassword(String password) {
String encoded = Base64.getEncoder().encodeToString((password == null ? "" : password).getBytes(StandardCharsets.UTF_8));
System.out.println("RD_PASSWORD " + encoded);
}
private static void emitDone() {
System.out.println("RD_DONE");
}
private static void emitError(String message) {
System.err.println("RD_ERROR " + message);
}
private enum Backend {
AUTO("auto"),
SEVENZIPJBIND("7zjbinding"),
ZIP4J("zip4j");
private final String value;
Backend(String value) {
this.value = value;
}
static Backend fromValue(String raw) {
String value = raw == null ? "" : raw.trim().toLowerCase(Locale.ROOT);
if ("auto".equals(value)) {
return AUTO;
}
if ("7zjb".equals(value) || "7zjbinding".equals(value) || "sevenzipjbinding".equals(value)) {
return SEVENZIPJBIND;
}
if ("zip4j".equals(value)) {
return ZIP4J;
}
throw new IllegalArgumentException("Ungueltiger Backend-Wert: " + raw);
}
}
private enum ConflictMode {
OVERWRITE,
SKIP,
RENAME;
static ConflictMode fromValue(String raw) {
String value = raw == null ? "" : raw.trim().toLowerCase(Locale.ROOT);
if ("overwrite".equals(value)) {
return OVERWRITE;
}
if ("skip".equals(value) || "ask".equals(value)) {
return SKIP;
}
if ("rename".equals(value)) {
return RENAME;
}
throw new IllegalArgumentException("Ungueltiger Conflict-Wert: " + raw);
}
}
private static final class ExtractionRequest {
private File archiveFile;
private File targetDir;
private ConflictMode conflictMode = ConflictMode.SKIP;
private Backend backend = Backend.AUTO;
private final List<String> passwords = new ArrayList<String>();
}
private static final class WrongPasswordException extends Exception {
private static final long serialVersionUID = 1L;
WrongPasswordException(Throwable cause) {
super(cause);
}
}
private static final class ProgressTracker {
private final long total;
private long completed;
private int lastPercent = -1;
ProgressTracker(long totalUnits) {
this.total = Math.max(1L, totalUnits);
this.completed = 0L;
}
synchronized void emitStart() {
emitPercent(0);
}
synchronized void advance(long units) {
if (units <= 0) {
return;
}
completed += units;
if (completed > total) {
completed = total;
}
int percent = (int) Math.min(100L, Math.max(0L, (completed * 100L) / total));
emitPercent(percent);
}
synchronized void emitDone() {
completed = total;
emitPercent(100);
}
private void emitPercent(int percent) {
int bounded = Math.max(0, Math.min(100, percent));
if (bounded == lastPercent) {
return;
}
lastPercent = bounded;
System.out.println("RD_PROGRESS " + bounded + "%");
}
}
private static final class SevenZipArchiveContext implements Closeable {
private final IInArchive archive;
private final IInStream rootStream;
private final VolumedArchiveInStream volumedArchiveInStream;
private final SevenZipVolumeCallback callback;
SevenZipArchiveContext(IInArchive archive, IInStream rootStream, VolumedArchiveInStream volumedArchiveInStream, SevenZipVolumeCallback callback) {
this.archive = archive;
this.rootStream = rootStream;
this.volumedArchiveInStream = volumedArchiveInStream;
this.callback = callback;
}
@Override
public void close() {
if (archive != null) {
try {
archive.close();
} catch (Throwable ignored) {
}
}
if (rootStream != null) {
try {
rootStream.close();
} catch (Throwable ignored) {
}
}
if (volumedArchiveInStream != null) {
try {
volumedArchiveInStream.close();
} catch (Throwable ignored) {
}
}
if (callback != null) {
callback.close();
}
}
}
private static final class SevenZipVolumeCallback implements IArchiveOpenCallback, IArchiveOpenVolumeCallback, ICryptoGetTextPassword, Closeable {
private final File archiveDir;
private final String firstFileName;
private final String password;
private final Map<String, RandomAccessFile> openRafs = new HashMap<String, RandomAccessFile>();
SevenZipVolumeCallback(File archiveFile, String password) {
this.archiveDir = archiveFile.getAbsoluteFile().getParentFile();
this.firstFileName = archiveFile.getName();
this.password = password == null ? "" : password;
}
@Override
public Object getProperty(PropID propID) {
if (propID == PropID.NAME) {
return firstFileName;
}
return null;
}
@Override
public IInStream getStream(String filename) throws SevenZipException {
File file = resolveVolumeFile(filename);
if (file == null || !file.exists() || !file.isFile()) {
return null;
}
try {
String key = pathKey(file);
RandomAccessFile raf = openRafs.get(key);
if (raf == null) {
raf = new RandomAccessFile(file, "r");
openRafs.put(key, raf);
}
raf.seek(0L);
return new RandomAccessFileInStream(raf);
} catch (IOException error) {
throw new SevenZipException("Volume konnte nicht geoeffnet werden: " + filename, error);
}
}
@Override
public void setTotal(Long files, Long bytes) {
// no-op
}
@Override
public void setCompleted(Long files, Long bytes) {
// no-op
}
@Override
public String cryptoGetTextPassword() {
return password;
}
private File resolveVolumeFile(String filename) {
if (filename == null || filename.trim().length() == 0) {
return null;
}
File direct = new File(filename);
if (direct.isAbsolute() && direct.exists()) {
return direct;
}
if (archiveDir != null) {
File relative = new File(archiveDir, filename);
if (relative.exists()) {
return relative;
}
File[] siblings = archiveDir.listFiles();
if (siblings != null) {
for (File sibling : siblings) {
if (!sibling.isFile()) {
continue;
}
if (sibling.getName().equalsIgnoreCase(filename)) {
return sibling;
}
}
}
}
return direct.exists() ? direct : null;
}
@Override
public void close() {
for (RandomAccessFile raf : openRafs.values()) {
try {
raf.close();
} catch (Throwable ignored) {
}
}
openRafs.clear();
}
}
}
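The emit helpers above define the sidecar's line-oriented stdout protocol (`RD_PROGRESS`, `RD_PASSWORD`, `RD_BACKEND`, `RD_DONE`, `RD_ERROR`), with the password Base64-encoded as in `emitPassword`. A minimal standalone decoder for one such transcript — a sketch, not the app's parser; the transcript and the `decodeTranscript` name are illustrative:

```typescript
// Decode a sample sidecar transcript into a summary object.
// RD_PASSWORD payloads are Base64 (UTF-8), matching emitPassword() in the Java sidecar.
type SidecarSummary = { percent: number; password: string; backend: string; done: boolean };

function decodeTranscript(lines: string[]): SidecarSummary {
  const summary: SidecarSummary = { percent: 0, password: "", backend: "", done: false };
  for (const line of lines) {
    if (line.startsWith("RD_PROGRESS ")) {
      // parseInt stops at the trailing "%" emitted by ProgressTracker
      summary.percent = parseInt(line.slice("RD_PROGRESS ".length), 10);
    } else if (line.startsWith("RD_PASSWORD ")) {
      summary.password = Buffer.from(line.slice("RD_PASSWORD ".length), "base64").toString("utf8");
    } else if (line.startsWith("RD_BACKEND ")) {
      summary.backend = line.slice("RD_BACKEND ".length);
    } else if (line === "RD_DONE") {
      summary.done = true;
    }
  }
  return summary;
}

// Illustrative transcript; "c2VjcmV0" is Base64 for "secret".
const summary = decodeTranscript([
  "RD_BACKEND 7zjbinding",
  "RD_PROGRESS 50%",
  "RD_PROGRESS 100%",
  "RD_PASSWORD c2VjcmV0",
  "RD_DONE"
]);
console.log(summary);
```

The real consumer on the Electron side is `parseJvmLine` below; this block only demonstrates the wire format.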


@@ -9,6 +9,15 @@ import { removeDownloadLinkArtifacts, removeSampleArtifacts } from "./cleanup";
const DEFAULT_ARCHIVE_PASSWORDS = ["", "serienfans.org", "serienjunkies.org"];
const NO_EXTRACTOR_MESSAGE = "WinRAR/UnRAR nicht gefunden. Bitte WinRAR installieren.";
const NO_JVM_EXTRACTOR_MESSAGE = "7-Zip-JBinding Runtime nicht gefunden. Bitte resources/extractor-jvm prüfen.";
const JVM_EXTRACTOR_MAIN_CLASS = "com.sucukdeluxe.extractor.JBindExtractorMain";
const JVM_EXTRACTOR_CLASSES_SUBDIR = "classes";
const JVM_EXTRACTOR_LIB_SUBDIR = "lib";
const JVM_EXTRACTOR_REQUIRED_LIBS = [
"sevenzipjbinding.jar",
"sevenzipjbinding-all-platforms.jar",
"zip4j.jar"
];
// ── subst drive mapping for long paths on Windows ──
const SUBST_THRESHOLD = 100;
@@ -296,6 +305,9 @@ function parseProgressPercent(chunk: string): number | null {
}
async function shouldPreferExternalZip(archivePath: string): Promise<boolean> {
if (extractorBackendMode() !== "legacy") {
return true;
}
try {
const stat = await fs.promises.stat(archivePath);
return stat.size >= 64 * 1024 * 1024;
@@ -680,6 +692,365 @@ function runExtractCommand(
});
}
type ExtractBackendMode = "auto" | "jvm" | "legacy";
type JvmExtractorLayout = {
javaCommand: string;
classPath: string;
rootDir: string;
};
type JvmExtractResult = {
ok: boolean;
missingCommand: boolean;
missingRuntime: boolean;
aborted: boolean;
timedOut: boolean;
errorText: string;
usedPassword: string;
backend: string;
};
function extractorBackendMode(): ExtractBackendMode {
const defaultMode = process.env.VITEST ? "legacy" : "auto";
const raw = String(process.env.RD_EXTRACT_BACKEND || defaultMode).trim().toLowerCase();
if (raw === "legacy") {
return "legacy";
}
if (raw === "jvm" || raw === "jbind" || raw === "7zjbinding") {
return "jvm";
}
return "auto";
}
function isJvmRuntimeMissingError(errorText: string): boolean {
const text = String(errorText || "").toLowerCase();
return text.includes("could not find or load main class")
|| text.includes("classnotfoundexception")
|| text.includes("noclassdeffounderror")
|| text.includes("unsatisfiedlinkerror")
|| text.includes("enoent");
}
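Per the commit note about false positives, only bootstrap-level failures (missing main class, missing native library, missing binary) count as a missing JVM runtime; genuine extraction errors must still surface normally instead of silently triggering the legacy fallback. A standalone check using a verbatim copy of the predicate (the sample error strings are illustrative):

```typescript
// Verbatim copy of the predicate above, for a self-contained demonstration.
function isJvmRuntimeMissingError(errorText: string): boolean {
  const text = String(errorText || "").toLowerCase();
  return text.includes("could not find or load main class")
    || text.includes("classnotfoundexception")
    || text.includes("noclassdeffounderror")
    || text.includes("unsatisfiedlinkerror")
    || text.includes("enoent");
}

// Bootstrap failure: JVM could not even start the sidecar -> runtime missing.
console.log(isJvmRuntimeMissingError(
  "Error: Could not find or load main class com.sucukdeluxe.extractor.JBindExtractorMain"
)); // true
// Ordinary extraction failure: must NOT be mistaken for a missing runtime.
console.log(isJvmRuntimeMissingError("RD_ERROR Archiv beschaedigt: CRC Fehler")); // false
```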
function resolveJavaCommandCandidates(): string[] {
const programFiles = process.env.ProgramFiles || "C:\\Program Files";
const programFilesX86 = process.env["ProgramFiles(x86)"] || "C:\\Program Files (x86)";
const localAppData = process.env.LOCALAPPDATA || "";
const candidates = [
process.env.RD_JAVA_BIN || "",
path.join(programFiles, "JDownloader", "jre", "bin", "java.exe"),
path.join(programFilesX86, "JDownloader", "jre", "bin", "java.exe"),
localAppData ? path.join(localAppData, "JDownloader", "jre", "bin", "java.exe") : "",
"java"
].filter(Boolean);
return Array.from(new Set(candidates));
}
function resolveJvmExtractorRootCandidates(): string[] {
const fromEnv = String(process.env.RD_EXTRACTOR_JVM_DIR || "").trim();
const electronResourcesPath = (process as NodeJS.Process & { resourcesPath?: string }).resourcesPath || "";
const candidates = [
fromEnv,
path.join(process.cwd(), "resources", "extractor-jvm"),
path.join(process.cwd(), "build", "resources", "extractor-jvm"),
path.join(__dirname, "..", "..", "..", "resources", "extractor-jvm"),
electronResourcesPath ? path.join(electronResourcesPath, "extractor-jvm") : ""
].filter(Boolean);
return Array.from(new Set(candidates));
}
let cachedJvmLayout: JvmExtractorLayout | null | undefined;
function resolveJvmExtractorLayout(): JvmExtractorLayout | null {
if (cachedJvmLayout !== undefined) {
return cachedJvmLayout;
}
const javaCandidates = resolveJavaCommandCandidates();
const javaCommand = javaCandidates.find((candidate) => {
if (!candidate) {
return false;
}
if (!isAbsoluteCommand(candidate)) {
return true;
}
return fs.existsSync(candidate);
}) || "";
if (!javaCommand) {
return null;
}
for (const rootDir of resolveJvmExtractorRootCandidates()) {
const classesDir = path.join(rootDir, JVM_EXTRACTOR_CLASSES_SUBDIR);
if (!fs.existsSync(classesDir)) {
continue;
}
const libs = JVM_EXTRACTOR_REQUIRED_LIBS.map((name) => path.join(rootDir, JVM_EXTRACTOR_LIB_SUBDIR, name));
if (libs.some((filePath) => !fs.existsSync(filePath))) {
continue;
}
const classPath = [classesDir, ...libs].join(path.delimiter);
const layout = { javaCommand, classPath, rootDir };
cachedJvmLayout = layout;
return layout;
}
cachedJvmLayout = null;
return null;
}
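`resolveJvmExtractorLayout` uses a tri-state cache: `undefined` means "not yet computed" and `null` means "computed, nothing found", so a completed scan that finds no layout is also remembered instead of re-probing the filesystem. A minimal sketch of that pattern in isolation (`memoizeNullable` is a hypothetical name, not part of the codebase):

```typescript
// Tri-state memoization: undefined = not computed yet, null = computed but absent.
function memoizeNullable<T>(compute: () => T | null): () => T | null {
  let cached: T | null | undefined;
  return () => {
    if (cached !== undefined) {
      return cached; // cache hit, including the cached "nothing found" (null) result
    }
    cached = compute();
    return cached;
  };
}

let calls = 0;
const lookup = memoizeNullable<string>(() => {
  calls += 1;
  return null; // simulate "layout not found"
});
lookup();
lookup();
console.log(calls); // 1: the null result is cached, not recomputed
```

Note that the function above deliberately returns early without touching the cache when no Java binary is found at all, so only a full (and expensive) directory scan gets memoized.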
function parseJvmLine(
line: string,
onArchiveProgress: ((percent: number) => void) | undefined,
state: { bestPercent: number; usedPassword: string; backend: string; reportedError: string }
): void {
const trimmed = String(line || "").trim();
if (!trimmed) {
return;
}
if (trimmed.startsWith("RD_PROGRESS ")) {
const parsed = parseProgressPercent(trimmed);
if (parsed !== null && parsed > state.bestPercent) {
state.bestPercent = parsed;
onArchiveProgress?.(parsed);
}
return;
}
if (trimmed.startsWith("RD_PASSWORD ")) {
const encoded = trimmed.slice("RD_PASSWORD ".length).trim();
try {
state.usedPassword = Buffer.from(encoded, "base64").toString("utf8");
} catch {
state.usedPassword = "";
}
return;
}
if (trimmed.startsWith("RD_BACKEND ")) {
state.backend = trimmed.slice("RD_BACKEND ".length).trim();
return;
}
if (trimmed.startsWith("RD_ERROR ")) {
state.reportedError = trimmed.slice("RD_ERROR ".length).trim();
}
}
function runJvmExtractCommand(
layout: JvmExtractorLayout,
archivePath: string,
targetDir: string,
conflictMode: ConflictMode,
passwordCandidates: string[],
onArchiveProgress?: (percent: number) => void,
signal?: AbortSignal,
timeoutMs?: number
): Promise<JvmExtractResult> {
if (signal?.aborted) {
return Promise.resolve({
ok: false,
missingCommand: false,
missingRuntime: false,
aborted: true,
timedOut: false,
errorText: "aborted:extract",
usedPassword: "",
backend: ""
});
}
const mode = effectiveConflictMode(conflictMode);
const args = [
"-Dfile.encoding=UTF-8",
"-Xms32m",
"-Xmx512m",
"-cp",
layout.classPath,
JVM_EXTRACTOR_MAIN_CLASS,
"--archive",
archivePath,
"--target",
targetDir,
"--conflict",
mode,
"--backend",
"auto"
];
for (const password of passwordCandidates) {
args.push("--password", password);
}
return new Promise((resolve) => {
let settled = false;
let output = "";
let timeoutId: NodeJS.Timeout | null = null;
let timedOutByWatchdog = false;
let abortedBySignal = false;
let onAbort: (() => void) | null = null;
const parseState = { bestPercent: 0, usedPassword: "", backend: "", reportedError: "" };
let stdoutBuffer = "";
let stderrBuffer = "";
const child = spawn(layout.javaCommand, args, { windowsHide: true });
lowerExtractProcessPriority(child.pid);
const flushLines = (rawChunk: string, fromStdErr = false): void => {
if (!rawChunk) {
return;
}
output = appendLimited(output, rawChunk);
const nextBuffer = `${fromStdErr ? stderrBuffer : stdoutBuffer}${rawChunk}`;
const lines = nextBuffer.split(/\r?\n/);
const keep = lines.pop() || "";
for (const line of lines) {
parseJvmLine(line, onArchiveProgress, parseState);
}
if (fromStdErr) {
stderrBuffer = keep;
} else {
stdoutBuffer = keep;
}
};
const finish = (result: JvmExtractResult): void => {
if (settled) {
return;
}
settled = true;
if (timeoutId) {
clearTimeout(timeoutId);
timeoutId = null;
}
if (signal && onAbort) {
signal.removeEventListener("abort", onAbort);
}
resolve(result);
};
if (timeoutMs && timeoutMs > 0) {
timeoutId = setTimeout(() => {
timedOutByWatchdog = true;
killProcessTree(child);
finish({
ok: false,
missingCommand: false,
missingRuntime: false,
aborted: false,
timedOut: true,
errorText: `Entpacken Timeout nach ${Math.ceil(timeoutMs / 1000)}s`,
usedPassword: parseState.usedPassword,
backend: parseState.backend
});
}, timeoutMs);
}
onAbort = signal
? (): void => {
abortedBySignal = true;
killProcessTree(child);
finish({
ok: false,
missingCommand: false,
missingRuntime: false,
aborted: true,
timedOut: false,
errorText: "aborted:extract",
usedPassword: parseState.usedPassword,
backend: parseState.backend
});
}
: null;
if (signal && onAbort) {
signal.addEventListener("abort", onAbort, { once: true });
}
child.stdout.on("data", (chunk) => {
flushLines(String(chunk || ""), false);
});
child.stderr.on("data", (chunk) => {
flushLines(String(chunk || ""), true);
});
child.on("error", (error) => {
const text = cleanErrorText(String(error));
finish({
ok: false,
missingCommand: text.toLowerCase().includes("enoent"),
missingRuntime: true,
aborted: false,
timedOut: false,
errorText: text,
usedPassword: parseState.usedPassword,
backend: parseState.backend
});
});
child.on("close", (code) => {
parseJvmLine(stdoutBuffer, onArchiveProgress, parseState);
parseJvmLine(stderrBuffer, onArchiveProgress, parseState);
if (abortedBySignal) {
finish({
ok: false,
missingCommand: false,
missingRuntime: false,
aborted: true,
timedOut: false,
errorText: "aborted:extract",
usedPassword: parseState.usedPassword,
backend: parseState.backend
});
return;
}
if (timedOutByWatchdog) {
finish({
ok: false,
missingCommand: false,
missingRuntime: false,
aborted: false,
timedOut: true,
errorText: `Entpacken Timeout nach ${Math.ceil((timeoutMs || 0) / 1000)}s`,
usedPassword: parseState.usedPassword,
backend: parseState.backend
});
return;
}
const message = cleanErrorText(parseState.reportedError || output) || `Exit Code ${String(code ?? "?")}`;
if (code === 0) {
onArchiveProgress?.(100);
finish({
ok: true,
missingCommand: false,
missingRuntime: false,
aborted: false,
timedOut: false,
errorText: "",
usedPassword: parseState.usedPassword,
backend: parseState.backend
});
return;
}
finish({
ok: false,
missingCommand: false,
missingRuntime: isJvmRuntimeMissingError(message),
aborted: false,
timedOut: false,
errorText: message,
usedPassword: parseState.usedPassword,
backend: parseState.backend
});
});
});
}
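`flushLines` in the function above has to cope with stdout chunks that end mid-line: it splits on newlines and carries the trailing partial line into the next chunk (separately for stdout and stderr). The carry-over logic in isolation — a sketch; `splitLines` is a hypothetical name:

```typescript
// Split buffered data plus a new chunk into complete lines,
// carrying any unterminated tail forward for the next chunk.
function splitLines(buffer: string, chunk: string): { lines: string[]; rest: string } {
  const parts = `${buffer}${chunk}`.split(/\r?\n/);
  // The last element is always the unterminated tail ("" if the chunk ended in \n).
  const rest = parts.pop() || "";
  return { lines: parts, rest };
}

let rest = "";
const out: string[] = [];
// Protocol lines arriving split across arbitrary chunk boundaries:
for (const chunk of ["RD_PROG", "RESS 10%\nRD_PRO", "GRESS 20%\n"]) {
  const step = splitLines(rest, chunk);
  out.push(...step.lines);
  rest = step.rest;
}
console.log(out); // ["RD_PROGRESS 10%", "RD_PROGRESS 20%"]
```

This mirrors why `child.on("close", ...)` above flushes `stdoutBuffer`/`stderrBuffer` one last time: a final `RD_ERROR` line without a trailing newline would otherwise be lost.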
export function buildExternalExtractArgs(
command: string,
archivePath: string,
@@ -772,10 +1143,9 @@ async function runExternalExtract(
signal?: AbortSignal,
hybridMode = false
): Promise<string> {
const command = await resolveExtractorCommand();
const passwords = passwordCandidates;
let lastError = "";
const timeoutMs = await computeExtractTimeoutMs(archivePath);
const backendMode = extractorBackendMode();
let jvmFailureReason = "";
await fs.promises.mkdir(targetDir, { recursive: true });
@@ -785,7 +1155,61 @@
const effectiveTargetDir = subst ? `${subst.drive}:` : targetDir;
try {
return await runExternalExtractInner(command, archivePath, effectiveTargetDir, conflictMode, passwordCandidates, onArchiveProgress, signal, timeoutMs, hybridMode);
if (backendMode !== "legacy") {
const layout = resolveJvmExtractorLayout();
if (!layout) {
jvmFailureReason = NO_JVM_EXTRACTOR_MESSAGE;
if (backendMode === "jvm") {
throw new Error(NO_JVM_EXTRACTOR_MESSAGE);
}
logger.warn(`JVM-Extractor nicht verfügbar, nutze Legacy-Extractor: ${path.basename(archivePath)}`);
} else {
logger.info(`JVM-Extractor aktiv (${layout.rootDir}): ${path.basename(archivePath)}`);
const jvmResult = await runJvmExtractCommand(
layout,
archivePath,
effectiveTargetDir,
conflictMode,
passwordCandidates,
onArchiveProgress,
signal,
timeoutMs
);
if (jvmResult.ok) {
return jvmResult.usedPassword;
}
if (jvmResult.aborted) {
throw new Error("aborted:extract");
}
if (jvmResult.timedOut) {
throw new Error(jvmResult.errorText || `Entpacken Timeout nach ${Math.ceil(timeoutMs / 1000)}s`);
}
jvmFailureReason = jvmResult.errorText || "JVM-Extractor fehlgeschlagen";
if (backendMode === "jvm") {
throw new Error(jvmFailureReason);
}
logger.warn(`JVM-Extractor Fehler, fallback auf Legacy: ${jvmFailureReason}`);
}
}
const command = await resolveExtractorCommand();
const password = await runExternalExtractInner(
command,
archivePath,
effectiveTargetDir,
conflictMode,
passwordCandidates,
onArchiveProgress,
signal,
timeoutMs,
hybridMode
);
if (jvmFailureReason) {
logger.info(`Legacy-Extractor übernahm nach JVM-Fehler: ${path.basename(archivePath)}`);
}
return password;
} finally {
if (subst) removeSubstMapping(subst);
}
@@ -1471,7 +1895,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
logger.info(`Nested-Entpacke: ${nestedName} -> ${options.targetDir}${hybrid ? " (hybrid)" : ""}`);
try {
const ext = path.extname(nestedArchive).toLowerCase();
if (ext === ".zip") {
if (ext === ".zip" && !(await shouldPreferExternalZip(nestedArchive))) {
try {
await extractZipArchive(nestedArchive, options.targetDir, options.conflictMode, options.signal);
nestedPercent = 100;

tests/extractor-jvm.test.ts (new file, 107 lines)
@@ -0,0 +1,107 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { spawnSync } from "node:child_process";
import AdmZip from "adm-zip";
import { afterEach, describe, expect, it } from "vitest";
import { extractPackageArchives } from "../src/main/extractor";
const tempDirs: string[] = [];
const originalBackend = process.env.RD_EXTRACT_BACKEND;
function hasJavaRuntime(): boolean {
const result = spawnSync("java", ["-version"], { stdio: "ignore" });
return result.status === 0;
}
function hasJvmExtractorRuntime(): boolean {
const root = path.join(process.cwd(), "resources", "extractor-jvm");
const classesMain = path.join(root, "classes", "com", "sucukdeluxe", "extractor", "JBindExtractorMain.class");
const requiredLibs = [
path.join(root, "lib", "sevenzipjbinding.jar"),
path.join(root, "lib", "sevenzipjbinding-all-platforms.jar"),
path.join(root, "lib", "zip4j.jar")
];
return fs.existsSync(classesMain) && requiredLibs.every((libPath) => fs.existsSync(libPath));
}
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalBackend;
}
});
describe("extractor jvm backend", () => {
it("extracts zip archives through SevenZipJBinding backend", async () => {
if (!hasJavaRuntime() || !hasJvmExtractorRuntime()) {
return;
}
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
const zipPath = path.join(packageDir, "release.zip");
const zip = new AdmZip();
zip.addFile("episode.txt", Buffer.from("ok"));
zip.writeZip(zipPath);
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.existsSync(path.join(targetDir, "episode.txt"))).toBe(true);
});
it("respects ask/skip conflict mode in jvm backend", async () => {
if (!hasJavaRuntime() || !hasJvmExtractorRuntime()) {
return;
}
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zipPath = path.join(packageDir, "conflict.zip");
const zip = new AdmZip();
zip.addFile("same.txt", Buffer.from("new"));
zip.writeZip(zipPath);
const existingPath = path.join(targetDir, "same.txt");
fs.writeFileSync(existingPath, "old", "utf8");
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "ask",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.readFileSync(existingPath, "utf8")).toBe("old");
});
});


@@ -2,15 +2,25 @@ import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import AdmZip from "adm-zip";
import { afterEach, describe, expect, it } from "vitest";
import { afterEach, beforeEach, describe, expect, it } from "vitest";
import { buildExternalExtractArgs, collectArchiveCleanupTargets, extractPackageArchives } from "../src/main/extractor";
const tempDirs: string[] = [];
const originalExtractBackend = process.env.RD_EXTRACT_BACKEND;
beforeEach(() => {
process.env.RD_EXTRACT_BACKEND = "legacy";
});
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalExtractBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalExtractBackend;
}
});
describe("extractor", () => {