Compare commits


351 Commits
v1.1.7 ... main

Author SHA1 Message Date
Sucukdeluxe
72642351d0 Release v1.6.55 2026-03-05 06:25:20 +01:00
Sucukdeluxe
51a01ea03f Use bulk IInArchive.extract() for ~8x faster extraction, fix archive item resolution
- Replace extractSlow() per-item extraction with IInArchive.extract() bulk API
  in 7-Zip-JBinding. Solid RAR archives no longer re-decode from the beginning
  for each item, bringing extraction speed close to native WinRAR/7z.exe (~375 MB/s
  instead of ~43 MB/s).

- Add BulkExtractCallback implementing both IArchiveExtractCallback and
  ICryptoGetTextPassword for proper password handling during bulk extraction.

- Fix resolveArchiveItemsFromList with multi-level fallback matching:
  1. Pattern match (multipart RAR, split ZIP/7z, generic splits)
  2. Exact filename match (case-insensitive)
  3. Stem-based fuzzy match (handles debrid service filename modifications)
  4. Single-item archive fallback

- Simplify caching from Set+Array workaround back to simple Map<string, T>
  (the original "caching failure" was caused by resolveArchiveItemsFromList
  returning empty arrays, not by Map/Set/Object data structure bugs).

- Add comprehensive tests for archive item resolution (14 test cases)
  and JVM extraction progress callbacks (2 test cases).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 06:24:12 +01:00
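The multi-level fallback matching in the commit above can be sketched as follows. This is an illustrative reconstruction, not the real `resolveArchiveItemsFromList`: the `ArchiveItem` shape, the part-suffix regex, and the helper names are assumptions.

```typescript
// Illustrative sketch of the four-level fallback described above.
interface ArchiveItem { fileName: string; }

// Matches multipart RAR (.partN.rar), old-style RAR (.rNN), split ZIP (.zNN),
// split 7z (.7z.NNN), and generic numeric splits (.NNN). Assumed pattern.
const PART_RE = /\.(part\d+\.rar|r\d{2}|z\d{2}|7z\.\d{3}|\d{3})$/i;

function stem(name: string): string {
  return name.toLowerCase().replace(/\.[^.]+$/, "");
}

function resolveArchiveItems(archiveName: string, items: ArchiveItem[]): ArchiveItem[] {
  // 1. Pattern match: collect all parts sharing the multipart base name.
  const base = archiveName.toLowerCase().replace(PART_RE, "");
  const parts = items.filter(i =>
    PART_RE.test(i.fileName) && i.fileName.toLowerCase().replace(PART_RE, "") === base);
  if (parts.length > 0) return parts;
  // 2. Exact filename match (case-insensitive).
  const exact = items.filter(i => i.fileName.toLowerCase() === archiveName.toLowerCase());
  if (exact.length > 0) return exact;
  // 3. Stem-based fuzzy match (debrid services sometimes rename files).
  const fuzzy = items.filter(i => stem(i.fileName).startsWith(stem(archiveName)));
  if (fuzzy.length > 0) return fuzzy;
  // 4. Single-item archive fallback.
  return items.length === 1 ? items : [];
}
```

Returning an empty array only when every level fails is what keeps the `Map<string, T>` cache mentioned above from being poisoned with empty results.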
Sucukdeluxe
d9a78ea837 Release v1.6.54 2026-03-05 05:59:50 +01:00
Sucukdeluxe
5b221d5bd5 Add persistent JVM daemon for extraction, fix caching with Set+Array
- JVM extractor now supports --daemon mode: starts once, processes
  multiple archives via stdin JSON protocol, eliminating ~5s JVM boot
  per archive
- TypeScript side: daemon manager starts JVM once, sends requests via
  stdin, falls back to spawning new process if daemon is busy
- Fix extraction progress caching: replaced Object.create(null) + in
  operator with Set<string> + linear Array scan — both Map.has() and
  the in operator mysteriously failed to find keys that were just set
- Daemon auto-shutdown on app quit via shutdownDaemon() in before-quit

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:59:13 +01:00
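The daemon dispatch logic described above can be sketched with the child process abstracted away. The names (`DaemonManager`, `ExtractRequest`, `Sender`) are illustrative, and the real protocol details beyond "one JSON object per line on stdin" are assumptions.

```typescript
// Sketch of the TypeScript-side daemon manager described above: send requests
// to a long-lived JVM over stdin, fall back to a fresh process when busy.
interface ExtractRequest { archivePath: string; targetDir: string; }

type Sender = (jsonLine: string) => Promise<string>;

class DaemonManager {
  private busy = false;
  constructor(private daemonSend: Sender, private spawnOneShot: Sender) {}

  async extract(req: ExtractRequest): Promise<string> {
    const line = JSON.stringify(req) + "\n"; // stdin protocol: one JSON object per line
    if (this.busy) {
      // Daemon is processing another archive: spawn a one-shot process instead,
      // accepting the ~5s JVM boot cost for this request only.
      return this.spawnOneShot(line);
    }
    this.busy = true;
    try {
      return await this.daemonSend(line);
    } finally {
      this.busy = false;
    }
  }
}
```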
Sucukdeluxe
c36549ca69 Release v1.6.53 2026-03-05 05:48:41 +01:00
Sucukdeluxe
7e79bef8da Increase JVM extractor heap to 8GB max / 512MB initial
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:48:02 +01:00
Sucukdeluxe
e3b4a4ba19 Release v1.6.52 2026-03-05 05:42:55 +01:00
Sucukdeluxe
30d216c7ca Fix extraction progress caching and JVM tuning
- Replace Map-based archive item cache with plain Object.create(null)
  to work around mysterious Map.has() returning false despite set()
  being called with the same key — this caused resolveArchiveItems
  to run on every 1.1s pulse instead of being cached, preventing
  extraction progress (Entpacken X%) from ever showing in the UI
- Apply same fix to both hybrid and full extraction paths
- Increase JVM heap from 512MB to 1GB for better extraction throughput
- Use SerialGC for faster JVM startup on short-lived extract processes
- Add download lifecycle logging (package add + item download start)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:42:23 +01:00
Sucukdeluxe
d80483adc2 Add download lifecycle logging for better diagnostics
- Log when packages are added (count + names)
- Log when individual item downloads start (filename, size, provider)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:32:44 +01:00
Sucukdeluxe
1cda391dfe Fix extraction speed and UI label updates
- Change OS priority from IDLE/BELOW_NORMAL to NORMAL/BELOW_NORMAL so
  extraction runs at full speed (matching manual 7-Zip/WinRAR performance)
- Use "high" priority in both hybrid and full extraction paths
- Increase hybrid extraction threads from hardcoded 2 to dynamic
  calculation (half CPU count, min 2, max 8)
- Fix emitState forced emit being silently dropped when a non-forced
  timer was already pending — forced emits now always replace pending
  timers to ensure immediate UI feedback during extraction transitions

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:28:42 +01:00
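The forced-emit fix in the commit above can be sketched as a debounced emitter where `force=true` replaces any pending timer instead of being dropped. The class and parameter names are illustrative.

```typescript
// Sketch of the emitState fix described above: forced emits cancel a pending
// non-forced timer and fire immediately, instead of being silently skipped.
class StateEmitter {
  private timer: ReturnType<typeof setTimeout> | null = null;
  constructor(private emit: () => void, private delayMs = 250) {}

  emitState(force = false): void {
    if (force) {
      // The bug: a pending non-forced timer made this call a no-op.
      // The fix: replace the pending timer with an immediate emit.
      if (this.timer) { clearTimeout(this.timer); this.timer = null; }
      this.emit();
      return;
    }
    if (this.timer) return; // coalesce into the already-pending emit
    this.timer = setTimeout(() => { this.timer = null; this.emit(); }, this.delayMs);
  }
}
```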
Sucukdeluxe
375ec36781 Release v1.6.50 2026-03-05 05:09:24 +01:00
Sucukdeluxe
4ad1c05444 Fix extraction UI labels and speed for final extraction pass
- Force immediate emitState when first resolving archive items so UI
  transitions from 'Ausstehend' to 'Entpacken X%' instantly
- Use BELOW_NORMAL priority (instead of IDLE) for final extraction
  when all downloads are complete — matches manual extraction speed
- Add diagnostic logging for resolveArchiveItems matching

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:08:09 +01:00
Sucukdeluxe
c88eeb0b12 Release v1.6.49 2026-03-05 04:45:53 +01:00
Sucukdeluxe
c6261aba6a Log when each item download completes
Add "Download fertig: filename (size), pkg=name" log line when an item
finishes downloading, enabling precise timing analysis of when archive
parts become available for extraction.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:45:17 +01:00
Sucukdeluxe
a010b967b9 Release v1.6.48 2026-03-05 04:42:09 +01:00
Sucukdeluxe
af6547f254 Add MKV collection after hybrid extraction + detailed timing logs
- Run collectMkvFilesToLibrary in background after each hybrid extraction
  round so MKVs are moved to the library as episodes are extracted, not
  only after the entire package finishes
- Add timing logs to identify bottlenecks:
  - Post-process slot wait time
  - Per-round duration with requeue status
  - Recovery loop duration
  - Setup time in handlePackagePostProcessing
  - findReadyArchiveSets duration when > 200ms

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:41:01 +01:00
Sucukdeluxe
ba235b0b93 Release v1.6.47 2026-03-05 04:37:42 +01:00
Sucukdeluxe
1bfde96e46 Self-requeue hybrid extraction to avoid missed archive sets
After a hybrid extraction round completes, set the requeue flag so the
do-while loop immediately checks for more ready archive sets. Previously,
if all items completed before the task started processing, the single
requeue flag was consumed and no new completions triggered re-extraction,
causing 25+ second gaps until the next download completion.

Also change runHybridExtraction return type from void to number
(extracted count) to enable conditional self-requeue only when archives
were actually extracted, preventing infinite requeue loops.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:37:03 +01:00
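The self-requeue loop described above can be sketched as follows; `runHybridExtraction` returning a count is what prevents the infinite-requeue case. All names are illustrative.

```typescript
// Sketch of the do-while self-requeue described above: re-run immediately
// after a productive round, stop once a round extracts nothing.
async function hybridExtractionLoop(
  runHybridExtraction: () => Promise<number>, // returns extracted archive count
  state: { requeueRequested: boolean }
): Promise<number> {
  let total = 0;
  do {
    state.requeueRequested = false; // consume the flag before the round
    const extracted = await runHybridExtraction();
    total += extracted;
    // Self-requeue only when archives were actually extracted; a zero round
    // means nothing new is ready, so exit instead of looping forever.
    if (extracted > 0) state.requeueRequested = true;
  } while (state.requeueRequested);
  return total;
}
```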
Sucukdeluxe
e1f9b4b6d3 Release v1.6.46 2026-03-05 04:24:22 +01:00
Sucukdeluxe
95cf4fbed8 Eliminate 10-15s pause between package extractions
Release post-process slot immediately after main extraction completes.
All slow post-extraction work (nested extraction, auto-rename, archive
cleanup, link/sample removal, empty directory cleanup, MKV collection)
now runs in background via runDeferredPostExtraction so the next package
can start unpacking without delay.

- Export hasAnyFilesRecursive, removeEmptyDirectoryTree, cleanupArchives
  from extractor.ts for use in deferred handler
- Import removeDownloadLinkArtifacts, removeSampleArtifacts from cleanup
- Expand runDeferredPostExtraction with full post-cleanup pipeline:
  nested extraction, rename, archive cleanup, link/sample removal,
  empty dir tree removal, resume state clearing, MKV collection

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:22:20 +01:00
Sucukdeluxe
9ddc7d31bb Make update changelog collapsible in confirm dialog
Long changelogs made the update dialog unscrollable, preventing users
from reaching the install button. Changelog is now in a collapsed
<details> element. Dialog also has max-height with overflow scroll.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:12:47 +01:00
Sucukdeluxe
83626017b9 Fix btn-danger CSS class mismatch in history tab
CSS defines .btn.danger (two classes) but code used "btn btn-danger"
(one hyphenated class). History danger buttons now get correct red styling.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:07:09 +01:00
Sucukdeluxe
b9372f0ef0 Add ddownload to VALID_PRIMARY_PROVIDERS and VALID_FALLBACK_PROVIDERS
DDownload was missing from provider validation sets, preventing users
from configuring it as primary or fallback provider in settings.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:04:31 +01:00
Sucukdeluxe
db97a7df14 Fix setPackagePriority type safety and add missing .catch() to IPC calls
- Use PackagePriority type instead of string/any in preload and app-controller
- Add .catch() to start(), extractNow(), setPackagePriority(), updateSettings(columnOrder), openLog(), openSessionLog()

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:59:10 +01:00
Sucukdeluxe
575fca3806 Release v1.6.45 2026-03-05 03:54:54 +01:00
Sucukdeluxe
a1c8f42435 Comprehensive bugfix release v1.6.45
Fix ~70 issues across the entire codebase, including security fixes,
error-handling improvements, test stabilization, and code-quality cleanups.

- Fix TLS race condition with reference-counted acquire/release
- Bind debug server to 127.0.0.1 instead of 0.0.0.0
- Add overall timeout to MegaWebFallback
- Stream update installer to disk instead of RAM buffering
- Add path traversal protection in JVM extractor
- Cache DdownloadClient with credential-based invalidation
- Add .catch() to all fire-and-forget IPC calls
- Wrap app startup, clipboard, session-log in try/catch
- Add timeouts to container.ts fetch calls
- Fix variable shadowing, tsconfig path, line endings
- Stabilize tests with proper cleanup and timing tolerance
- Fix installer privileges, scripts, and afterPack null checks
- Delete obsolete _upload_release.mjs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:53:28 +01:00
Sucukdeluxe
a3c2680fec Show transitional label between archive extractions
After an archive finishes at 100%, show "Naechstes Archiv..." label
while the next archive initializes, eliminating the "dead" gap where
no activity was visible between consecutive extractions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:13:08 +01:00
Sucukdeluxe
12dade0240 Show compact changelog in update dialog, strip sub-items and long descriptions
Only top-level list items are shown in the updater changelog.
Indented sub-items, headings, and long descriptions are removed
for a clean, compact display. Detailed notes remain on the
Gitea release page.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:02:41 +01:00
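The compact-changelog filter described above can be sketched as a line filter. The length cutoff for "long descriptions" is an assumption; the function name is hypothetical.

```typescript
// Sketch of the updater changelog compaction described above: keep only
// top-level list items, drop indented sub-items, headings, and free text.
function compactChangelog(markdown: string, maxLineLen = 100): string {
  return markdown
    .split("\n")
    .filter(line => /^[-*] /.test(line))       // top-level bullets only (no indentation)
    .map(line => line.replace(/^\* /, "- "))   // normalize bullet style
    .filter(line => line.length <= maxLineLen) // drop overly long descriptions (assumed cutoff)
    .join("\n");
}
```

Headings and indented sub-items never match `/^[-*] /`, so they fall out in the first filter; detailed notes stay on the Gitea release page.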
Sucukdeluxe
2a528a126c Add detailed preparation labels and dash separator for post-process status
- Show "Entpacken vorbereiten..." while scanning archives and checking disk space
- Show "Archive scannen..." and "Speicherplatz prüfen..." phases from extractor
- Use dash separator in UI: "[10/10 - Done] - Entpacken 45% (3/6)"
- Handle new "preparing" phase in both hybrid and full extraction progress handlers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:57:23 +01:00
Sucukdeluxe
8839080069 Show live extraction progress in package postProcessLabel
Update postProcessLabel during extraction with detailed progress:
- Overall percentage and archive count (e.g. "Entpacken 45% (3/6)")
- Password cracking progress when testing passwords
- Works for both hybrid and full extraction modes

Previously the label was static "Entpacken..." with no detail about
what was happening during potentially long extraction phases.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:46:10 +01:00
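The label format mentioned above ("Entpacken 45% (3/6)") can be sketched as a tiny formatter; the function name and rounding behaviour are assumptions.

```typescript
// Illustrative sketch of the extraction progress label described above:
// overall percentage plus finished/total archive count.
function extractionLabel(percent: number, archivesDone: number, archivesTotal: number): string {
  return `Entpacken ${Math.round(percent)}% (${archivesDone}/${archivesTotal})`;
}
```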
Sucukdeluxe
8f66d75eb3 Show DDownload provider label instead of generic Debrid in status
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:42:19 +01:00
Sucukdeluxe
56ee681aec Strip markdown formatting from changelog in update dialog
The confirm dialog is plain text and cannot render markdown. Strip
headings, bold, italic, code backticks, and normalize list bullets.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:37:18 +01:00
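The markdown stripping described above can be sketched with a few regex passes. This is a simplified sketch, not the real implementation; edge cases like an italic marker at the start of a line are ignored.

```typescript
// Sketch of the plain-text changelog stripper described above: remove
// headings, bold/italic markers, and code backticks; normalize list bullets.
function stripMarkdown(text: string): string {
  return text
    .split("\n")
    .map(line => line
      .replace(/^#{1,6}\s*/, "")         // headings
      .replace(/\*\*([^*]+)\*\*/g, "$1") // bold
      .replace(/\*([^*]+)\*/g, "$1")     // italic
      .replace(/`([^`]+)`/g, "$1")       // inline code
      .replace(/^\s*[*+]\s+/, "- "))     // normalize list bullets to "- "
    .join("\n");
}
```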
Sucukdeluxe
6db03f05a9 Fix DDownload downloads failing due to SSL certificate verification
DDownload's storage servers (dstorage.org) use certificates that fail
Node.js TLS verification. Add skipTlsVerify flag to UnrestrictedLink
and temporarily disable TLS verification for the download fetch when
the flag is set.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:32:59 +01:00
Sucukdeluxe
068da94e2a Auto-route DDownload URLs to DDownload provider before debrid chain
DDownload is a direct file hoster, not a debrid service. DDownload URLs
are now automatically handled by the DDownload provider when configured,
before trying any debrid providers. Remove DDownload from the
primary/secondary/tertiary provider dropdowns since it only handles its
own URLs and doesn't belong in the debrid fallback chain.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:25:34 +01:00
Sucukdeluxe
4b824b2d9f Fix crash when DDownload settings are missing from persisted config
Guard against undefined ddownloadLogin/ddownloadPassword in renderer
when upgrading from a version without DDownload support.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:17:41 +01:00
Sucukdeluxe
284c5e7aa6 Release v1.6.35
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:07:52 +01:00
Sucukdeluxe
036cd3e066 Add DDownload provider, post-processing status labels, and update changelog
- DDownload (ddownload.com/ddl.to) as new hoster with web login
- Post-processing labels: Entpacken/Renaming/Aufräumen/MKVs
- Release notes shown in update confirmation dialog

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:05:16 +01:00
Sucukdeluxe
479c7a3f3f Fix hybrid extraction showing "Ausstehend" instead of "Warten auf Parts"
When hybrid extraction finds no ready archive sets (because remaining parts
are still downloading), completed items were incorrectly labeled as
"Entpacken - Ausstehend" instead of "Entpacken - Warten auf Parts".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:38:51 +01:00
Sucukdeluxe
0404d870ad Release v1.6.33
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:24:26 +01:00
Sucukdeluxe
93a53763e0 Release v1.6.32
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:21:40 +01:00
Sucukdeluxe
c20d743286 Fix updater GetUserByName error, mask backup credentials, clean up old scripts
- Migrate deprecated updateRepo value (Sucukdeluxe/) to new default (Administrator/)
- Mask sensitive fields (tokens, passwords) in backup export with ***
- Preserve current credentials when importing backup with masked values
- Remove 22 obsolete release_v*.mjs scripts, release_codeberg.mjs, set_version_node.mjs
- Remove release:codeberg script from package.json

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:20:57 +01:00
Sucukdeluxe
ba938f64c5 Release v1.6.31 2026-03-05 00:45:46 +01:00
Sucukdeluxe
af00d69e5c Switch release/update docs and tooling to Gitea 2026-03-05 00:42:59 +01:00
Sucukdeluxe
bc47da504c Rename app description to Desktop downloader 2026-03-04 23:56:24 +01:00
Sucukdeluxe
5a24c891c0 Migrate project to GitHub and log 2026-03-04 23:55:42 +01:00
Sucukdeluxe
1103df98c1 Analyze program for bugs 2026-03-04 23:03:16 +01:00
Sucukdeluxe
74920e2e2f Round 8 bug fixes (20 fixes)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 22:14:03 +01:00
Sucukdeluxe
75775f2798 Round 7 bug fixes (13 fixes)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:59:42 +01:00
Sucukdeluxe
fad0f1060b Release v1.6.30
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:36:35 +01:00
Sucukdeluxe
0ca359e509 Round 6 bug fixes (pre-release)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:23:34 +01:00
Sucukdeluxe
1d0ee31001 Release v1.6.29
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:11:32 +01:00
Sucukdeluxe
20a0a59670 Release v1.6.28
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:54:49 +01:00
Sucukdeluxe
9a00304a93 Release v1.6.27
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:34:50 +01:00
Sucukdeluxe
55b00bf884 Release v1.6.26
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:18:47 +01:00
Sucukdeluxe
e85f12977f Fix extractedArchives.push -> .add (Set, not Array)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:03:40 +01:00
Sucukdeluxe
940346e2f4 Release v1.6.25
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:01:01 +01:00
Sucukdeluxe
1854e6bb17 Release v1.6.24
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 19:34:13 +01:00
Sucukdeluxe
26b2ef0abb Release v1.6.23
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 19:26:01 +01:00
Sucukdeluxe
9cceaacd14 Release v1.6.22
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 19:04:53 +01:00
Sucukdeluxe
1ed13f7f88 Release v1.6.21
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 18:57:18 +01:00
Sucukdeluxe
729aa30253 Release v1.6.20
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 18:09:13 +01:00
Sucukdeluxe
b8bbc9c32f Release v1.6.19
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 17:45:30 +01:00
Sucukdeluxe
a263e3eb2c Release v1.6.18
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 17:31:20 +01:00
Sucukdeluxe
10bae4f98b Release v1.6.17
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 17:04:32 +01:00
Sucukdeluxe
b02aef2af9 Release v1.6.16
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 16:50:59 +01:00
Sucukdeluxe
56c0b633c8 Release v1.6.15
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:47:48 +01:00
Sucukdeluxe
4e8e8eba66 Release v1.6.14
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:30:49 +01:00
Sucukdeluxe
d5638b922d Release v1.6.13
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:18:15 +01:00
Sucukdeluxe
dc695c9a04 Release v1.6.12
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:07:08 +01:00
Sucukdeluxe
52909258ca Release v1.6.11
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:03:29 +01:00
Sucukdeluxe
e9b9801ac1 Release v1.6.10
Fix the post-process slot counter going negative after stop(), which allowed
multiple packages to extract simultaneously instead of one at a time.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:46:44 +01:00
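The slot-counter fix in the commit above can be sketched as a counting semaphore whose release is clamped at zero. The class name and API are illustrative.

```typescript
// Sketch of the post-process slot fix described above: a release that can
// never drive the counter negative, so the concurrency limit always holds.
class PostProcessSlots {
  private active = 0;
  constructor(private maxConcurrent = 1) {}

  tryAcquire(): boolean {
    if (this.active >= this.maxConcurrent) return false;
    this.active++;
    return true;
  }

  release(): void {
    // Without the clamp, an extra release after stop() pushed the counter
    // below zero and let several packages extract at once.
    this.active = Math.max(0, this.active - 1);
  }

  get inUse(): number { return this.active; }
}
```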
Sucukdeluxe
86a358d568 Release v1.6.9
Fix extraction resume state / progress sync, abort labels, and hybrid pkg.status

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:43:31 +01:00
Sucukdeluxe
97c5bfaa7d Release v1.6.8
- Fix "Fertig" status on completed items: session recovery no longer resets
  "Entpacken - Ausstehend" to "Fertig (size)" — respects autoExtract setting
- Extraction continues during pause instead of being aborted
- Hybrid extraction recovery on start/resume: triggerPendingExtractions and
  recoverPostProcessingOnStartup now handle partial packages with hybridExtract
- Move Up/Down buttons: optimistic UI update so packages move instantly

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:23:29 +01:00
Sucukdeluxe
8d0c110415 Release v1.6.7
Add proactive disk-busy detection: lower STREAM_HIGH_WATER_MARK from 2 MB
to 512 KB so backpressure triggers sooner, and monitor stream.writableLength
to show "Warte auf Festplatte" after 300 ms of undrained writes — before
actual backpressure hits.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:13:14 +01:00
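The proactive disk-busy heuristic described above can be reduced to a pure decision function; the thresholds mirror the commit (512 KB high-water mark, 300 ms of undrained writes), while the function name and three-state result are assumptions.

```typescript
// Sketch of the disk-busy detection described above, over a stream's
// writableLength and the time since it last drained.
const STREAM_HIGH_WATER_MARK = 512 * 1024; // bytes; lowered from 2 MB per the commit
const UNDRAINED_LIMIT_MS = 300;

function diskState(writableLength: number, undrainedMs: number): "ok" | "busy" | "backpressure" {
  if (writableLength >= STREAM_HIGH_WATER_MARK) return "backpressure"; // writes must pause
  // Show "Warte auf Festplatte" before real backpressure: bytes are buffered
  // and the stream has not drained for 300 ms.
  if (writableLength > 0 && undrainedMs >= UNDRAINED_LIMIT_MS) return "busy";
  return "ok";
}
```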
Sucukdeluxe
a131f4a11b Release v1.6.6
Fix hybrid extraction stalling: requeue loop now keeps the post-process
slot so the same package re-runs immediately without waiting behind other
packages. Also skip already-extracted archives on requeue rounds.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:04:28 +01:00
Sucukdeluxe
335873a7f6 Release v1.6.5
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 13:49:20 +01:00
Sucukdeluxe
7446e07a8c Release v1.6.4
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 09:54:37 +01:00
Sucukdeluxe
693f7b482a Release v1.6.3
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:54:41 +01:00
Sucukdeluxe
1d4a13466f Release v1.6.2
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:41:40 +01:00
Sucukdeluxe
17844d4c28 Release v1.6.1
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:27:10 +01:00
Sucukdeluxe
55d0e3141c Release v1.6.0
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:13:07 +01:00
Sucukdeluxe
a967eb1080 Release v1.5.99
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:43:46 +01:00
Sucukdeluxe
21ff749cf3 Release v1.5.98
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:31:45 +01:00
Sucukdeluxe
18862bb8e0 Release v1.5.97
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:24:45 +01:00
Sucukdeluxe
27833615b7 Release v1.5.96
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:17:22 +01:00
Sucukdeluxe
00fae5cadd Release v1.5.95
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:08:29 +01:00
Sucukdeluxe
4fcbd5c4f7 Release v1.5.94
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:51:10 +01:00
Sucukdeluxe
bb8fd0646a Release v1.5.93
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:35:11 +01:00
Sucukdeluxe
1218adf5f2 Release v1.5.92
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:21:48 +01:00
Sucukdeluxe
818bf40a9c Release v1.5.91
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:15:54 +01:00
Sucukdeluxe
254612a49b Release v1.5.90
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:04:49 +01:00
Sucukdeluxe
92101e249a Release v1.5.89
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 02:57:33 +01:00
Sucukdeluxe
a18ab484cc Release v1.5.88
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 02:50:16 +01:00
Sucukdeluxe
7af9d67770 Release v1.5.87
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 02:38:05 +01:00
Sucukdeluxe
d63afcce89 Release v1.5.86
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:58:30 +01:00
Sucukdeluxe
15d0969cd9 Release v1.5.85
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:31:13 +01:00
Sucukdeluxe
5574b50d20 Add visual online/offline status dot indicator for Rapidgator links
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:23:20 +01:00
Sucukdeluxe
662c903bf3 Add Rapidgator link online/offline check when links are added
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:19:21 +01:00
Sucukdeluxe
545043e1d6 Add multi-select, Ctrl+A, right-click context menu with link viewer to history tab
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 00:06:58 +01:00
Sucukdeluxe
8f10ff8f96 Show TB for sizes >= 1 TiB in humanSize formatter
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 00:00:43 +01:00
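A `humanSize` formatter with the TB tier described above could look like this. The binary (TiB-based) thresholds and decimal precision are assumptions about the real formatter.

```typescript
// Sketch of a humanSize formatter with the >= 1 TiB tier described above.
function humanSize(bytes: number): string {
  const KIB = 1024, MIB = KIB * 1024, GIB = MIB * 1024, TIB = GIB * 1024;
  if (bytes >= TIB) return (bytes / TIB).toFixed(2) + " TB";
  if (bytes >= GIB) return (bytes / GIB).toFixed(2) + " GB";
  if (bytes >= MIB) return (bytes / MIB).toFixed(1) + " MB";
  if (bytes >= KIB) return (bytes / KIB).toFixed(0) + " KB";
  return bytes + " B";
}
```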
Sucukdeluxe
62f3bd94de Remove speed limit toolbar button, fix hoster stats grouping, add Ctrl+A select all, fix context menu clipping
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 23:56:35 +01:00
Sucukdeluxe
253b1868ec Release v1.5.79
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 23:46:06 +01:00
Sucukdeluxe
c4aefb6175 Release v1.5.78
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 23:17:37 +01:00
Sucukdeluxe
956cad0da4 Release v1.5.77
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:58:34 +01:00
Sucukdeluxe
83d8df84bf Release v1.5.76
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:50:51 +01:00
Sucukdeluxe
0c058fa162 Release v1.5.75
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:28:40 +01:00
Sucukdeluxe
21dbf46f81 Release v1.5.74
- Fix hybrid extract not using maxParallelExtract setting (was hardcoded to 1)
- Fix "Warten auf Parts" label shown for items whose downloads are already complete
- Update hybrid extract progress handler to support parallel archive tracking

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:13:58 +01:00
Sucukdeluxe
af6eea8253 Release v1.5.73
- Show full passwords (unmasked) in extraction logs for easier debugging

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:55:36 +01:00
Sucukdeluxe
7029271999 Release v1.5.72
- WRONG_PASSWORD JVM error now falls back to legacy UnRAR extractor
- Added masked password logging for JVM and legacy extractors
- Per-attempt password logging shows which passwords are tried and in what order

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:50:35 +01:00
Sucukdeluxe
5dabee332e Parallel archive extraction within packages
maxParallelExtract now controls how many archives extract simultaneously
within a single package (e.g. 4 episodes at once). Packages still
extract sequentially (one package at a time) to focus I/O. Progress
handler updated to track multiple active archives independently.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:43:34 +01:00
Sucukdeluxe
d9fe98231f Extract packages sequentially instead of in parallel
Previously maxParallelExtract allowed multiple packages to extract
simultaneously, splitting I/O across packages. Now packages extract
one at a time in packageOrder so each package finishes faster.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:35:12 +01:00
Sucukdeluxe
6ee98328fb Fix JVM extractor not falling back to legacy UnRAR on codec errors
When SevenZipJBinding reports "Archive file can't be opened with any
of the registered codecs", the extractor now falls back to legacy
UnRAR instead of failing immediately. Previously, backend mode "jvm"
(the production default) only allowed fallback for UNSUPPORTEDMETHOD.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:28:47 +01:00
Sucukdeluxe
3dbb94d298 Release v1.5.68: Extractor optimizations inspired by JDownloader
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:25:02 +01:00
Sucukdeluxe
1956be0c71 Release v1.5.67
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:54:40 +01:00
Sucukdeluxe
e9414853f9 Sync package-lock.json version
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:54:14 +01:00
Sucukdeluxe
d67cce501b Split Hoster column into Hoster + Account columns
Hoster column now shows only the file hoster (e.g. rapidgator.net),
new Account column shows the debrid service used (e.g. Mega-Debrid).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:54:03 +01:00
Sucukdeluxe
d87b74d359 Release v1.5.66
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:43:08 +01:00
Sucukdeluxe
be7a8fd103 Add progress sorting, extraction priority by packageOrder, auto-expand extracting packages
- Fortschritt column is now clickable/sortable (ascending/descending by package %)
- Extraction queue respects packageOrder: top packages get extracted first
- Packages currently extracting are auto-expanded so user can see progress
- Increased Fortschritt column width for better spacing

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:42:28 +01:00
Sucukdeluxe
d23740eac7 Release v1.5.65
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:31:58 +01:00
Sucukdeluxe
56a507b45d Add configurable parallel extraction count (1-8, default 2)
- New setting maxParallelExtract in AppSettings
- UI input in Entpacken tab: "Parallele Entpackungen"
- Replaces hardcoded maxConcurrent=2 in acquirePostProcessSlot

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:31:24 +01:00
Sucukdeluxe
31ce1e6618 Release v1.5.64
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:25:09 +01:00
Sucukdeluxe
b637fb3db8 Fix test suite: await async calls, update status expectations, add timeouts
- cleanup.test.ts: add async/await for removeDownloadLinkArtifacts and removeSampleArtifacts
- download-manager.test.ts: await manager.start() to prevent race conditions
- download-manager.test.ts: update "Entpackt" -> "Entpackt - Done" expectations
- download-manager.test.ts: await async getStartConflicts()
- download-manager.test.ts: add 35s timeout for extraction tests

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:23:57 +01:00
Sucukdeluxe
1d876f8ded Fix parallel JVM extraction: isolate temp dirs to prevent native DLL lock conflicts
Each JVM extractor process now gets its own temp directory via
-Djava.io.tmpdir so parallel SevenZipJBinding instances don't fight
over the same lib7-Zip-JBinding.dll file lock on Windows.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:23:57 +01:00
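The temp-dir isolation described above can be sketched as argument construction for the spawned JVM. The helper name and argument layout are illustrative; only the `-Djava.io.tmpdir` mechanism comes from the commit.

```typescript
// Sketch of the per-process temp-dir isolation described above: each JVM
// extractor gets a unique tmpdir so parallel SevenZipJBinding instances do
// not contend for the same extracted native DLL on Windows.
import { tmpdir } from "node:os";
import { join } from "node:path";

let spawnCounter = 0;

function buildJvmArgs(jarPath: string): string[] {
  const privateTmp = join(tmpdir(), `extractor-${process.pid}-${spawnCounter++}`);
  return [`-Djava.io.tmpdir=${privateTmp}`, "-jar", jarPath];
}
```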
Sucukdeluxe
2f43164732 Fix hybrid-extract item matching: use fileName for robust part detection
The previous targetPath-only matching missed items whose targetPath
differed from the on-disk filename. Matching now uses both the basename
and fileName for reliable archive-part-to-item association.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:22:00 +01:00
Sucukdeluxe
9747cabb14 Release v1.5.63
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:16:42 +01:00
Sucukdeluxe
9b460758f9 Add parallel extraction (2 concurrent) and better status labels
- Replace serial packagePostProcessQueue with semaphore (max 2 concurrent)
- Hybrid-extract: items waiting for parts show "Entpacken - Warten auf Parts"
- Failed hybrid extraction shows "Entpacken - Error" instead of "Fertig"

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:15:57 +01:00
Sucukdeluxe
8cc1f788ad Release v1.5.62
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:03:50 +01:00
Sucukdeluxe
0888e16aec Fix startPackages: scheduler now respects runPackageIds filter
findNextQueuedItem(), hasQueuedItems(), hasDelayedQueuedItems() and
countQueuedItems() now skip packages not in runPackageIds when the set
is non-empty. This ensures "Ausgewählte Downloads starten" only
processes selected packages instead of all enabled ones.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:03:05 +01:00
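The scheduler filter described above can be sketched as a single eligibility predicate; the `Pkg` shape and function name are illustrative.

```typescript
// Sketch of the runPackageIds filter described above: when the set is
// non-empty, only packages in it are eligible for scheduling.
interface Pkg { id: string; enabled: boolean; }

function isPackageEligible(pkg: Pkg, runPackageIds: Set<string>): boolean {
  if (!pkg.enabled) return false;
  // An empty set means "no filter": start all enabled packages.
  return runPackageIds.size === 0 || runPackageIds.has(pkg.id);
}
```

Applying this predicate inside findNextQueuedItem(), hasQueuedItems(), and the other queue helpers is what restricts "Ausgewählte Downloads starten" to the selected packages.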
Sucukdeluxe
cab550a13c Release v1.5.61 2026-03-03 17:54:15 +01:00
Sucukdeluxe
2ef3983049 Revert to v1.5.49 base + fix "Ausgewählte Downloads starten"
- Restore all source files from v1.5.49 (proven stable on both servers)
- Add startPackages() IPC method that starts only specified packages
- Fix context menu "Ausgewählte Downloads starten" to use startPackages()
  instead of start() which was starting ALL enabled packages

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 17:53:39 +01:00
Sucukdeluxe
ca4392fa8b Release v1.5.60 2026-03-03 17:33:33 +01:00
Sucukdeluxe
8fad61e329 Restore multipart RAR open strategy for JVM extractor
Bring back the callback-first RAR multipart opening strategy with an explicit RAR5/RAR fallback, while keeping 7z.001 splits on VolumedArchiveInStream, to improve 7z-JBinding compatibility on split RAR sets.
2026-03-03 17:32:22 +01:00
Sucukdeluxe
02bd4a61fb Release v1.5.59 2026-03-03 16:56:06 +01:00
Sucukdeluxe
30ac5bf9db Harden hybrid extract readiness for partial archives
Require near-complete file-size checks in item recovery and hybrid ready-set detection so partially downloaded RAR parts are not marked complete and extracted prematurely.
2026-03-03 16:52:16 +01:00
Sucukdeluxe
87e0a986e6 Release v1.5.58 2026-03-03 16:37:14 +01:00
Sucukdeluxe
353cef7dbd Add JVM hybrid-extract retry and clean up Java extractor
- Add automatic retry with 3s delay when JVM extractor fails with
  "codecs" or "can't be opened" error during hybrid-extract mode
  (handles transient Windows file locks after download completion)
- Log archive file size before JVM extraction in hybrid mode
- Remove unused ArchiveFormat import, RAR_MULTIPART_RE/RAR_OLDSPLIT_RE
  patterns, and hasOldStyleRarSplits() method from Java extractor
- Keep simple openSevenZipArchive with currentVolumeName tracking

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:35:43 +01:00
Sucukdeluxe
f9b0bbe676 v1.5.57
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:15:49 +01:00
Sucukdeluxe
93d54c8f84 Fix multi-part RAR: get first stream via callback, track current volume name
Two bugs in SevenZipVolumeCallback caused multi-part RAR extraction to fail:

1. getProperty(NAME) always returned firstFileName instead of tracking the
   last opened volume name. 7z-JBinding needs this to compute subsequent
   volume filenames.

2. The first IInStream was created separately instead of through the
   callback's getStream() method, so the volume name tracker was not
   properly initialized.

Verified with real multi-part RAR5 test archives (3 parts, WinRAR 7.01).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:15:44 +01:00
Sucukdeluxe
26bf675a41 v1.5.56
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:03:06 +01:00
Sucukdeluxe
35e84e652e Fix multi-part RAR: use explicit ArchiveFormat instead of VolumedArchiveInStream
VolumedArchiveInStream only works for .7z.001 splits - it rejects RAR
filenames. For multi-part RAR (.partN.rar), use RandomAccessFileInStream
with explicit ArchiveFormat.RAR5/RAR format specification. Auto-detection
with null format can fail for multi-volume RAR archives.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:03:01 +01:00
Sucukdeluxe
462fc0397e v1.5.55
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:54:17 +01:00
Sucukdeluxe
d4bf574370 Fix multi-part RAR extraction: use VolumedArchiveInStream for .partN.rar
The JVM extractor used RandomAccessFileInStream for multi-part RAR archives,
which only provides a single file stream. 7z-JBinding requires
VolumedArchiveInStream to access additional volume parts via callback.

Added RAR_MULTIPART_RE and RAR_OLDSPLIT_RE patterns to detect multi-volume
RAR archives and route them through VolumedArchiveInStream, fixing
"Archive file can't be opened with any of the registered codecs" errors.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:54:09 +01:00
Sucukdeluxe
d3ec000da5 v1.5.54
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:59:08 +01:00
Sucukdeluxe
d7d256f716 Fix hybrid-extract: check per-archive prefix instead of whole package
The previous fix blocked ALL multi-part extractions when any item in the
package was pending. Now checks only parts of the SAME archive (by prefix
match on fileName/targetPath), so E01 can extract while E06 downloads.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:59:03 +01:00
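The per-archive prefix check above can be sketched like this (illustrative only; the real matching also considers targetPath and other multipart patterns). Parts of the same archive share a common stem once the `.partN.rar` suffix is stripped, so only those parts can block extraction:

```typescript
const PART_SUFFIX_RE = /\.part\d+\.rar$/i;

// Stem shared by all volumes of one multipart archive.
function archivePrefix(fileName: string): string {
  return fileName.replace(PART_SUFFIX_RE, "").toLowerCase();
}

// Extraction of one archive is allowed as soon as ITS parts are done,
// even while other archives in the same package are still downloading.
function canExtract(
  firstPart: string,
  packageFiles: { fileName: string; done: boolean }[],
): boolean {
  const prefix = archivePrefix(firstPart);
  return packageFiles
    .filter((f) => archivePrefix(f.fileName) === prefix)
    .every((f) => f.done);
}
```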
Sucukdeluxe
804fbe2bdc v1.5.53
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:52:39 +01:00
Sucukdeluxe
1fde0a9951 Fix hybrid-extract multi-part archive + extractor CRC handling
- findReadyArchiveSets: for .part1.rar, require ALL package items
  to be terminal before allowing extraction (prevents premature
  extraction when later parts have no targetPath/fileName yet)
- JVM extractor: remove CRCERROR from isPasswordFailure(): only
  DATAERROR indicates a wrong password. A CRCERROR on archives that
  7z-JBinding falsely reports as encrypted no longer triggers password
  cycling.
- looksLikeWrongPassword: remove CRC text matching, keep only
  explicit "data error" for encrypted archives.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:52:00 +01:00
Sucukdeluxe
0cf5ebe5e9 Release v1.5.52 2026-03-03 14:37:59 +01:00
Sucukdeluxe
0b7c658c8f Add Account Manager + fix Hybrid-Extract premature extraction
- Account Manager: table UI with add/remove/check for all 4 providers
  (Real-Debrid, Mega-Debrid, BestDebrid, AllDebrid)
- Backend: checkRealDebridAccount, checkAllDebridAccount, checkBestDebridAccount
- Hybrid-Extract fix: check item.fileName for queued items without targetPath,
  disable disk-fallback for multi-part archives, extend disk-fallback to catch
  active downloads by fileName match (prevents CRC errors on incomplete files)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:36:13 +01:00
Sucukdeluxe
e6ec1ed755 Add Mega-Debrid account info check (web scraping)
Scrapes the Mega-Debrid profile page to display username, premium status,
remaining days, and loyalty points. New "Account prüfen" button in Settings > Accounts.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:06:19 +01:00
Sucukdeluxe
ac479bb023 Add backup encryption (AES-256-GCM) and directory existence check
- Encrypt sensitive credentials (tokens, passwords) in backup exports
  using AES-256-GCM with PBKDF2 key derivation from OS username
- Backup format v2 with backwards-compatible v1 import
- Show dialog to create non-existent directories when changing
  outputDir, extractDir, or mkvLibraryDir settings

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 13:47:56 +01:00
Sucukdeluxe
9ac557b0a8 Fix app icon: use rcedit afterPack hook to embed custom icon in EXE
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:17:47 +01:00
Sucukdeluxe
4585db0281 Remove CHANGELOG.md from repo, link to Codeberg Releases instead
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:07:25 +01:00
Sucukdeluxe
0d86356f96 Remove internal files from repo: .github/workflows, docs/plans, verify_remote.mjs
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:06:43 +01:00
Sucukdeluxe
140ee488b7 Update README with new features: JVM extraction, auto-rename, progress bars, history, nested extraction
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:03:31 +01:00
Sucukdeluxe
486379183b Remove .claude folder from repo and add to .gitignore
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:02:11 +01:00
Sucukdeluxe
19a588a997 Show "Jetzt entpacken" context menu when any item is completed, re-enable paused packages
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:52:19 +01:00
Sucukdeluxe
fa30e738d9 Fix UNSUPPORTEDMETHOD: init SevenZipJBinding native libs, pass password to extractSlow
Root cause: SevenZip.initSevenZipFromPlatformJAR() was never called, so
native compression codecs (RAR5, LZMA2, etc.) were not loaded. Archives
could be opened (header parsing is pure Java) but all extractSlow() calls
returned UNSUPPORTEDMETHOD because no native decoder was available.

- Add ensureSevenZipInitialized() with lazy init before extraction
- Pass password to extractSlow(outStream, password) for RAR5 compatibility
- Add UNSUPPORTEDMETHOD -> legacy fallback in extractor.ts as safety net

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:35:19 +01:00
Sucukdeluxe
eefb536cb3 Fix path traversal false positive: skip subst drive mapping for JVM backend
Java's getCanonicalFile() resolves subst drives inconsistently,
causing secureResolve() to falsely block valid filenames. JVM handles
long paths natively so subst is only needed for legacy UnRAR/7z.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:25:10 +01:00
Sucukdeluxe
02b136dac7 Fix JVM extractor: asarUnpack for class/jar files, add unpacked path candidate, default to jvm mode
The JVM sidecar class files were packed inside app.asar where Java
cannot access them. asarUnpack extracts them to app.asar.unpacked/.
Default backend changed from auto to jvm (no legacy fallback).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:19:26 +01:00
Sucukdeluxe
b712282f62 Log which extraction backend was used (7zjbinding/zip4j/legacy)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:12:48 +01:00
Sucukdeluxe
de369f3bcd Replace extraction backend with SevenZipJBinding + Zip4j JVM sidecar
- New JVM sidecar (resources/extractor-jvm/) using SevenZipJBinding for
  RAR/7z/TAR and Zip4j for ZIP multipart, matching JDownloader 2 stack
- Auto/JVM/Legacy backend modes via RD_EXTRACT_BACKEND env variable
- Fallback to legacy UnRAR/7z when JVM runtime unavailable
- Fix isJvmRuntimeMissingError false positives on valid extraction errors
- Cache JVM layout resolution to avoid repeated filesystem checks
- Route nested ZIP extraction through JVM backend consistently

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:08:42 +01:00
Sucukdeluxe
3ee3af03cf Match JDownloader 2 extraction behavior: normal I/O, -mt2 hybrid
- Remove setWindowsBackgroundIO entirely (JD2 uses normal I/O priority)
- Keep only CPU priority IDLE (os.setPriority)
- Hybrid threads fixed at -mt2 (matches JD2's ~16 MB/s controlled throughput)
- Final extraction uses full thread count (unchanged)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 01:14:47 +01:00
Sucukdeluxe
9a646d516b Fix extraction speed: I/O priority only in hybrid mode, more threads
- setWindowsBackgroundIO (Very Low I/O) now only applied in hybrid mode,
  not for all extractions (was causing massive slowdown)
- Hybrid threads changed from -mt1 to half CPU count (e.g. -mt4 on 8-core)
- Move retry count (R9, R22 etc.) from status text to tooltip only

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 01:08:40 +01:00
Sucukdeluxe
311fb00430 Release v1.5.40
2026-03-03 01:01:46 +01:00
Sucukdeluxe
4008e20278 Reset stale status texts on session load and stop
- normalizeSessionStatuses: reset all queued items to "Wartet" instead of
  only checking a few specific patterns (missed Retry, Unrestrict-Fehler etc.)
- Reset completed items with stale extraction status to "Fertig (size)"
- stop(): reset all non-finished items to queued/"Wartet" and packages to queued

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 00:58:28 +01:00
Sucukdeluxe
375b9885ee Disk space pre-check, nested extraction, lower I/O priority for hybrid extraction
- Add disk space check before extraction (aborts if insufficient space)
- Add single-level nested archive extraction (archives inside archives)
- Blacklist .iso/.img/.bin/.dmg from nested extraction
- Set real Windows I/O priority (Very Low) on UnRAR via NtSetInformationProcess
- Reduce UnRAR threads to -mt1 during hybrid extraction
- Fix double episode renaming (s01e01e02 pattern)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 00:54:10 +01:00
Sucukdeluxe
ce01512537 Release v1.5.37
2026-03-03 00:08:06 +01:00
Sucukdeluxe
7dc12aca0c Fix disk-backpressure stalls and improve episode-token parsing 2026-03-03 00:07:12 +01:00
Sucukdeluxe
a6c65acfcb Release v1.5.36
2026-03-02 23:50:31 +01:00
Sucukdeluxe
19342647e5 Fix download freeze spikes and unrestrict slot overshoot handling 2026-03-02 23:47:54 +01:00
Sucukdeluxe
7fe7d93e83 Fix dark text on package header progress bars (CSS specificity override)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:35:22 +01:00
Sucukdeluxe
19769ea0bb Fix combined progress display during extraction, fix pause showing Warte auf Daten
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:32:30 +01:00
Sucukdeluxe
c9bace1e5a Fix dual-layer text: use clip-path inset instead of wrapper for pixel-perfect alignment
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:29:28 +01:00
Sucukdeluxe
2d1b6de51a Fix dual-layer text alignment: use clip wrapper for proper text centering
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:25:59 +01:00
Sucukdeluxe
c07c0fbdf7 Dual-layer text on progress bars: dark text on bar, light text on track (JDownloader style)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:21:21 +01:00
Sucukdeluxe
578d050926 Merge Größe/Geladen into single progress bar column (JDownloader 2 style)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:16:45 +01:00
Sucukdeluxe
cb66661d9b Redesign history tab with package-card style, collapsible details, right-align remove button
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:05:23 +01:00
Sucukdeluxe
e35fc1f31a v1.5.28: Visual progress indicators (JDownloader 2 style)
2026-03-02 22:17:59 +01:00
Sucukdeluxe
0d6053fa76 Increase column widths for Fortschritt/Größe/Geladen spacing
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:51:43 +01:00
Sucukdeluxe
546d6af598 Fix Fortschritt/Geladen display: show progress for all items with totalBytes, hide 0 B
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:42:11 +01:00
Sucukdeluxe
2b12bd8c42 Add Fortschritt and Geladen columns to download grid
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:35:57 +01:00
Sucukdeluxe
05dc0ca1c6 Clear stale status texts on session load
Items with transient status texts like Provider-Cooldown, Warte auf
Daten, Verbindungsfehler are reset to "Wartet" when the app restarts,
so they don't show misleading status from a previous session.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:24:58 +01:00
Sucukdeluxe
5023a99f91 Fix circuit breaker triggering from parallel download failures
- Debounce: simultaneous failures within 2s count as 1 failure
  (prevents 8 parallel unrestrict failures from instant-triggering)
- Raise threshold from 8 to 20 consecutive failures before cooldown
- Escalation tiers: 20→30s, 35→60s, 50→120s, 80+→300s

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:21:47 +01:00
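The debounce and escalation tiers above can be sketched as two small helpers. The thresholds come from the commit message; the function names and structure are invented for illustration:

```typescript
// Escalating cooldowns: 20 failures -> 30s, 35 -> 60s, 50 -> 120s, 80+ -> 300s.
function cooldownMsFor(consecutiveFailures: number): number {
  if (consecutiveFailures >= 80) return 300_000;
  if (consecutiveFailures >= 50) return 120_000;
  if (consecutiveFailures >= 35) return 60_000;
  if (consecutiveFailures >= 20) return 30_000;
  return 0; // below threshold: no cooldown
}

// Debounce: failures within 2s of the last counted one are treated as
// the same event, so 8 parallel unrestrict failures count as one.
function makeFailureCounter(debounceMs = 2_000) {
  let count = 0;
  let lastCountedAt = -Infinity;
  return (now: number): number => {
    if (now - lastCountedAt >= debounceMs) {
      count += 1;
      lastCountedAt = now;
    }
    return count;
  };
}
```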
Sucukdeluxe
b2b62aeb52 Reduce stall timeout to 10s for faster retry
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:15:56 +01:00
Sucukdeluxe
ba7fe24a0a Fix pause bugs, faster stall retry
- Pause now aborts active extractions (previously extraction continued
  during pause, showing wrong status like Provider-Cooldown)
- Unpause clears provider circuit breaker for fresh start
- Post-processing status checks account for global pause state
- Reduce stall detection timeout from 30s to 15s for faster retry
- Reduce stall retry base delay from 500ms to 300ms

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:15:00 +01:00
Sucukdeluxe
55e5b0079a Fix pause showing Provider-Cooldown, lower extraction I/O priority
- Add pause check at top of processItem retry loop so items show
  "Pausiert" instead of "Provider-Cooldown" when paused
- Lower extraction process priority from BELOW_NORMAL to IDLE
  (IDLE_PRIORITY_CLASS on Windows also lowers I/O priority, reducing
  disk contention between extraction and active downloads)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:06:00 +01:00
Sucukdeluxe
e90e731eaa Fix app freezes and false provider cooldowns
- Make saveSettings async to stop blocking the event loop during downloads
- Add 120ms minimum gap for forced state emissions to prevent rapid-fire IPC
- Fix circuit breaker feedback loop: reset failure count after cooldown expires
- Add 120s time-decay for failure counter (transient bursts don't snowball)
- Raise circuit breaker threshold from 5 to 8 consecutive failures
- Stop counting network stalls as provider failures
- Items without a provider only check primary provider cooldown, not all

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:02:09 +01:00
Sucukdeluxe
0b73ea1386 Add "Jetzt entpacken" context menu, fix start freeze on large queues
- New "Jetzt entpacken" right-click option: triggers extraction for
  completed packages regardless of paused/stopped state
- Fix 5-10s freeze when pressing Start after Pause: recoverRetryableItems
  was calling fs.stat on every item (474+); now only checks failed/completed
- Full IPC pipeline: extractNow in manager, controller, preload, renderer

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 20:40:49 +01:00
Sucukdeluxe
e013c63c59 Fix long path extraction using subst drive mapping instead of \?\ prefix
WinRAR doesn't support the \\?\ prefix (it interprets it as a UNC network path).
Replace with subst drive mapping: maps targetDir to a short drive letter
(Z:, Y:, etc.) before extraction, then removes mapping after. This keeps
total paths under 260 chars even when archives contain deep internal
directory structures.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 20:32:23 +01:00
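The mapping flow above could look roughly like this. `subst` is the standard Windows tool; everything else here (helper names, the E:–Z: letter range) is an assumption for illustration, shown as pure functions that only build the command arguments:

```typescript
// Pick a free drive letter, walking backwards from Z: as the commit
// describes (Z:, Y:, ...). Returns undefined if all letters are taken.
function pickFreeDriveLetter(used: Set<string>): string | undefined {
  for (let c = "Z".charCodeAt(0); c >= "E".charCodeAt(0); c--) {
    const letter = String.fromCharCode(c) + ":";
    if (!used.has(letter)) return letter;
  }
  return undefined;
}

// Argument pairs for the subst lifecycle around one extraction.
function substArgs(drive: string, targetDir: string): string[][] {
  return [
    [drive, targetDir], // subst Z: <long target dir>  (create mapping)
    [drive, "/D"],      // subst Z: /D                 (remove mapping)
  ];
}
```

At runtime the first pair would be spawned before extraction and the second in a finally block, so the mapping is removed even if extraction fails.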
Sucukdeluxe
2ae22f942e Fix extraction failure on long paths (>260 chars) with \\?\ prefix
Add longPathForWindows() helper that prefixes extract target directories
with \\?\ on Windows, bypassing the 260-char MAX_PATH limit. Applied to
both WinRAR/UnRAR and 7z arguments. Fixes "Die Syntax für den
Dateinamen, Verzeichnisnamen" errors when archive internal directories
create deeply nested output paths.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 20:24:55 +01:00
Sucukdeluxe
a22a90adf3 Add retry extraction context menu, increase error text limit
- Right-click packages with extraction errors shows "Extraktion
  wiederholen" option to manually retry
- Increase WinRAR error text from 240 to 500 chars for better
  diagnostics in logs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 20:15:00 +01:00
Sucukdeluxe
ecf56cb977 Retry failed extractions on unpause, fix delete callback deps
- triggerPendingExtractions() now runs when unpausing, so packages
  with extraction errors are automatically retried
- executeDeleteSelection no longer depends on snapshot objects
  (prevents unnecessary re-renders with large queues)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 19:58:01 +01:00
Sucukdeluxe
430ec7352b Fix session download counter resetting when packages are removed
Session counter now uses sessionDownloadedBytes (in-memory counter)
instead of summing completed items. Removing packages after extraction
no longer resets the session total.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 19:33:58 +01:00
Sucukdeluxe
bc70ff94cc Persist totalDownloadedAllTime across restarts
- Save settings every 30s during active downloads (not just session)
- Force settings save on shutdown and run finish
- Preserve live totalDownloadedAllTime when user saves settings
  (app-controller's stale copy no longer overwrites the counter)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 19:29:32 +01:00
Sucukdeluxe
549328893e Separate pause/resume buttons, fix hybrid extraction during pause
- Pause button is now one-way (orange glow when paused, disabled when
  already paused). Start button resumes from pause.
- Fix hybrid extraction attempting incomplete multi-part archives when
  paused: disk-fallback now blocks any non-terminal item status, not
  just downloading/validating/integrity_check.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 19:22:58 +01:00
Sucukdeluxe
e6c17393fb Fix chart label clipping, persist speed history across tabs, fix pause button
- Dynamic left padding based on measured label width (no more cut-off numbers)
- Speed history ref lifted to App so chart data survives tab switches
- Pause button uses optimistic UI update for instant visual feedback

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 19:11:36 +01:00
Sucukdeluxe
ba673b9e53 Fix bandwidth chart growing infinitely
Use container.clientWidth/clientHeight instead of getBoundingClientRect
and stop overriding canvas style dimensions. CSS 100% handles display
sizing, canvas.width/height handles DPR-scaled render resolution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 19:04:06 +01:00
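The sizing rule from the commit above, written as a DOM-free helper so it can be shown in isolation (the name is illustrative): CSS keeps the display size at 100%, while canvas.width/height carry the DPR-scaled render resolution.

```typescript
// Compute the backing-store resolution for a canvas from its CSS size.
function renderSize(
  clientWidth: number,
  clientHeight: number,
  devicePixelRatio: number,
): { width: number; height: number } {
  return {
    width: Math.max(1, Math.round(clientWidth * devicePixelRatio)),
    height: Math.max(1, Math.round(clientHeight * devicePixelRatio)),
  };
}
```

In a resize handler this would be assigned to canvas.width/height only, never to canvas.style.width/height; writing the style dimensions back is the feedback loop that made the chart grow on every frame.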
Sucukdeluxe
3c510fc265 Add persistent all-time download counter, session vs total stats
- Track totalDownloadedAllTime in settings (persists across restarts)
- Track sessionDownloadedBytes for current app session
- Status bar shows both: Session + Gesamt
- Statistics section shows Heruntergeladen (Session) + (Gesamt)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 19:00:17 +01:00
Sucukdeluxe
4ea3a75dc0 Fix bandwidth chart not updating, fix pause button blocked by actionBusy
- Bandwidth chart drawChart effect now runs on every re-render instead
  of only when running/paused changes (chart was stuck showing empty)
- Pause button no longer wrapped in performQuickAction/actionBusy guard,
  so it works immediately even during ongoing operations

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 18:55:27 +01:00
Sucukdeluxe
670c2f1ff5 Unified package/total speed calculation, confirm dialog for context menu delete, show progress bar when collapsed
- Track packageId in speed events so package speed uses same 3-second
  window as global speed (fixes mismatch between package and status bar)
- Add packageSpeedBps to UiSnapshot, computed from speed events
- Context menu delete actions now respect confirmDeleteSelection setting
- Progress bar visible even when package is collapsed

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 18:45:40 +01:00
Sucukdeluxe
0d1deadb6f Fix tab action button hover clipping into search bar
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 18:33:09 +01:00
Sucukdeluxe
f11190ee25 Fix Ctrl+Click selection, add Delete key with confirmation dialog
- Fix Ctrl+Click: mousedown no longer immediately adds item, preventing
  onClick from toggling it back off. Drag-select still works via mouseenter.
- Delete key removes selected items/packages with JDownloader-style
  confirmation dialog showing count and remaining items.
- "Nicht mehr anzeigen" checkbox disables future confirmations.
- New setting "Vor dem Löschen bestätigen" under Allgemein.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 18:31:45 +01:00
Sucukdeluxe
eb42fbabfd Fix update retry: reset abort controller on early return errors
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 18:06:11 +01:00
Sucukdeluxe
84d5c0f13f JDownloader 2-style UI overhaul, multi-select, hoster display, settings sidebar
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 17:56:54 +01:00
Sucukdeluxe
09bc354c18 Detect dead links as permanent errors, fix last-episode extraction race
Dead link detection:
- Mega-Web: parse hoster error messages (hosterNotAvailable, etc.) from HTML
  and throw specific error instead of returning null
- MegaDebridClient: stop retrying on permanent hoster errors
- download-manager: isPermanentLinkError() immediately fails items with dead
  links instead of retrying forever

Extraction race condition:
- package_done cleanup policy checked if all items were "completed" (downloaded)
  but not if they were "extracted" — removing the package before the last
  episode could be extracted
- Both applyCompletedCleanupPolicy and applyPackageDoneCleanup now guard
  against premature removal when autoExtract is enabled

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 15:28:23 +01:00
Sucukdeluxe
3ed31b7994 Fix core download bugs: resume corruption, 416 handling, stream leak, drain timeout
- HTTP 200 on resume: detect server ignoring Range header, write in truncate mode
  instead of appending (prevents doubled/corrupted files)
- HTTP 416 without Content-Range: assume complete if >1MB exists instead of
  deleting potentially multi-GB finished files
- Stream handle leak: explicit destroy() after finally to prevent fd exhaustion
- Drain timeout: don't abort controller on disk backpressure, let inner retry
  loop handle it instead of escalating to full stall pipeline

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 15:12:33 +01:00
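The resume decisions in the commit above can be condensed into one table-like function. The thresholds are from the commit message; the type and names are invented. Given the status of a ranged GET and the bytes already on disk:

```typescript
type ResumeAction =
  | { kind: "append" }          // 206: server honored the Range header
  | { kind: "truncate" }        // 200: server ignored it, rewrite from zero
  | { kind: "assume_complete" } // 416 w/o Content-Range but >1 MB on disk
  | { kind: "restart" };        // anything else: redownload

function resumeAction(
  status: number,
  hasContentRange: boolean,
  bytesOnDisk: number,
): ResumeAction {
  if (status === 206) return { kind: "append" };
  if (status === 200) return { kind: "truncate" }; // appending here doubles the file
  if (status === 416 && !hasContentRange) {
    // Don't delete a potentially multi-GB finished file on a bare 416.
    return bytesOnDisk > 1_048_576
      ? { kind: "assume_complete" }
      : { kind: "restart" };
  }
  return { kind: "restart" };
}
```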
Sucukdeluxe
550942aad7 Overhaul download retry pipeline: circuit breaker, escalating backoff, shelve logic
- Provider circuit breaker: track consecutive failures per provider with
  escalating cooldowns (30s/60s/120s/300s), auto-invalidate Mega-Debrid
  session on cooldown
- Escalating backoff: retry delays now scale up to 120s (was 30s max),
  unrestrict backoff exponential instead of linear 15s cap
- Shelve logic: after 15 consecutive failures, item pauses 5 min with
  counter halving for gradual recovery
- Periodic soft-reset: every 10 min, reset stale retry counters (>10 min
  queued) and old provider failures (>15 min), acts like mini-restart
- Mega-Debrid queue timeout: 90s wait limit in runExclusive to prevent
  cascade blocking behind stuck calls
- Provider-cooldown-aware retry delays: items wait for provider cooldown
  instead of retrying against broken service
- Fix: reconnect/package_toggle now persist retry counters (previously
  lost on interruption, defeating shelve logic)
- Mega-Debrid generate: tighter timeouts, progressive reload backoff,
  hoster retry limit (5x max)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 15:00:17 +01:00
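The shelve rule from the list above can be sketched as a single transition function. The constants (15 failures, 5 minutes, counter halving) come from the commit message; the shape of the state is an assumption:

```typescript
const SHELVE_AFTER = 15;
const SHELVE_MS = 5 * 60_000;

// On each failure, either keep counting or park the item for 5 minutes.
// Halving the counter on shelve means a recovering item re-shelves
// after ~8 more failures instead of immediately, giving gradual recovery.
function onItemFailure(
  failures: number,
): { shelveMs: number; failures: number } {
  const next = failures + 1;
  if (next >= SHELVE_AFTER) {
    return { shelveMs: SHELVE_MS, failures: Math.floor(next / 2) };
  }
  return { shelveMs: 0, failures: next };
}
```

This is also why persisting the counter matters (the reconnect/package_toggle fix above): if the counter resets on every interruption, `next` never reaches 15 and the shelve never triggers.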
Sucukdeluxe
a9c8ee2ff4 Sort active packages by completion percentage, not absolute count
1/6 (16.7%) now ranks above 5/153 (3.3%) since it's further along.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 14:03:43 +01:00
Sucukdeluxe
10b4f18d33 Split progress bar: blue for download, green for extraction
Left half (blue) shows download progress, right half (green) shows
extraction progress. When no extraction is active, full bar is blue.
Full bar = 50% blue (all downloaded) + 50% green (all extracted).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 12:57:11 +01:00
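The split-bar math described above, as a small pure helper (names are illustrative; fractions are 0..1): the blue half encodes download progress, the green half extraction progress, and when no extraction is active the blue segment gets the whole track.

```typescript
function barWidths(
  downloadFrac: number,
  extractFrac: number,
  extracting: boolean,
): { bluePct: number; greenPct: number } {
  if (!extracting) {
    // No extraction running: full-width blue download bar.
    return { bluePct: downloadFrac * 100, greenPct: 0 };
  }
  // Full bar = 50% blue (all downloaded) + 50% green (all extracted).
  return { bluePct: downloadFrac * 50, greenPct: extractFrac * 50 };
}
```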
Sucukdeluxe
45b0d71dd0 Sort pinned packages by progress: most completed items first
Active packages are now sorted by completed item count (descending),
then by downloaded bytes, so packages with real download progress
appear above packages still in link resolution phase.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 12:53:54 +01:00
Sucukdeluxe
e0eb3f0453 Pin actively downloading packages to top of list
Packages with items in downloading/validating/extracting/integrity_check
state are always shown at the top regardless of sort order (A-Z or Z-A).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 12:50:14 +01:00
Sucukdeluxe
daf70211ac Add total download speed to stats bar
Show current aggregate speed (all active downloads) in the stats bar
next to Pakete/Dateien/Gesamt. Only visible while session is running.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 12:47:11 +01:00
Sucukdeluxe
c28384e78d Detect hoster error-page downloads (<512 B) and trigger retry
Some hosters return tiny error responses (e.g. 9 bytes) with HTTP 200.
- downloadToFile: detect files <512 B, log content, delete and throw for retry
- Post-download: catch <512 B files even when totalBytes is unknown
- Logs the error-page content for debugging

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 12:44:02 +01:00
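The guard above can be sketched as a predicate (the 512-byte threshold is from the commit; the name and the handling of unknown sizes are illustrative):

```typescript
const MIN_PLAUSIBLE_BYTES = 512;

// Hosters sometimes answer HTTP 200 with a tiny HTML/text error body
// (e.g. 9 bytes) instead of the file. Treat sub-512 B downloads as
// error pages unless the file was genuinely expected to be that small.
function looksLikeErrorPage(
  sizeOnDisk: number,
  expectedBytes?: number,
): boolean {
  if (sizeOnDisk >= MIN_PLAUSIBLE_BYTES) return false;
  // Even with unknown totalBytes, a sub-512 B "download" is almost
  // certainly an error response worth logging, deleting, and retrying.
  return expectedBytes === undefined || expectedBytes >= MIN_PLAUSIBLE_BYTES;
}
```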
Sucukdeluxe
122c797652 Fix 9-byte download regression: revert download.bin change, add size guards
- Revert v1.4.79 download.bin filename logic that caused broken downloads
- Item-recovery: require file ≥10 KB (or ≥50% of expected size) instead of >0
- Post-download: reject files <1 KB when expected size >10 KB
- Disk-fallback: require parts ≥10 KB before considering archive ready

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 12:31:40 +01:00
Sucukdeluxe
14d5a3abb4 Fix zip volume cleanup regex and atomic progress file writes
- Fix .z001/.z002 split zip volumes not deleted after extraction
  (regex matched only 2-digit, now matches 2-3 digit volumes)
- Make extract progress file writes atomic (write to .tmp then rename)
  to prevent corruption on crash during extraction

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 10:51:49 +01:00
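The write-to-tmp-then-rename pattern from the commit above, sketched with Node's fs (the function name is illustrative). rename() within one filesystem is atomic, so a crash mid-write leaves either the old file or the new file, never a torn mix:

```typescript
import { writeFileSync, renameSync } from "node:fs";

function writeProgressAtomic(path: string, data: string): void {
  const tmp = path + ".tmp";
  writeFileSync(tmp, data, "utf8");   // may be torn on crash, but is disposable
  renameSync(tmp, path);              // atomic replace on the same volume
}
```

The one constraint is that the .tmp sibling must live on the same volume as the target; renaming across volumes degrades to copy+delete and loses the atomicity.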
Sucukdeluxe
ca05fa184d Fix download.bin filename flickering during unrestrict
Keep existing good filename when debrid API returns "download.bin"
or opaque name. Only overwrite item.fileName if the resolved name
is actually better (not opaque, not download.bin).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 10:43:29 +01:00
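The naming rule above can be sketched like this. The opaque-name check is a loose illustrative heuristic (hex-blob names), not the project's actual test:

```typescript
// "Opaque" here: download.bin or a bare hash-like name with no real stem.
function isOpaqueName(name: string): boolean {
  if (/^download\.bin$/i.test(name)) return true;
  return /^[0-9a-f]{16,}(\.[a-z0-9]+)?$/i.test(name);
}

// Only overwrite an existing good filename when the resolved name is
// actually better; otherwise keep what we already have.
function pickFileName(existing: string | undefined, resolved: string): string {
  if (!existing) return resolved;
  return isOpaqueName(resolved) ? existing : resolved;
}
```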
Sucukdeluxe
13ff41aa95 Add item-recovery for stuck downloads with completed files on disk
Some checks are pending
Build and Release / build (push) Waiting to run
When post-processing runs, detect items in idle states (queued/paused)
whose target file already exists on disk with non-zero size, and
auto-recover them to "completed" status. This ensures allDone becomes
true, triggering final extraction with archive cleanup (delete mode).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 10:40:28 +01:00
Sucukdeluxe
d8a53dcea6 Fix hybrid extraction skipping archives when item status stuck
Add disk-fallback to findReadyArchiveSets: when all archive parts
physically exist on disk with non-zero size and none are actively
downloading/validating, consider the archive ready for extraction.
This fixes episodes being skipped when a download item's status
was not updated to "completed" despite the file being fully written.

Also improve debug server: raise log limit to 10000 lines,
add grep filter, add /session endpoint for raw session data.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 10:36:08 +01:00
Sucukdeluxe
9f589439a1 Add debug HTTP server for remote monitoring
Starts an HTTP server on port 9868 (configurable via debug_port.txt)
when debug_token.txt exists in the app runtime directory. Provides
/health, /log, /status, and /items endpoints for live monitoring.
Token-based auth required.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 22:58:54 +01:00
Sucukdeluxe
5bb984d410 Fix validating-stuck watchdog aborting before unrestrict timeout
The validating-stuck timeout (45s) was shorter than the unrestrict
timeout (60s), causing items to be endlessly aborted and retried
before the debrid API call could complete. Now uses unrestrict
timeout + 15s buffer.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 22:51:17 +01:00
Sucukdeluxe
1825e8ba04 perf: improve extraction status, stuck detection, and retry logic
- Extraction status: "Entpackt - Done" / "Entpacken - Ausstehend"
- Per-item extraction progress (no cross-contamination)
- Validating-stuck watchdog: abort items stuck >45s in "Link wird umgewandelt"
- Global stall timeout reduced 90s → 60s, unrestrict timeout 120s → 60s
- Unrestrict retry: longer backoff (5/10/15s), reset partial downloads
- Stall retry: reset partial downloads for fresh link
- Mega-Web generate: max 30 polls (was 60), 45s overall timeout
- Mega-Web session refresh: 10min (was 20min)
- Comprehensive logging on all retry/failure paths

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 22:38:46 +01:00
Sucukdeluxe
0e55c28142 feat: replace default Electron icon with custom app icon
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 22:18:34 +01:00
Sucukdeluxe
e485cf734b Async FS optimizations, exponential backoff, cleanup dedup and release v1.4.72
- Convert all sync FS ops (existsSync, readdirSync, statSync, writeFileSync,
  rmSync, renameSync) to async equivalents across download-manager, extractor,
  cleanup, storage, and logger to prevent UI freezes
- Replace linear retry delays with exponential backoff + jitter to prevent
  retry storms with many parallel downloads
- Deduplicate resolveArchiveItems into single shared function
- Replace Array.shift() O(N) in bandwidth chart with slice-based trimming
- Make logger rotation async in the async flush path
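The exponential-backoff-with-jitter change above can be sketched like this (the base delay, cap, and the "full jitter" variant are assumptions, not the app's actual values):

```typescript
// Delay grows exponentially with each attempt but is fully randomized, so
// many parallel downloads retrying at once don't all hit the server in the
// same instant (no retry storm). Base/cap values are illustrative.
function retryDelayMs(attempt: number, baseMs = 1000, capMs = 60_000): number {
  const exp = Math.min(capMs, baseMs * 2 ** attempt); // 1s, 2s, 4s, … capped
  return Math.floor(Math.random() * exp);             // "full jitter" variant
}
```

Compared with a linear schedule, the randomization spreads retries across the whole window instead of clustering them at fixed intervals.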

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 21:53:07 +01:00
Sucukdeluxe
520ef91d2d Fix duplicate extraction and release v1.4.71
- Don't clear extraction resume state during hybrid mode (skipPostCleanup)
- Mark ALL completed items as "Entpackt" after successful hybrid extraction
  to prevent full extraction from re-extracting already-extracted archives

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 21:09:53 +01:00
Sucukdeluxe
674cf101da Fix extraction status cross-contamination with filename pattern matching, release v1.4.70
The previous fix used pathKey-based maps, which failed due to path-resolution
mismatches on Windows. The new approach matches items to archives using
filename regex patterns directly (e.g. prefix.part\d+.rar), which is
robust regardless of path casing/resolution.

Also marks items as "Entpackt" immediately when their archive finishes
instead of waiting for all archives to complete, so completed episodes
show correct status while later episodes are still extracting.
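The filename-pattern matching described above can be sketched as follows (the helper name and exact escaping are illustrative):

```typescript
// Derive a regex from an archive's first part and test item filenames
// directly, so Windows path casing/resolution differences don't matter.
// Returns null for filenames that are not part of a .partN.rar set.
function multipartPattern(firstPart: string): RegExp | null {
  const m = /^(.*)\.part\d+\.rar$/i.exec(firstPart);
  if (!m) return null;
  // Escape the prefix so dots in the release name match literally.
  const prefix = m[1].replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  return new RegExp(`^${prefix}\\.part\\d+\\.rar$`, "i");
}
```

Matching on filenames alone sidesteps path.resolve() entirely, which is what made the pathKey approach fragile.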

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:57:56 +01:00
Sucukdeluxe
4371e53b86 Fix retry recovery, extraction status cross-contamination and UI freezes, release v1.4.69
- togglePause: clear retry delays and abort stuck tasks on unpause so
  Pause/Start actually recovers stuck downloads
- Fix retry display showing Number.MAX_SAFE_INTEGER instead of "inf"
  for unrestrict and generic error retries
- Fix extraction status applied to ALL items in package instead of only
  the items belonging to the currently extracting archive
- Make persistNow always async and item-completion stat async to reduce
  UI freezes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:42:03 +01:00
Sucukdeluxe
bf2b685e83 Add session backup restore and release v1.4.68
2026-03-01 20:13:16 +01:00
Sucukdeluxe
e7f0b1d1fd Fix start-conflict skip behavior and release v1.4.67
2026-03-01 20:03:58 +01:00
Sucukdeluxe
647679f581 Fix Mega-Web unrestrict hangs and release v1.4.66
2026-03-01 19:06:34 +01:00
Sucukdeluxe
237bf6731d Move Statistiken tab to the right of Einstellungen
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 18:11:41 +01:00
Sucukdeluxe
25af89d7a0 Release v1.4.64 to test new buffer download method
2026-03-01 17:23:52 +01:00
Sucukdeluxe
e384199c6e Replace streaming download with chunked buffer download to fix corruption
- Replace Readable.fromWeb() + pipeline with ReadableStream.getReader() loop
- Collect chunks in memory, verify size, then write to disk in one shot
- Add Accept-Encoding: identity to prevent content encoding issues
- Eliminates stream conversion bugs that caused file corruption on some servers
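The getReader() loop described above can be sketched like this (the function name and size check are illustrative; the real code then writes the assembled buffer to disk in one shot):

```typescript
// Read a response body chunk by chunk via ReadableStream.getReader(),
// collect the chunks in memory, and verify the total size before returning.
// Avoids the Readable.fromWeb() + pipeline conversion path entirely.
async function readAllChunks(
  body: ReadableStream<Uint8Array>,
  expectedSize?: number
): Promise<Buffer> {
  const reader = body.getReader();
  const chunks: Uint8Array[] = [];
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    if (value) {
      chunks.push(value);
      total += value.byteLength;
    }
  }
  const buf = Buffer.concat(chunks, total);
  if (expectedSize !== undefined && buf.byteLength !== expectedSize) {
    throw new Error(`size mismatch: got ${buf.byteLength}, expected ${expectedSize}`);
  }
  return buf;
}
```

The trade-off is memory: the whole file is buffered before the write, which is fine for typical archive parts but worth keeping in mind for very large single files.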

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 17:20:02 +01:00
Sucukdeluxe
98425764d3 Release v1.4.62 to test updater fix
2026-03-01 17:15:08 +01:00
Sucukdeluxe
8f186ad894 Fix updater SHA512 mismatch by patching latest.yml filenames and adding integrity retry
- Patch latest.yml during release to use actual filenames (spaces) instead of electron-builder's dashed names
- Add download size validation before SHA512 check to catch incomplete downloads
- Retry download on integrity mismatch (up to 3 passes) with API refresh
- Re-resolve digest from latest.yml on each retry pass

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 17:04:12 +01:00
Sucukdeluxe
8b153bdabe Release v1.4.60 for update-path testing
2026-03-01 16:51:16 +01:00
Sucukdeluxe
401ad3b9f9 Release v1.4.58 to replace stale updater assets
2026-03-01 16:44:26 +01:00
Sucukdeluxe
5e1e62d5b6 Release v1.4.57 for updater validation
2026-03-01 16:31:09 +01:00
Sucukdeluxe
afc70e6a10 Release v1.4.56 with updater 404 fallback fixes
2026-03-01 16:19:39 +01:00
Sucukdeluxe
20c32d39c8 Harden updater candidate fallback on Codeberg 404s 2026-03-01 16:19:35 +01:00
Sucukdeluxe
73cd2ea6b9 Release v1.4.55 for updater path testing
2026-03-01 16:06:53 +01:00
Sucukdeluxe
c5c862b516 Release v1.4.54 with bandwidth statistics tab
2026-03-01 15:57:28 +01:00
Sucukdeluxe
a0cdac87e8 Add bandwidth statistics tab with live chart
- Add new Statistics tab between Downloads and Settings
- Implement real-time bandwidth chart using Canvas (60s history)
- Add session overview with 8 stats cards (speed, downloaded, files, packages, etc.)
- Add provider statistics with progress bars
- Add getSessionStats IPC endpoint
- Support dark/light theme in chart rendering
2026-03-01 15:56:57 +01:00
Sucukdeluxe
f5d7ee4d1a Release v1.4.53 with updater verification fallback
2026-03-01 15:51:16 +01:00
Sucukdeluxe
43950014b2 Retry updater candidates after hash mismatch 2026-03-01 15:51:13 +01:00
Sucukdeluxe
ec45983810 Release v1.4.51 with MKV post-cleanup and updater fixes
2026-03-01 05:04:31 +01:00
Sucukdeluxe
7795208332 Fix MKV collection cleanup and updater digest verification 2026-03-01 05:01:11 +01:00
Sucukdeluxe
b0dc7b80ab Release v1.4.50 with infinite retry default
2026-03-01 04:32:27 +01:00
Sucukdeluxe
ab9b3e87b1 Set default auto-retry limit to infinite 2026-03-01 04:31:04 +01:00
Sucukdeluxe
116135289c Release v1.4.49 with retry-limit and extract-folder cleanup updates
2026-03-01 04:27:42 +01:00
Sucukdeluxe
2bddd5b3b2 Apply configurable retry limit and clean empty extract dirs more aggressively 2026-03-01 04:26:33 +01:00
Sucukdeluxe
5f2eb907b6 Release v1.4.48 with configurable retry limit setting
2026-03-01 04:19:40 +01:00
Sucukdeluxe
3f17cc8cb4 Add configurable auto-retry limit with optional infinite retries 2026-03-01 04:18:41 +01:00
Sucukdeluxe
33e2e126f1 Release v1.4.47 with stuck-download watchdog improvements
2026-03-01 04:08:16 +01:00
Sucukdeluxe
467d4bbc58 Add watchdogs for stuck unrestrict and low-throughput downloads 2026-03-01 04:07:23 +01:00
Sucukdeluxe
e1e7f63f50 Release v1.4.46 for updater verification
2026-03-01 04:01:29 +01:00
Sucukdeluxe
18fdbcba18 Release v1.4.45 with updater integrity hardening
2026-03-01 03:58:15 +01:00
Sucukdeluxe
4bcb069ec7 Harden updater integrity checks with latest.yml SHA512 fallback 2026-03-01 03:57:23 +01:00
Sucukdeluxe
bfbaee8e5c Release v1.4.44 with extraction smoothness and rename fixes
2026-03-01 03:49:11 +01:00
Sucukdeluxe
282c1ebf1d Reduce extract lag and improve long-path auto-rename stability 2026-03-01 03:47:18 +01:00
Sucukdeluxe
6e50841387 Release v1.4.43 with update progress and collector metrics
2026-03-01 03:34:28 +01:00
Sucukdeluxe
508977e70b Add update install progress feedback and collector metrics line 2026-03-01 03:33:18 +01:00
Sucukdeluxe
cb1e4bb0c1 Release v1.4.42 with flat MKV collection support
2026-03-01 03:24:56 +01:00
Sucukdeluxe
809aec69c2 Add optional flat MKV library collection per package 2026-03-01 03:23:26 +01:00
Sucukdeluxe
71aa9204f4 Add one-command Codeberg release workflow 2026-03-01 02:54:13 +01:00
Sucukdeluxe
65cf4c217f Release v1.4.41 for updater validation on Codeberg
2026-03-01 02:47:53 +01:00
Sucukdeluxe
2d3a1df21d Release v1.4.40 with Codeberg updater migration
2026-03-01 02:42:06 +01:00
Sucukdeluxe
43bc95b7fc Switch updater and docs from GitHub to Codeberg 2026-03-01 02:40:11 +01:00
Sucukdeluxe
fe59e064f0 Translate README to English and refine wording 2026-03-01 02:29:37 +01:00
Sucukdeluxe
67fc7a9226 Revise README with modern feature and setup overview 2026-03-01 02:28:15 +01:00
Sucukdeluxe
474ff8cd26 Release v1.4.39 with Multi Debrid Downloader title format
2026-03-01 02:25:17 +01:00
Sucukdeluxe
310f4dc58a Release v1.4.38 with version display formatting tweak
- Change header to "Multi Debrid Downloader - vX.X.X" (dash separator)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 02:17:33 +01:00
Sucukdeluxe
d491c21b97 Release v1.4.37 with DLC filenames, instant delete, version display
- Parse <filename> tags from DLC XML for proper file names instead of
  deriving from opaque URLs (fixes download.bin display)
- Optimistic UI removal for package/item delete (instant feedback)
- Show app version in header ("Multi Debrid Downloader vX.X.X")

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 02:12:06 +01:00
Sucukdeluxe
a0c58aad2c Release v1.4.36 with tolerant DLC padding
- Local DLC decryption no longer throws on invalid PKCS7 padding
- Instead tries to parse the decrypted data as-is (Node.js base64
  decoder is lenient with trailing garbage bytes)
- This fixes large DLCs that previously failed locally and then
  hit dcrypt.it's 413 size limit on both endpoints
- Empty decryption result returns [] instead of throwing

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 02:03:37 +01:00
Sucukdeluxe
48c89713ba Release v1.4.35 with 413 handling for both dcrypt endpoints
- Handle 413 from paste endpoint (not just upload)
- Show clear German error "DLC-Datei zu groß für dcrypt.it" when both
  endpoints reject the file due to size
- Add tests for dual-413 and upload-413+paste-500 scenarios

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 02:00:52 +01:00
Sucukdeluxe
778124312c Release v1.4.34 with DLC import package-name and 413 fixes
- Use DLC filename as package name in dcrypt fallback instead of
  inferring from individual URLs (fixes mangled package names)
- Add paste endpoint fallback when dcrypt upload returns 413
- Split decryptDlcViaDcrypt into tryDcryptUpload/tryDcryptPaste
- Add DCRYPT_PASTE_URL constant
- Expand container tests for 413 fallback and dual-failure scenarios

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 01:51:40 +01:00
Sucukdeluxe
edbfba6663 Release v1.4.33 with DLC import and stats hotfixes
2026-03-01 01:39:51 +01:00
Sucukdeluxe
ff1036563a Release v1.4.32 with intensive renamer hardening
2026-03-01 01:21:13 +01:00
Sucukdeluxe
6ac56c0a77 Release v1.4.31 with full bug-audit hardening
2026-03-01 00:33:26 +01:00
Sucukdeluxe
6ae687f3ab Release v1.4.30 with startup and UI race-condition fixes
2026-02-28 22:33:19 +01:00
Sucukdeluxe
eda9754d30 Release v1.4.29 with downloader and API safety hardening
2026-02-28 21:31:42 +01:00
Sucukdeluxe
84d8f37ba6 Release v1.4.28 with expanded bug audit fixes
2026-02-28 19:47:46 +01:00
Sucukdeluxe
05a0c4fd55 Fix PackageCard memo comparator to include callback props
Prevents stale closures when callback identities change but data props remain the same.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 18:09:51 +01:00
Sucukdeluxe
8700db4a37 Release v1.4.27 with bug audit hardening fixes 2026-02-28 14:12:16 +01:00
Sucukdeluxe
cbc423e4b7 Release v1.4.26 with remaining bug audit fixes
- AllDebrid: add HTML response detection to unrestrictLink
- Cleanup: skip symlinks/junctions in all directory traversals
- Blob URL: increase revoke delay from 0ms to 60s
- Extractor: per-package progress file to prevent collision
- ADD_CONTAINERS: reject path traversal and relative paths

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 13:09:59 +01:00
Sucukdeluxe
06a272ccbd Release v1.4.25 with hybrid extraction status fix
- Items extracted during hybrid extraction now show "Entpackt" instead of "Fertig"
- Only items belonging to the extracted archive set get status updates during hybrid extraction
- Final extraction preserves "Entpackt" status from prior hybrid passes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 13:01:49 +01:00
Sucukdeluxe
c1e614650a Release v1.4.24 with UI improvements
- Fix drag overlay appearing during internal package reorder
- Rename "Paket abbrechen" to "Paket löschen"
- Make package deletion instant (remove performQuickAction delay)
- Add A-Z / Z-A sorting buttons for packages

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 12:41:39 +01:00
Sucukdeluxe
9598fca34e Release v1.4.23 with critical bug audit fixes
2026-02-28 12:16:08 +01:00
Sucukdeluxe
f70237f13d Release v1.4.22 with incremental hybrid extraction (JDownloader-style)
Implements hybrid extraction: when a package has multiple episodes with
multi-part archives, completed archive sets are extracted immediately
while the rest of the package continues downloading. Uses the existing
hybridExtract setting (already in UI/types/storage).

Key changes:
- Export findArchiveCandidates/pathSetKey from extractor.ts
- Add onlyArchives/skipPostCleanup options to ExtractOptions
- Add findReadyArchiveSets to identify complete archive sets
- Add runHybridExtraction for incremental extraction passes
- Requeue logic in runPackagePostProcessing for new completions
- Resume state preserved across hybrid passes (no premature clear)
- Guard against extracting incomplete multi-part archives
- Correct abort/toggle handling during hybrid extraction
- Package toggle now also aborts active post-processing

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 07:25:18 +01:00
Sucukdeluxe
d7162592e0 Release v1.4.21 with download engine performance optimizations
- Cache itemCount as class property instead of O(n) Object.keys().length on every emit/persist/UI update
- Eliminate redundant iteration: hasQueuedItems now delegates to findNextQueuedItem
- Remove expensive cloneSession from getSnapshot (IPC serialization handles the copy)
- Increase speed events compaction threshold (50 → 200) to reduce array reallocations
- Time-based UI emit throttling instead of per-percent progress checks
- Avoid Array.from allocation in global stall watchdog
- Optimize markQueuedAsReconnectWait to iterate only run items instead of all items
- Cache pathKey computation in claimTargetPath loop (avoid path.resolve per iteration)
- Use packageOrder.length instead of Object.keys(packages).length in getStats

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 06:49:55 +01:00
Sucukdeluxe
1ba635a793 Fix auto-rename double-episode bug and add 62 rename tests
Bug fixed: When a folder already contained the same episode token as
the source file (e.g. Show.S01E05.720p-4sf + s01e05.mkv), the episode
was inserted a second time producing Show.S01E05.720p.S01E05-4sf.

Root cause: the replace produced an identical string, so the equality
check fell through to the suffix-insert branch, which added the token again.
Fix: Use regex.test() first, then always apply the replacement when
an episode pattern exists in the folder name.
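The fix can be sketched as follows (names and the exact episode-token regex are illustrative):

```typescript
// First check with regex.test() whether the folder name already carries an
// episode token. If it does, replace that token in place (a no-op when it is
// already identical); only when no token exists is the episode appended.
// The old code compared replace() output with the input, and an identical
// result fell through to the append branch — doubling the token.
const EPISODE_RE = /S\d{2}E\d{2}/i;

function applyEpisodeToFolder(folder: string, episode: string): string {
  if (EPISODE_RE.test(folder)) {
    return folder.replace(EPISODE_RE, episode); // normalize existing token
  }
  return `${folder}.${episode}`;                // no token yet: append one
}
```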

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 06:34:46 +01:00
Sucukdeluxe
b906d8a2bd Fix electron-builder: remove type:module, use vite.config.mts instead
The "type": "module" in package.json caused tsup to emit .cjs files instead
of .js, breaking the electron-builder entry point check. Using .mts extension
for vite config achieves ESM for Vite without affecting the rest of the package.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 06:28:45 +01:00
Sucukdeluxe
63fd402083 Release v1.4.20 with comprehensive audit fixes (140 issues) and expanded test coverage
- Speed calculation: raised the minimum elapsed-time floor to 0.5s, preventing unrealistic speed spikes
- Reconnect: exponential backoff with consecutive counter, clock regression protection
- Download engine: retry byte tracking (itemContributedBytes), mkdir before createWriteStream, content-length validation
- Fire-and-forget promises: all void promises now have .catch() error handlers
- Session recovery: normalize stale active statuses to queued on crash recovery, clear speedBps
- Storage: config backup (.bak) before overwrite, EXDEV cross-device rename fallback with type guard
- IPC security: input validation on all string/array IPC handlers, CSP headers in production
- Main process: clipboard memory limit (50KB), installer timing increased to 800ms
- Debrid: attribute-order-independent meta tag regex for Rapidgator filename extraction
- Constants: named constants for magic numbers (MAX_MANIFEST_FILE_BYTES, MAX_LINK_ARTIFACT_BYTES, etc.)
- Extractor/integrity: use shared constants, document password visibility and TOCTOU limitations
- Tests: 103 tests total (55 new), covering utils, storage, integrity, cleanup, extractor, debrid, update

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 06:23:24 +01:00
Sucukdeluxe
556f0672dc Release v1.4.19 with 4SF/4SJ auto-rename support
2026-02-28 05:50:14 +01:00
Sucukdeluxe
b971a79047 Release v1.4.18 with performance optimization and deep bug fixes
- Optimize session cloning: replace JSON.parse/stringify with shallow spread (~10x faster for large queues)
- Convert blocking fs.existsSync/statSync to async on download hot path
- Fix EXDEV cross-device rename in sync saveSettings/saveSession (network drive support)
- Fix double-delete bug in applyCompletedCleanupPolicy (package_done + immediate)
- Fix dangling runPackageIds/runCompletedPackages in removePackageFromSession
- Fix AdmZip partial extraction: use overwrite mode for external fallback
- Add null byte stripping to sanitizeFilename (path traversal prevention)
- Add 5MB size limit for hash manifest files (OOM prevention)
- Add 256KB size limit for link artifact file content check
- Deduplicate cleanup code via centralized removePackageFromSession

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 05:30:28 +01:00
Sucukdeluxe
d4dd266f6b Release v1.4.17 with security fixes, stability hardening and retry improvements
- Fix ZIP path traversal vulnerability (reject entries escaping target dir)
- Add single instance lock (prevent data corruption from multiple instances)
- Add unhandled exception/rejection handlers (prevent silent crashes)
- Fix mainWindow reference cleanup on close
- Add second-instance handler to focus existing window
- Fix claimTargetPath infinite loop (add 10k iteration bound)
- Add duplicate startItem guard (prevent concurrent downloads of same item)
- Clone session in getSnapshot to prevent live-reference mutation bugs
- Clear stateEmitTimer on clearAll to prevent dangling timer emissions
- Add extraction timeout safety (4h deadline with logging)
- Add dedicated unrestrict retry system with longer backoff for Mega-Debrid errors
- Add log rotation (10MB max, keeps one .old backup)
- Fix writeExtractResumeState missing mkdir (prevents crash on deleted dirs)
- Fix saveSessionAsync EXDEV cross-device rename with copy fallback
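The ZIP path-traversal check from the first bullet can be sketched like this (the function name is illustrative):

```typescript
import * as path from "node:path";

// Resolve each archive entry against the target directory and reject
// anything that escapes it (e.g. "../../evil.exe" or an absolute path).
// Appending path.sep prevents a sibling like "out2" passing as "out".
function isSafeZipEntry(targetDir: string, entryName: string): boolean {
  const root = path.resolve(targetDir);
  const dest = path.resolve(root, entryName);
  return dest === root || dest.startsWith(root + path.sep);
}
```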

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 05:04:21 +01:00
Sucukdeluxe
ea6301d326 Release v1.4.16 with crash prevention and hang protection
- Add 30s fetch timeouts to ALL API calls (Real-Debrid, BestDebrid, AllDebrid, Mega-Web)
- Fix race condition in concurrent worker indexing (runWithConcurrency)
- Guard JSON.parse in RealDebrid response with try-catch
- Add try-catch to fs.mkdirSync in download pipeline (handles permission denied)
- Convert MD5/SHA1 hashing to streaming (prevents OOM on large files)
- Add error handling for hash manifest file reading
- Prevent infinite hangs on unresponsive API endpoints
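Streaming hashing, as in the MD5/SHA1 bullet above, can be sketched like this (the function name is illustrative):

```typescript
import { createHash } from "node:crypto";
import { createReadStream } from "node:fs";

// Feed the file through a crypto Hash object chunk by chunk instead of
// reading it into memory first, so even a 40 GB file needs constant memory.
function hashFileStreaming(filePath: string, algo: "md5" | "sha1"): Promise<string> {
  return new Promise((resolve, reject) => {
    const hash = createHash(algo);
    createReadStream(filePath)
      .on("data", (chunk) => hash.update(chunk))
      .on("error", reject)
      .on("end", () => resolve(hash.digest("hex")));
  });
}
```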

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 21:43:40 +01:00
Sucukdeluxe
147269849d Release v1.4.15 with deep performance optimizations and crash prevention
- Convert session persistence to async during downloads (prevents 50-200ms UI freezes)
- Avoid unnecessary Buffer.from() copy on Uint8Array chunks (zero-copy when possible)
- Cache effective speed limit for 2s (eliminates Date object creation per chunk)
- Replace O(n) speed event shift() with pointer-based pruning
- Throttle speed event pruning to every 1.5s instead of per chunk
- Optimize refreshPackageStatus to single-loop counting (was 4 separate filter passes)
- Fix global stall watchdog race condition (re-check abort state before aborting)
- Add coalescing for async session saves (prevents write storms)
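Pointer-based pruning, as in the speed-event bullet above, can be sketched like this (the data shapes and compaction threshold are illustrative):

```typescript
// Instead of shift() (which reindexes the whole array, O(n) per removal),
// advance a head pointer past expired entries and compact the backing array
// only once enough dead entries have accumulated.
interface SpeedEvent { at: number; bytes: number; }

class SpeedWindow {
  private events: SpeedEvent[] = [];
  private head = 0; // index of the first still-valid event

  push(ev: SpeedEvent, windowMs = 5000): void {
    this.events.push(ev);
    // Move the pointer instead of shifting elements off the front.
    while (this.head < this.events.length && this.events[this.head].at < ev.at - windowMs) {
      this.head++;
    }
    // Amortized O(1): compact only after many dead entries pile up.
    if (this.head > 200) {
      this.events = this.events.slice(this.head);
      this.head = 0;
    }
  }

  liveCount(): number {
    return this.events.length - this.head;
  }
}
```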

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 21:35:05 +01:00
Sucukdeluxe
cc887eb8a1 Release v1.4.14 with extraction performance optimization and bug fixes
- Add multi-threaded extraction via WinRAR -mt flag (uses all CPU cores)
- Fix -idq flag suppressing progress output, replaced with -idc
- Fix extraction timeout for multi-part archives (now calculates total size across all parts)
- Raise extraction timeout cap from 40min to 2h for large archives (40GB+)
- Add natural episode sorting (E1, E2, E10 instead of E1, E10, E2)
- Add split archive support (.zip.001, .7z.001) with proper cleanup
- Add write-stream drain timeout to prevent download freezes on backpressure
- Fix regex global-state bug in progress percentage parsing
- Optimize speed event pruning (every 1.5s instead of every chunk)
- Add performance flag fallback for older WinRAR versions
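Natural episode sorting, as in the bullet above, can be done with Intl.Collator's numeric mode (this usage is a sketch, not necessarily the app's implementation):

```typescript
// Compare embedded numbers numerically so E2 sorts before E10.
// Intl.Collator with numeric:true handles this without a custom tokenizer.
const naturalCompare = new Intl.Collator(undefined, {
  numeric: true,
  sensitivity: "base",
}).compare;

const episodes = ["E1", "E10", "E2"];
episodes.sort(naturalCompare);
// episodes is now ["E1", "E2", "E10"]
```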

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 21:28:03 +01:00
Sucukdeluxe
6d8ead8598 Release v1.4.13 with global stall watchdog and freeze recovery 2026-02-27 20:53:07 +01:00
Sucukdeluxe
0f85cd4c8d Release v1.4.12 with connection stall recovery and download freeze mitigation
2026-02-27 20:35:10 +01:00
Sucukdeluxe
8b5c936177 Release v1.4.11 with stability hardening and full-function regression pass 2026-02-27 20:25:55 +01:00
Sucukdeluxe
6e72c63268 Release v1.4.10 with freeze mitigation and extraction throughput fixes
2026-02-27 20:13:33 +01:00
Sucukdeluxe
306826ecb9 Add per-candidate retries (3x) for update downloads
Each download URL is now retried up to 3 times with increasing delay
(1.5s, 3s) before falling back to the next candidate URL. Recoverable
errors (404, 403, 429, 5xx, timeout, network) trigger retries.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 19:51:48 +01:00
Sucukdeluxe
e1286e02af Release v1.4.9 with extraction resume fix and faster update downloads
- Fix extraction status display after restart (shows "Entpacken ausstehend" instead of stale status)
- Fix Start button to trigger pending extractions for already-downloaded packages
- Fix extraction resume when archives already cleaned (recognizes completed state from resume file)
- Reduce update download connection timeout from 8min to 30s per candidate for faster fallback
- Add logging for update download candidates and failures
- Show manual download URL on update failure
- Sequential extraction preserved (one package at a time via queue)
- Extraction properly cancelled on shutdown, resumes on restart

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 19:47:53 +01:00
Sucukdeluxe
333a912d67 Release v1.4.8 with updater fallback recovery and extraction hardening
2026-02-27 19:28:58 +01:00
Sucukdeluxe
3b9c4a4e88 Release v1.4.7 with ENOENT extraction recovery and lag optimizations
2026-02-27 19:12:40 +01:00
Sucukdeluxe
dbf1c34282 Release v1.4.6 with extraction resume safety and smoother runtime
2026-02-27 18:59:04 +01:00
Sucukdeluxe
05a75d0ac5 Release v1.4.5 with startup auto-recovery and lag hardening
2026-02-27 18:37:32 +01:00
Sucukdeluxe
6a33e61c38 Release v1.4.4 with visible retries and HTTP 416 progress reset
2026-02-27 18:24:44 +01:00
Sucukdeluxe
53212f45e3 Release v1.4.3 with unified controls and resilient retries
2026-02-27 18:11:50 +01:00
Sucukdeluxe
01ed725136 Add start conflict prompts for existing extracted packages in v1.4.2 2026-02-27 17:54:56 +01:00
Sucukdeluxe
1c92591bf1 Release v1.4.1 2026-02-27 17:34:45 +01:00
Sucukdeluxe
f5e020da4e Bughunt: harden async UI flows and optimize large-queue rendering 2026-02-27 17:31:05 +01:00
Sucukdeluxe
c83fa3b86a Bughunt: smooth UI actions and skip redundant settings updates 2026-02-27 17:15:20 +01:00
Sucukdeluxe
6b65da7f66 Improve UI responsiveness for control actions and drag overlay 2026-02-27 16:36:52 +01:00
Sucukdeluxe
2ae3cb5fa5 Remove legacy Python artifacts from repository 2026-02-27 16:32:01 +01:00
Sucukdeluxe
4fc0ce26f3 Ship UI productivity upgrades and extraction progress flow in v1.4.0 2026-02-27 16:23:19 +01:00
Sucukdeluxe
7b5218ad98 Remove empty download package dirs after archive cleanup in v1.3.11 2026-02-27 15:55:43 +01:00
Sucukdeluxe
e2a8673c94 Harden extraction verification, cleanup safety, and logging in v1.3.10 2026-02-27 15:43:52 +01:00
Sucukdeluxe
ef821b69a5 Fix shutdown resume state and legacy extracted cleanup backfill v1.3.9 2026-02-27 15:29:49 +01:00
Sucukdeluxe
da51e03cef Backfill extracted archive cleanup on startup in v1.3.8 2026-02-27 15:15:16 +01:00
Sucukdeluxe
75fc582299 Fix split-archive cleanup after extraction and release v1.3.7 2026-02-27 15:07:12 +01:00
Sucukdeluxe
0a99d3c584 Fix stuck queue scheduling and auto-recover stalled streams v1.3.6 2026-02-27 14:55:31 +01:00
Sucukdeluxe
0de5a59a64 Stream filename scan updates and add provider fallback in v1.3.5 2026-02-27 14:45:42 +01:00
Sucukdeluxe
973885a147 Rescan queued hash names on startup and release v1.3.4 2026-02-27 14:32:07 +01:00
Sucukdeluxe
6fe7b7e7ee Fix rg.to filename scanning and release v1.3.3 2026-02-27 14:28:29 +01:00
Sucukdeluxe
447dd7feff Implement full UX upgrade and Rapidgator filename hardening in v1.3.2 2026-02-27 14:20:54 +01:00
Sucukdeluxe
7381e54f4f Fix update loop: read APP_VERSION from package.json v1.3.1
APP_VERSION was hardcoded as "1.1.29" in constants.ts causing the app
to always report the old version and re-trigger the update prompt.
Now reads version dynamically from package.json via import.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 13:43:33 +01:00
Sucukdeluxe
73c8b6e670 Replace 7zip-bin with WinRAR for archive extraction v1.3.0
- Remove 7zip-bin dependency and asarUnpack config
- Use WinRAR/UnRAR.exe from standard install paths with auto-detection
- Add resolver with probing to cache the found extractor command
- Add -y flag for auto-confirm and WinRAR.exe command support
- Update tests for WinRAR argument format

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 13:34:34 +01:00
Sucukdeluxe
d867c55e37 Fix extractor ENOENT stalls and add built-in archive passwords
2026-02-27 12:53:56 +01:00
Sucukdeluxe
741a0d67cc Add archive password fallback and release v1.1.28
2026-02-27 12:38:42 +01:00
Sucukdeluxe
88eb6dff5d Fix extraction recovery edge cases and release v1.1.27
2026-02-27 12:23:54 +01:00
Sucukdeluxe
3525ecb569 Recover stalled extraction and add optional fallback providers
2026-02-27 12:16:30 +01:00
Sucukdeluxe
0f61b0be08 Reduce cancel lag with non-blocking cleanup in v1.1.25
2026-02-27 11:53:14 +01:00
Sucukdeluxe
4548d809f9 Polish settings UI and harden fetch-failed recovery
2026-02-27 11:32:06 +01:00
Sucukdeluxe
6d777e2a56 Harden resume flows and ship v1.1.23 stability fixes
2026-02-27 11:04:52 +01:00
Sucukdeluxe
583d74fcc9 Harden Mega web flow and smooth download runtime
2026-02-27 06:17:15 +01:00
Sucukdeluxe
40bfda2ad7 Switch MegaDebrid to web-only flow and reduce UI lag
2026-02-27 06:01:28 +01:00
Sucukdeluxe
b1b8ed4180 Switch Mega web fallback to real debrideur form flow and bump to 1.1.20
2026-02-27 05:47:19 +01:00
Sucukdeluxe
0e898733d6 Restore in-app updater and add Mega web fallback path
2026-02-27 05:28:50 +01:00
Sucukdeluxe
704826b421 Add Mega-Debrid unrestrict workarounds and bump to 1.1.18
2026-02-27 05:06:05 +01:00
Sucukdeluxe
7ac61ce64a Fix provider selection persistence, queue naming, cancel removal, and update prompts
2026-02-27 04:56:53 +01:00
Sucukdeluxe
3ef2ee732a Move provider settings to tab and improve DLC filename resolution
2026-02-27 04:40:21 +01:00
Sucukdeluxe
02370a40b4 Restore update checks and startup notifications
2026-02-27 04:22:00 +01:00
Sucukdeluxe
c7813c26a8 Normalize debrid filenames and bump to 1.1.14
2026-02-27 04:16:14 +01:00
Sucukdeluxe
7fe7192bdb Fix packaged renderer asset paths and bump to 1.1.13
2026-02-27 04:09:08 +01:00
Sucukdeluxe
cbc1ffa18b Add multi-provider fallback with AllDebrid and fix packaged UI path
2026-02-27 04:02:31 +01:00
Sucukdeluxe
f27584d6ee Disable auto-publish in builder and bump to 1.1.11
2026-02-27 03:32:02 +01:00
Sucukdeluxe
1049eb3c07 Fix Electron release metadata and bump version to 1.1.10
2026-02-27 03:28:52 +01:00
Sucukdeluxe
b96ed1eb7a Migrate app to Node Electron with modern React UI
2026-02-27 03:25:56 +01:00
Sucukdeluxe
56e4355d6b Polish UI with modern card layout and refined dark theme
2026-02-27 02:55:51 +01:00
86 changed files with 44608 additions and 4460 deletions


@@ -1,65 +0,0 @@
name: Build and Release
permissions:
contents: write
on:
push:
tags:
- "v*"
jobs:
build:
runs-on: windows-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: "3.11"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install pyinstaller pillow
- name: Prepare release metadata
shell: pwsh
run: |
$version = "${{ github.ref_name }}".TrimStart('v')
python scripts/set_version.py $version
python scripts/prepare_icon.py
- name: Build exe
run: |
pyinstaller --noconfirm --windowed --onedir --name "Real-Debrid-Downloader" --icon "assets/app_icon.ico" real_debrid_downloader_gui.py
- name: Pack release zip
shell: pwsh
run: |
New-Item -ItemType Directory -Path release -Force | Out-Null
Compress-Archive -Path "dist/Real-Debrid-Downloader/*" -DestinationPath "Real-Debrid-Downloader-win64.zip" -Force
- name: Install Inno Setup
shell: pwsh
run: |
choco install innosetup --no-progress -y
- name: Build installer
shell: pwsh
run: |
$version = "${{ github.ref_name }}".TrimStart('v')
& "C:\Program Files (x86)\Inno Setup 6\ISCC.exe" "/DMyAppVersion=$version" "/DMySourceDir=..\\dist\\Real-Debrid-Downloader" "/DMyOutputDir=..\\release" "/DMyIconFile=..\\assets\\app_icon.ico" "installer\\RealDebridDownloader.iss"
- name: Publish GitHub Release
uses: softprops/action-gh-release@v2
with:
files: |
Real-Debrid-Downloader-win64.zip
release/Real-Debrid-Downloader-Setup-*.exe
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

21
.gitignore vendored

@@ -16,3 +16,24 @@ rd_downloader.log
rd_download_manifest.json
_update_staging/
apply_update.cmd
.claude/
.github/
docs/plans/
CHANGELOG.md
node_modules/
.vite/
coverage/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Forgejo deployment runtime files
deploy/forgejo/.env
deploy/forgejo/forgejo/
deploy/forgejo/postgres/
deploy/forgejo/caddy/data/
deploy/forgejo/caddy/config/
deploy/forgejo/caddy/logs/
deploy/forgejo/backups/

23
CLAUDE.md Normal file

@@ -0,0 +1,23 @@
## Release + Update Source (Important)
- The primary platform is `https://git.24-music.de`
- Default repo: `Administrator/real-debrid-downloader`
- Do not release primarily via Codeberg/GitHub anymore
## Releasing
1. Set the token:
- PowerShell: `$env:GITEA_TOKEN="<token>"`
2. Run the release:
- `npm run release:gitea -- <version> [notes]`
The script:
- bumps `package.json`
- builds the Windows artifacts
- pushes `main` + the tag
- creates the release on `git.24-music.de`
- uploads the assets
## Auto-Update
- The updater currently uses `git.24-music.de` as its default source

240
README.md

@@ -1,116 +1,174 @@
# Real-Debrid Downloader GUI
# Multi Debrid Downloader
Small desktop app with a GUI (Tkinter) for pasting multiple links (e.g. 20+),
unrestricting them via Real-Debrid, and downloading them straight to your PC.
Desktop downloader with fast queue management, automatic extraction, and robust error handling.
## Features
![Platform](https://img.shields.io/badge/platform-Windows%2010%2F11-0078D6)
![Electron](https://img.shields.io/badge/Electron-31.x-47848F)
![React](https://img.shields.io/badge/React-18.x-149ECA)
![TypeScript](https://img.shields.io/badge/TypeScript-5.x-3178C6)
![License](https://img.shields.io/badge/license-MIT-green)
- Multiple links at once (one link per line)
- DLC import (`.dlc`) via dcrypt.it, including package grouping
- DLC drag-and-drop: drop `.dlc` files directly onto the links area
- Uses the Real-Debrid API (`/unrestrict/link`)
- Download status per link
- Package view: packages are expandable, with all individual links underneath
- Running packages can be cancelled/removed directly via right-click
- Download speed per link and in total
- Overall progress
- Selectable download folder and package name
- Configurable parallel downloads (e.g. 20 at once)
- The parallel value can be adjusted live while downloads are running
- Retry counter per link in the table
- Automatic extraction after download
- Hybrid extraction: extracts as soon as an archive set is complete
- Optional auto-cleanup: delete archive parts after successful extraction
- Speed limit (global or per download), changeable live
- Save/load link lists as `.txt`
- Import DLC files as package lists (`DLC import`)
- `Extract to` + optional `Create subfolder (package name)`, as in JDownloader
- `Settings` (JDownloader style):
  - After successful extraction: none / recycle bin / delete permanently
  - On conflicts: overwrite / skip / rename
- ZIP password check against `serienfans.org` and `serienjunkies.net`
- Multi-part RAR is extracted via `part1` (only once all parts are present)
- Auto-update check via GitHub Releases (for the .exe)
- Optional local storage of the API token
## Why this tool?
## Requirements
- Familiar download-manager workflow: collect links, start, pause, resume, and finish cleanly.
- Multiple debrid providers in one app, including automatic fallback.
- Built for stability with large queues: session persistence, reconnect handling, resume support, and integrity verification.
- Python 3.10+
- Optional, but recommended: 7-Zip in PATH for RAR/7Z extraction
- Alternative for RAR: WinRAR `UnRAR.exe` (detected automatically)
## Core features
### Queue and download engine
- Package-based queue with file status, progress, ETA, speed, and retry counters.
- Start, pause, stop, and cancel for both single items and full packages.
- Multi-select via Ctrl+Click for batch operations on packages and items.
- Duplicate handling when adding links: keep, skip, or overwrite.
- Session recovery after restart, including optional auto-resume.
- Circuit breaker with escalating backoff cooldowns to handle provider outages gracefully.
### Debrid and link handling
- Supported providers: `realdebrid`, `megadebrid`, `bestdebrid`, `alldebrid`.
- Configurable provider order: primary + secondary + tertiary.
- Optional automatic fallback to alternative providers on failures.
- `.dlc` import via file picker and drag-and-drop.
### Extraction, cleanup, and quality
- JVM-based extraction backend using SevenZipJBinding + Zip4j (supports RAR, 7z, ZIP, and more).
- Automatic fallback to legacy UnRAR/7z CLI tools when JVM is unavailable.
- Auto-extract with separate target directory and conflict strategies.
- Hybrid extraction: simultaneous downloading and extracting with smart I/O priority throttling.
- Nested extraction: archives within archives are automatically extracted (one level deep).
- Pre-extraction disk space validation to prevent incomplete extracts.
- Right-click "Extract now" on any package with at least one completed item.
- Post-download integrity checks (`CRC32`, `MD5`, `SHA1`) with auto-retry on failures.
- Completed-item cleanup policy: `never`, `immediate`, `on_start`, `package_done`.
- Optional removal of link artifacts and sample files after extraction.
### Auto-rename
- Automatic renaming of extracted files based on series/episode patterns.
- Multi-episode token parsing for batch renames.
### UI and progress
- Visual progress bars with percentage overlay for packages and individual items.
- Real-time bandwidth chart showing current download speeds.
- Persistent download counters: all-time totals and per-session statistics.
- Download history for completed packages.
- Vertical sidebar with organized settings tabs.
- Hoster display showing both the original source and the debrid provider used.
### Convenience and automation
- Clipboard watcher for automatic link detection.
- Minimize-to-tray with tray menu controls.
- Speed limits globally or per download.
- Bandwidth schedules for time-based speed profiles.
- Built-in auto-updater via `git.24-music.de` Releases.
- Long path support (>260 characters) on Windows.
## Installation
```bash
python -m venv .venv
.venv\Scripts\activate
pip install -r requirements.txt
```
### Option A: prebuilt releases (recommended)
## Start
1. Download a release from the `git.24-music.de` Releases page.
2. Run the installer or portable build.
3. Add your debrid tokens in Settings.
Releases: `https://git.24-music.de/Administrator/real-debrid-downloader/releases`
### Option B: build from source
Requirements:
- Node.js `20+` (recommended `22+`)
- npm
- Windows `10/11` (for packaging and regular desktop use)
- Java Runtime `8+` (for SevenZipJBinding sidecar backend)
- Optional fallback: 7-Zip/UnRAR if you force legacy extraction mode
```bash
python real_debrid_downloader_gui.py
npm install
npm run dev
```
## Usage
## NPM scripts
1. Enter your Real-Debrid API token (`https://real-debrid.com/apitoken`)
2. Choose a download folder
3. Optionally set a package name (otherwise one is generated automatically)
4. Optionally choose an extraction folder (`Extract to`)
5. Optionally keep `Create subfolder (package name)` enabled
6. Optionally enable `Hybrid extraction` and `Cleanup`
7. Set the parallel value (e.g. 20)
8. Optionally set a speed limit (KB/s, mode `global` or `per_download`)
9. Paste links or import them via `Load links` / `DLC import`
10. Click `Start download`
| Command | Description |
| --- | --- |
| `npm run dev` | Starts main process, renderer, and Electron in dev mode |
| `npm run build` | Builds main and renderer bundles |
| `npm run start` | Starts the app locally in production mode |
| `npm test` | Runs Vitest unit tests |
| `npm run self-check` | Runs integrated end-to-end self-checks |
| `npm run release:win` | Creates Windows installer and portable build |
| `npm run release:gitea -- <version> [notes]` | One-command version bump + build + tag + release upload to `git.24-music.de` |
| `npm run release:codeberg -- <version> [notes]` | Legacy path for old Codeberg workflow |
If you paste 20 links, they are treated as one package. Downloads end up in a package folder. During extraction, the same package name can automatically be used as a subfolder.
For DLC imports with many packages, the app automatically inserts package markers (`# package: ...`) and processes the packages as a queue.
## Auto-Update (GitHub)
1. The default repo is already set: `Sucukdeluxe/real-debrid-downloader`
2. Optionally override it in the app via `GitHub Repo (owner/name)`
3. Click `Check for updates` or enable `Check for updates on startup`
4. In the .exe build, a new release is downloaded and installed on restart
Note: the Python script only shows a release notice; there is no self-replace.
## Release Build (.exe)
### One-command git.24-music release
```bash
./build_exe.ps1 -Version 1.1.0
npm run release:gitea -- 1.6.31 "- Maintenance update"
```
The app is then located under `dist/Real-Debrid-Downloader/`.
This command will:
## GitHub Release Workflow
1. Bump `package.json` version.
2. Build setup/portable artifacts (`npm run release:win`).
3. Commit and push `main` to your `git.24-music.de` remote.
4. Create and push tag `v<version>`.
5. Create/update the Gitea release and upload required assets.
- Workflow file: `.github/workflows/release.yml`
- On a tag push such as `v1.0.1`, a Windows EXE is built automatically
- Release asset for auto-update: `Real-Debrid-Downloader-win64.zip`
- An installer is built as well: `Real-Debrid-Downloader-Setup-<version>.exe`
- The installer automatically creates a desktop shortcut
## Auto-Installer
- The GitHub release directly contains the setup file (`...Setup-<version>.exe`)
- The setup installs the app under `Program Files/Real-Debrid Downloader`
- The setup automatically creates a desktop shortcut with the app icon
## App icon
- The project uses `assets/app_icon.png` (based on your current Downloads icon)
- `assets/app_icon.ico` is generated automatically during the build
Example:
Required once before release:
```bash
git tag v1.0.1
git push origin v1.0.1
git remote add gitea https://git.24-music.de/<user>/<repo>.git
```
Note: the app can only download links that Real-Debrid supports.
PowerShell token setup:
```powershell
$env:GITEA_TOKEN="<your-token>"
```
## Typical workflow
1. Add provider tokens in Settings.
2. Paste/import links or `.dlc` containers.
3. Optionally set package names, target folders, extraction, and cleanup rules.
4. Start the queue and monitor progress in the Downloads tab.
5. Review integrity results and summary after completion.
## Project structure
- `src/main` - Electron main process, queue/download/provider logic
- `src/preload` - secure IPC bridge between main and renderer
- `src/renderer` - React UI
- `src/shared` - shared types and IPC contracts
- `tests` - unit tests and self-check tests
- `resources/extractor-jvm` - SevenZipJBinding + Zip4j sidecar JAR and native libraries
## Data and logs
The app stores runtime files in Electron's `userData` directory, including:
- `rd_downloader_config.json`
- `rd_session_state.json`
- `rd_downloader.log`
## Troubleshooting
- Download does not start: verify token and selected provider in Settings.
- Extraction fails: check archive passwords, JVM runtime (`resources/extractor-jvm`), or force legacy mode with `RD_EXTRACT_BACKEND=legacy`.
- Very slow downloads: check active speed limit and bandwidth schedules.
- Unexpected interruptions: enable reconnect and fallback providers.
- Stalled downloads: the app auto-detects stalls within 10 seconds and retries automatically.
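For the legacy extraction fallback mentioned above, set the environment variable before launching the app (the variable name comes from the troubleshooting note; the launch command is illustrative):

```shell
# Force the legacy UnRAR/7z CLI path instead of the JVM backend.
export RD_EXTRACT_BACKEND=legacy
# On Windows PowerShell: $env:RD_EXTRACT_BACKEND="legacy"
# Then launch the app, e.g.:
# npm run start
```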
## Changelog
Release history is available on [git.24-music.de Releases](https://git.24-music.de/Administrator/real-debrid-downloader/releases).
## License
MIT - see `LICENSE`.

Binary file not shown.

Before: 121 KiB | After: 279 KiB


@@ -1,16 +0,0 @@
param(
[string]$Version = ""
)
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install pyinstaller pillow
if ($Version -ne "") {
python scripts/set_version.py $Version
}
python scripts/prepare_icon.py
pyinstaller --noconfirm --windowed --onedir --name "Real-Debrid-Downloader" --icon "assets/app_icon.ico" real_debrid_downloader_gui.py
Write-Host "Build fertig: dist/Real-Debrid-Downloader/Real-Debrid-Downloader.exe"


@@ -25,11 +25,11 @@ AppPublisher=Sucukdeluxe
DefaultDirName={autopf}\{#MyAppName}
DefaultGroupName={#MyAppName}
OutputDir={#MyOutputDir}
OutputBaseFilename=Real-Debrid-Downloader-Setup-{#MyAppVersion}
OutputBaseFilename=Real-Debrid-Downloader Setup {#MyAppVersion}
Compression=lzma
SolidCompression=yes
WizardStyle=modern
PrivilegesRequired=admin
PrivilegesRequired=lowest
ArchitecturesInstallIn64BitMode=x64compatible
UninstallDisplayIcon={app}\{#MyAppExeName}
SetupIconFile={#MyIconFile}
@@ -39,8 +39,8 @@ Name: "german"; MessagesFile: "compiler:Languages\German.isl"
Name: "english"; MessagesFile: "compiler:Default.isl"
[Files]
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"; Flags: ignoreversion
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"
[Icons]
Name: "{group}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; IconFilename: "{app}\app_icon.ico"

9765
package-lock.json generated Normal file

File diff suppressed because it is too large.

80
package.json Normal file

@@ -0,0 +1,80 @@
{
"name": "real-debrid-downloader",
"version": "1.6.55",
"description": "Desktop downloader",
"main": "build/main/main/main.js",
"author": "Sucukdeluxe",
"license": "MIT",
"scripts": {
"dev": "concurrently -k \"npm:dev:main:watch\" \"npm:dev:renderer\" \"npm:dev:electron\"",
"dev:renderer": "vite",
"dev:main:watch": "tsup src/main/main.ts src/preload/preload.ts --out-dir build/main --format cjs --target node20 --external electron --sourcemap --watch",
"dev:electron": "wait-on tcp:5173 file:build/main/main/main.js && cross-env NODE_ENV=development electron .",
"build": "npm run build:main && npm run build:renderer",
"build:main": "tsup src/main/main.ts src/preload/preload.ts --out-dir build/main --format cjs --target node20 --external electron --sourcemap",
"build:renderer": "vite build",
"start": "cross-env NODE_ENV=production electron .",
"test": "vitest run",
"self-check": "tsx tests/self-check.ts",
"release:win": "npm run build && electron-builder --publish never --win nsis portable",
"release:gitea": "node scripts/release_gitea.mjs",
"release:forgejo": "node scripts/release_gitea.mjs"
},
"dependencies": {
"adm-zip": "^0.5.16",
"react": "^18.3.1",
"react-dom": "^18.3.1",
"uuid": "^11.1.0"
},
"devDependencies": {
"@types/adm-zip": "^0.5.7",
"@types/node": "^24.0.13",
"@types/react": "^18.3.12",
"@types/react-dom": "^18.3.1",
"@types/uuid": "^10.0.0",
"@vitejs/plugin-react": "^4.3.4",
"concurrently": "^9.0.1",
"cross-env": "^7.0.3",
"electron": "^31.7.7",
"electron-builder": "^25.1.8",
"rcedit": "^5.0.2",
"tsup": "^8.3.6",
"tsx": "^4.19.2",
"typescript": "^5.7.3",
"vite": "^6.0.5",
"vitest": "^2.1.8",
"wait-on": "^8.0.1"
},
"build": {
"appId": "com.sucukdeluxe.realdebrid",
"productName": "Real-Debrid-Downloader",
"directories": {
"buildResources": "assets",
"output": "release"
},
"files": [
"build/main/**/*",
"build/renderer/**/*",
"resources/extractor-jvm/**/*",
"package.json"
],
"asarUnpack": [
"resources/extractor-jvm/**/*"
],
"win": {
"target": [
"nsis",
"portable"
],
"icon": "assets/app_icon.ico",
"signAndEditExecutable": false
},
"nsis": {
"oneClick": false,
"perMachine": false,
"allowToChangeInstallationDirectory": true,
"createDesktopShortcut": true
},
"afterPack": "scripts/afterPack.cjs"
}
}

File diff suppressed because it is too large.


@@ -1,5 +0,0 @@
requests>=2.31.0
pyzipper>=0.3.6
send2trash>=1.8.2
keyring>=25.6.0
tkinterdnd2>=0.4.2


@@ -0,0 +1,22 @@
# JVM extractor runtime
This directory contains the Java sidecar runtime used by `src/main/extractor.ts`.
## Included backends
- `sevenzipjbinding` for the primary extraction path (RAR/7z/ZIP and others)
- `zip4j` for ZIP multipart handling (JD-style split ZIP behavior)
## Layout
- `classes/` compiled `JBindExtractorMain` classes
- `lib/` runtime jars required by the sidecar
- `src/` Java source for the sidecar
## Rebuild notes
The checked-in classes are Java 8 compatible and built from:
`resources/extractor-jvm/src/com/sucukdeluxe/extractor/JBindExtractorMain.java`
If you need to rebuild, compile against the jars in `lib/` with a Java 8-compatible compiler.
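A rebuild invocation might look like the following (a sketch assuming the layout described above; on Windows use `;` as the classpath separator):

```shell
# Recompile the sidecar classes with a Java 8 target against the bundled jars.
javac -source 8 -target 8 \
  -cp "resources/extractor-jvm/lib/*" \
  -d resources/extractor-jvm/classes \
  resources/extractor-jvm/src/com/sucukdeluxe/extractor/JBindExtractorMain.java
```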


@@ -0,0 +1,12 @@
Bundled JVM extractor dependencies:
1) sevenzipjbinding (16.02-2.01)
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding-all-platforms
- Upstream: https://sevenzipjbind.sourceforge.net/
2) zip4j (2.11.5)
- Maven artifact: net.lingala.zip4j:zip4j
- Upstream: https://github.com/srikanth-lingala/zip4j
Please review upstream licenses and notices before redistribution.

Binary file not shown.

Binary file not shown.

File diff suppressed because it is too large.

18
scripts/afterPack.cjs Normal file

@@ -0,0 +1,18 @@
const path = require("path");
const { rcedit } = require("rcedit");
module.exports = async function afterPack(context) {
const productFilename = context.packager?.appInfo?.productFilename;
if (!productFilename) {
console.warn(" • rcedit: skipped — productFilename not available");
return;
}
const exePath = path.join(context.appOutDir, `${productFilename}.exe`);
const iconPath = path.resolve(__dirname, "..", "assets", "app_icon.ico");
console.log(` • rcedit: patching icon → ${exePath}`);
try {
await rcedit(exePath, { icon: iconPath });
} catch (error) {
console.warn(` • rcedit: failed — ${String(error)}`);
}
};


@@ -0,0 +1,51 @@
import { DebridService } from "../src/main/debrid";
import { defaultSettings } from "../src/main/constants";
import { MegaWebFallback } from "../src/main/mega-web-fallback";
const links = [
"https://rapidgator.net/file/837ef967aede4935e3e0374c4e663b40/GTHDERTPIIP7P401.part1.rar.html",
"https://rapidgator.net/file/ef3c9d64c899f801d69d6888dad89dcd/GTHDERTPIIP7P401.part2.rar.html",
"https://rapidgator.net/file/b38130fcf1e8448953250b9a1ed7958d/GTHDERTPIIP7P401.part3.rar.html"
];
const settings = {
...defaultSettings(),
token: process.env.RD_TOKEN || "",
megaLogin: process.env.MEGA_LOGIN || "",
megaPassword: process.env.MEGA_PASSWORD || "",
bestToken: process.env.BEST_TOKEN || "",
allDebridToken: process.env.ALLDEBRID_TOKEN || "",
providerPrimary: "alldebrid" as const,
providerSecondary: "realdebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
if (!settings.token && !(settings.megaLogin && settings.megaPassword) && !settings.bestToken && !settings.allDebridToken) {
console.error("No provider credentials set. Use RD_TOKEN or MEGA_LOGIN+MEGA_PASSWORD or BEST_TOKEN or ALLDEBRID_TOKEN.");
process.exit(1);
}
async function main(): Promise<void> {
const megaWeb = new MegaWebFallback(() => ({
login: settings.megaLogin,
password: settings.megaPassword
}));
try {
const service = new DebridService(settings, {
megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
});
for (const link of links) {
try {
const result = await service.unrestrictLink(link);
console.log(`[OK] ${result.providerLabel} -> ${result.fileName}`);
} catch (error) {
console.log(`[FAIL] ${String(error)}`);
}
}
} finally {
megaWeb.dispose();
}
}
main().catch(e => { console.error(e); process.exit(1); });


@@ -0,0 +1,148 @@
const LOGIN = process.env.MEGA_LOGIN || "";
const PASSWORD = process.env.MEGA_PASSWORD || "";
const LINKS = [
"https://rapidgator.net/file/90b5397dfc3e1a0e561db7d6b89d5604/scnb-rrw7-S08E01.part1.rar.html",
"https://rapidgator.net/file/8ddf856dc833310c5cae9db82caf9682/scnb-rrw7-S08E01.part2.rar.html",
"https://rapidgator.net/file/440eed67d266476866332ae224c3fad5/scnb-rrw7-S08E01.part3.rar.html"
];
if (!LOGIN || !PASSWORD) {
throw new Error("Set MEGA_LOGIN and MEGA_PASSWORD env vars");
}
function sleep(ms) {
return new Promise((resolve) => setTimeout(resolve, ms));
}
function cookieFrom(headers) {
const cookies = headers.getSetCookie();
return cookies.map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
}
function parseDebridCodes(html) {
const re = /processDebrid\((\d+),'([^']+)',0\)/g;
const out = [];
let m;
while ((m = re.exec(html)) !== null) {
out.push({ id: Number(m[1]), code: m[2] });
}
return out;
}
async function resolveCode(cookie, code) {
for (let attempt = 1; attempt <= 50; attempt += 1) {
const res = await fetch("https://www.mega-debrid.eu/index.php?ajax=debrid&json", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: cookie,
Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
},
body: new URLSearchParams({
code,
autodl: "0"
})
});
const text = (await res.text()).trim();
if (text === "reload") {
if (attempt % 5 === 0) {
console.log(` [retry] code=${code} attempt=${attempt}/50 (waiting for server)`);
}
await sleep(800);
continue;
}
if (text === "false") {
return { ok: false, reason: "false" };
}
try {
const parsed = JSON.parse(text);
if (parsed?.link) {
return { ok: true, link: String(parsed.link), text: String(parsed.text || "") };
}
return { ok: false, reason: text };
} catch {
return { ok: false, reason: text };
}
}
return { ok: false, reason: "timeout" };
}
async function probeDownload(url) {
const res = await fetch(url, {
method: "GET",
headers: {
Range: "bytes=0-4095",
"User-Agent": "Mozilla/5.0"
},
redirect: "manual"
});
return {
status: res.status,
location: res.headers.get("location") || "",
contentType: res.headers.get("content-type") || "",
contentLength: res.headers.get("content-length") || ""
};
}
async function main() {
const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0"
},
body: new URLSearchParams({
login: LOGIN,
password: PASSWORD,
remember: "on"
}),
redirect: "manual"
});
if (loginRes.status >= 400) {
throw new Error(`Login failed with HTTP ${loginRes.status}`);
}
const cookie = cookieFrom(loginRes.headers);
if (!cookie) {
throw new Error("Login returned no session cookie");
}
console.log("login", loginRes.status, loginRes.headers.get("location") || "");
const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: cookie,
Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
},
body: new URLSearchParams({
links: LINKS.join("\n"),
password: "",
showLinks: "1"
})
});
const html = await debridRes.text();
const codes = parseDebridCodes(html);
console.log("codes", codes.length);
if (codes.length === 0) {
throw new Error("No processDebrid codes found");
}
for (let i = 0; i < Math.min(3, codes.length); i += 1) {
const c = codes[i];
const resolved = await resolveCode(cookie, c.code);
if (!resolved.ok) {
console.log(`[FAIL] code ${c.code}: ${resolved.reason}`);
continue;
}
console.log(`[OK] code ${c.code} -> ${resolved.link}`);
const probe = await probeDownload(resolved.link);
console.log(` probe status=${probe.status} type=${probe.contentType} len=${probe.contentLength} loc=${probe.location}`);
}
}
await main().catch((e) => { console.error(e); process.exit(1); });


@@ -1,29 +0,0 @@
from pathlib import Path
def main() -> int:
project_root = Path(__file__).resolve().parents[1]
png_path = project_root / "assets" / "app_icon.png"
ico_path = project_root / "assets" / "app_icon.ico"
if not png_path.exists():
print(f"Icon PNG not found: {png_path}")
return 1
try:
from PIL import Image
except ImportError:
print("Pillow missing. Install with: pip install pillow")
return 1
with Image.open(png_path) as image:
image = image.convert("RGBA")
sizes = [(16, 16), (24, 24), (32, 32), (48, 48), (64, 64), (128, 128), (256, 256)]
image.save(ico_path, format="ICO", sizes=sizes)
print(f"Wrote icon: {ico_path}")
return 0
if __name__ == "__main__":
raise SystemExit(main())


@@ -0,0 +1,297 @@
const RAPIDGATOR_LINKS = [
"https://rapidgator.net/file/837ef967aede4935e3e0374c4e663b40/GTHDERTPIIP7P401.part1.rar.html",
"https://rapidgator.net/file/ef3c9d64c899f801d69d6888dad89dcd/GTHDERTPIIP7P401.part2.rar.html",
"https://rapidgator.net/file/b38130fcf1e8448953250b9a1ed7958d/GTHDERTPIIP7P401.part3.rar.html"
];
const rdToken = process.env.RD_TOKEN || "";
const megaLogin = process.env.MEGA_LOGIN || "";
const megaPassword = process.env.MEGA_PASSWORD || "";
const bestToken = process.env.BEST_TOKEN || "";
const allDebridToken = process.env.ALLDEBRID_TOKEN || "";
let megaCookie = "";
if (!rdToken && !(megaLogin && megaPassword) && !bestToken && !allDebridToken) {
console.error("No provider credentials configured. Set RD_TOKEN and/or MEGA_LOGIN+MEGA_PASSWORD and/or BEST_TOKEN and/or ALLDEBRID_TOKEN.");
process.exit(1);
}
function asRecord(value) {
if (!value || typeof value !== "object" || Array.isArray(value)) {
return null;
}
return value;
}
function pickString(payload, keys) {
if (!payload) {
return "";
}
for (const key of keys) {
const value = payload[key];
if (typeof value === "string" && value.trim()) {
return value.trim();
}
}
return "";
}
function parseResponseError(status, bodyText, payload) {
return pickString(payload, ["response_text", "error", "message", "error_description"]) || bodyText || `HTTP ${status}`;
}
async function callRealDebrid(link) {
const response = await fetch("https://api.real-debrid.com/rest/1.0/unrestrict/link", {
method: "POST",
headers: {
Authorization: `Bearer ${rdToken}`,
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "RD-Node-Downloader/1.1.12"
},
body: new URLSearchParams({ link })
});
const text = await response.text();
const payload = asRecord(safeJson(text));
if (!response.ok) {
return { ok: false, error: parseResponseError(response.status, text, payload) };
}
const direct = pickString(payload, ["download", "link"]);
if (!direct) {
return { ok: false, error: "Real-Debrid returned no download URL" };
}
return {
ok: true,
direct,
fileName: pickString(payload, ["filename", "fileName"])
};
}
// megaCookie is intentionally cached at module scope so that multiple
// callMegaDebrid() invocations reuse the same session cookie.
async function callMegaDebrid(link) {
if (!megaCookie) {
const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0"
},
body: new URLSearchParams({ login: megaLogin, password: megaPassword, remember: "on" }),
redirect: "manual"
});
if (loginRes.status >= 400) {
return { ok: false, error: `Mega-Web login failed with HTTP ${loginRes.status}` };
}
megaCookie = loginRes.headers.getSetCookie()
.map((chunk) => chunk.split(";")[0].trim())
.filter(Boolean)
.join("; ");
if (!megaCookie) {
return { ok: false, error: "Mega-Web login returned no session cookie" };
}
}
const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: megaCookie,
Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
},
body: new URLSearchParams({ links: link, password: "", showLinks: "1" })
});
const html = await debridRes.text();
const code = html.match(/processDebrid\(\d+,'([^']+)',0\)/i)?.[1] || "";
if (!code) {
return { ok: false, error: "Mega-Web returned no processDebrid code" };
}
for (let attempt = 1; attempt <= 40; attempt += 1) {
const ajaxRes = await fetch("https://www.mega-debrid.eu/index.php?ajax=debrid&json", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: megaCookie,
Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
},
body: new URLSearchParams({ code, autodl: "0" })
});
const txt = (await ajaxRes.text()).trim();
if (txt === "reload") {
await new Promise((resolve) => setTimeout(resolve, 650));
continue;
}
if (txt === "false") {
return { ok: false, error: "Mega-Web returned false" };
}
const payload = safeJson(txt);
const direct = String(payload?.link || "");
if (!direct) {
const msg = String(payload?.text || txt || "Mega-Web no link");
if (/hoster does not respond correctly|could not be done for this moment/i.test(msg)) {
await new Promise((resolve) => setTimeout(resolve, 1200));
continue;
}
return { ok: false, error: msg };
}
return {
ok: true,
direct,
fileName: pickString(asRecord(payload), ["filename"]) || ""
};
}
return { ok: false, error: "Mega-Web timeout while generating link" };
}
async function callBestDebrid(link) {
const encoded = encodeURIComponent(link);
const requests = [
{
url: `https://bestdebrid.com/api/v1/generateLink?link=${encoded}`,
useHeader: true
},
{
url: `https://bestdebrid.com/api/v1/generateLink?auth=${encodeURIComponent(bestToken)}&link=${encoded}`,
useHeader: false
}
];
let lastError = "Unknown BestDebrid error";
for (const req of requests) {
const headers = {
"User-Agent": "RD-Node-Downloader/1.1.12"
};
if (req.useHeader) {
headers.Authorization = bestToken;
}
const response = await fetch(req.url, {
method: "GET",
headers
});
const text = await response.text();
const parsed = safeJson(text);
const payload = Array.isArray(parsed) ? asRecord(parsed[0]) : asRecord(parsed);
if (!response.ok) {
lastError = parseResponseError(response.status, text, payload);
continue;
}
const direct = pickString(payload, ["download", "debridLink", "link"]);
if (!direct) {
lastError = pickString(payload, ["response_text", "message", "error"]) || "BestDebrid returned no download URL";
continue;
}
return {
ok: true,
direct,
fileName: pickString(payload, ["filename", "fileName"])
};
}
return { ok: false, error: lastError };
}
async function callAllDebrid(link) {
const response = await fetch("https://api.alldebrid.com/v4/link/unlock", {
method: "POST",
headers: {
Authorization: `Bearer ${allDebridToken}`,
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "RD-Node-Downloader/1.1.12"
},
body: new URLSearchParams({ link })
});
const text = await response.text();
const payload = asRecord(safeJson(text));
if (!response.ok) {
return { ok: false, error: parseResponseError(response.status, text, payload) };
}
if (pickString(payload, ["status"]) === "error") {
const err = asRecord(payload?.error);
return { ok: false, error: pickString(err, ["message", "code"]) || "AllDebrid API error" };
}
const data = asRecord(payload?.data);
const direct = pickString(data, ["link"]);
if (!direct) {
return { ok: false, error: "AllDebrid returned no download URL" };
}
return {
ok: true,
direct,
fileName: pickString(data, ["filename"])
};
}
function safeJson(text) {
try {
return JSON.parse(text);
} catch {
return null;
}
}
function hostFromUrl(url) {
try {
return new URL(url).host;
} catch {
return "invalid-url";
}
}
async function main() {
const providers = [];
if (rdToken) {
providers.push({ name: "Real-Debrid", run: callRealDebrid });
}
if (megaLogin && megaPassword) {
providers.push({ name: "Mega-Debrid", run: callMegaDebrid });
}
if (bestToken) {
providers.push({ name: "BestDebrid", run: callBestDebrid });
}
if (allDebridToken) {
providers.push({ name: "AllDebrid", run: callAllDebrid });
}
let failures = 0;
for (const link of RAPIDGATOR_LINKS) {
console.log(`\nLink: ${link}`);
const results = [];
for (const provider of providers) {
try {
const result = await provider.run(link);
results.push({ provider: provider.name, ...result });
} catch (error) {
results.push({ provider: provider.name, ok: false, error: String(error) });
}
}
for (const result of results) {
if (result.ok) {
console.log(` [OK] ${result.provider} -> ${hostFromUrl(result.direct)} ${result.fileName ? `(${result.fileName})` : ""}`);
} else {
console.log(` [FAIL] ${result.provider} -> ${result.error}`);
}
}
const fallbackPick = results.find((entry) => entry.ok);
if (fallbackPick) {
console.log(` [AUTO] Selected by fallback order: ${fallbackPick.provider}`);
} else {
failures += 1;
console.log(" [AUTO] No provider could unrestrict this link");
}
}
if (failures > 0) {
process.exitCode = 2;
}
}
await main().catch((e) => { console.error(e); process.exit(1); });
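The selection logic in `main()` — providers tried in configured order, first `ok` result wins — can be exercised in isolation. A minimal sketch; `unrestrictWithFallback` and the stub providers below are illustrative names, not part of the script:

```javascript
// Minimal sketch of the fallback-order selection used in main():
// providers are tried in registration order and the first ok result wins.
async function unrestrictWithFallback(link, providers) {
  const results = [];
  for (const provider of providers) {
    try {
      results.push({ provider: provider.name, ...(await provider.run(link)) });
    } catch (error) {
      results.push({ provider: provider.name, ok: false, error: String(error) });
    }
  }
  // First successful provider in configured order is selected.
  return { results, picked: results.find((entry) => entry.ok) || null };
}

// Stub providers (invented for this sketch): the first fails, the second succeeds.
const stubProviders = [
  { name: "Real-Debrid", run: async () => ({ ok: false, error: "hoster down" }) },
  { name: "AllDebrid", run: async () => ({ ok: true, direct: "https://cdn.example.invalid/file.bin" }) }
];

const { picked } = await unrestrictWithFallback("https://example.com/file", stubProviders);
console.log(picked.provider); // → "AllDebrid"
```

Note that all providers are still queried (so every result can be logged), and only the pick follows the fallback order — matching the `[AUTO] Selected by fallback order` behavior above.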

scripts/release_gitea.mjs Normal file

@@ -0,0 +1,348 @@
import fs from "node:fs";
import path from "node:path";
import { spawnSync } from "node:child_process";
const NPM_EXECUTABLE = process.platform === "win32" ? "npm.cmd" : "npm";
function run(command, args, options = {}) {
const result = spawnSync(command, args, {
cwd: process.cwd(),
encoding: "utf8",
stdio: options.capture ? ["pipe", "pipe", "pipe"] : "inherit"
});
if (result.status !== 0) {
const stderr = result.stderr ? String(result.stderr).trim() : "";
const stdout = result.stdout ? String(result.stdout).trim() : "";
const details = [stderr, stdout].filter(Boolean).join("\n");
throw new Error(`Command failed: ${command} ${args.join(" ")}${details ? `\n${details}` : ""}`);
}
return options.capture ? String(result.stdout || "") : "";
}
function runCapture(command, args) {
const result = spawnSync(command, args, {
cwd: process.cwd(),
encoding: "utf8",
stdio: ["pipe", "pipe", "pipe"]
});
if (result.status !== 0) {
const stderr = String(result.stderr || "").trim();
throw new Error(stderr || `Command failed: ${command} ${args.join(" ")}`);
}
return String(result.stdout || "").trim();
}
function runWithInput(command, args, input) {
const result = spawnSync(command, args, {
cwd: process.cwd(),
encoding: "utf8",
input,
stdio: ["pipe", "pipe", "pipe"],
timeout: 10000
});
if (result.status !== 0) {
const stderr = String(result.stderr || "").trim();
throw new Error(stderr || `Command failed: ${command} ${args.join(" ")}`);
}
return String(result.stdout || "");
}
function parseArgs(argv) {
const args = argv.slice(2);
if (args.includes("--help") || args.includes("-h")) {
return { help: true };
}
const dryRun = args.includes("--dry-run");
const cleaned = args.filter((arg) => arg !== "--dry-run");
const version = cleaned[0] || "";
const notes = cleaned.slice(1).join(" ").trim();
return { help: false, dryRun, version, notes };
}
function parseRemoteUrl(url) {
const raw = String(url || "").trim();
const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (httpsMatch) {
return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
}
const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshMatch) {
return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
}
const sshAltMatch = raw.match(/^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshAltMatch) {
return { host: sshAltMatch[1], owner: sshAltMatch[2], repo: sshAltMatch[3] };
}
throw new Error(`Cannot parse remote URL: ${raw}`);
}
function normalizeBaseUrl(url) {
const raw = String(url || "").trim().replace(/\/+$/, "");
if (!raw) {
return "";
}
if (!/^https?:\/\//i.test(raw)) {
throw new Error("GITEA_BASE_URL must start with http:// or https://");
}
return raw;
}
function getGiteaRepo() {
const forcedRemote = String(process.env.GITEA_REMOTE || process.env.FORGEJO_REMOTE || "").trim();
const remotes = forcedRemote
? [forcedRemote]
: ["gitea", "forgejo", "origin", "github-new", "codeberg"];
const preferredBase = normalizeBaseUrl(process.env.GITEA_BASE_URL || process.env.FORGEJO_BASE_URL || "https://git.24-music.de");
const preferredProtocol = preferredBase ? new URL(preferredBase).protocol : "https:";
for (const remote of remotes) {
try {
const remoteUrl = runCapture("git", ["remote", "get-url", remote]);
const parsed = parseRemoteUrl(remoteUrl);
const remoteBase = `https://${parsed.host}`.toLowerCase();
if (preferredBase && remoteBase !== preferredBase.toLowerCase().replace(/^http:/, "https:")) {
continue;
}
return { remote, ...parsed, baseUrl: `${preferredProtocol}//${parsed.host}` };
} catch {
// try next remote
}
}
if (preferredBase) {
throw new Error(
`No remote found for ${preferredBase}. Add one with: git remote add gitea ${preferredBase}/<owner>/<repo>.git`
);
}
throw new Error("No suitable remote found. Set GITEA_REMOTE or GITEA_BASE_URL.");
}
function getAuthHeader(host) {
const explicitToken = String(process.env.GITEA_TOKEN || process.env.FORGEJO_TOKEN || "").trim();
if (explicitToken) {
return `token ${explicitToken}`;
}
const credentialText = runWithInput("git", ["credential", "fill"], `protocol=https\nhost=${host}\n\n`);
const map = new Map();
for (const line of credentialText.split(/\r?\n/)) {
const separatorIndex = line.indexOf("=");
if (separatorIndex < 0) {
continue;
}
// Split on the first "=" only; String.split(sep, 2) would truncate
// credential values that themselves contain "=" (e.g. base64-padded tokens).
map.set(line.slice(0, separatorIndex), line.slice(separatorIndex + 1));
}
const username = map.get("username") || "";
const password = map.get("password") || "";
if (!username || !password) {
throw new Error(
`Missing credentials for ${host}. Set GITEA_TOKEN or store credentials for this host in git credential helper.`
);
}
const token = Buffer.from(`${username}:${password}`, "utf8").toString("base64");
return `Basic ${token}`;
}
async function apiRequest(method, url, authHeader, body, contentType = "application/json") {
const headers = {
Accept: "application/json",
Authorization: authHeader
};
if (body !== undefined) {
headers["Content-Type"] = contentType;
}
const response = await fetch(url, {
method,
headers,
body
});
const text = await response.text();
let parsed;
try {
parsed = text ? JSON.parse(text) : null;
} catch {
parsed = text;
}
return { ok: response.ok, status: response.status, body: parsed };
}
function ensureVersionString(version) {
const trimmed = String(version || "").trim();
if (!/^\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?$/.test(trimmed)) {
throw new Error("Invalid version format. Expected e.g. 1.4.42");
}
return trimmed;
}
function updatePackageVersion(rootDir, version) {
const packagePath = path.join(rootDir, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
if (String(packageJson.version || "") === version) {
process.stdout.write(`package.json is already at version ${version}, skipping update.\n`);
return;
}
packageJson.version = version;
fs.writeFileSync(packagePath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
}
function patchLatestYml(releaseDir, version) {
const ymlPath = path.join(releaseDir, "latest.yml");
let content = fs.readFileSync(ymlPath, "utf8");
const setupName = `Real-Debrid-Downloader Setup ${version}.exe`;
const dashedName = `Real-Debrid-Downloader-Setup-${version}.exe`;
if (content.includes(dashedName)) {
content = content.split(dashedName).join(setupName);
fs.writeFileSync(ymlPath, content, "utf8");
process.stdout.write(`Patched latest.yml: replaced "${dashedName}" with "${setupName}"\n`);
}
}
function ensureAssetsExist(rootDir, version) {
const releaseDir = path.join(rootDir, "release");
const files = [
`Real-Debrid-Downloader Setup ${version}.exe`,
`Real-Debrid-Downloader ${version}.exe`,
"latest.yml",
`Real-Debrid-Downloader Setup ${version}.exe.blockmap`
];
for (const fileName of files) {
const fullPath = path.join(releaseDir, fileName);
if (!fs.existsSync(fullPath)) {
throw new Error(`Missing release artifact: ${fullPath}`);
}
}
patchLatestYml(releaseDir, version);
return { releaseDir, files };
}
function ensureNoTrackedChanges() {
const output = runCapture("git", ["status", "--porcelain"]);
const lines = output.split(/\r?\n/).filter(Boolean);
const tracked = lines.filter((line) => !line.startsWith("?? "));
if (tracked.length > 0) {
throw new Error(`Working tree has tracked changes:\n${tracked.join("\n")}`);
}
}
function ensureTagMissing(tag) {
const result = spawnSync("git", ["rev-parse", "--verify", `refs/tags/${tag}`], {
cwd: process.cwd(),
stdio: "ignore"
});
if (result.status === 0) {
throw new Error(`Tag already exists: ${tag}`);
}
}
async function createOrGetRelease(baseApi, tag, authHeader, notes) {
const byTag = await apiRequest("GET", `${baseApi}/releases/tags/${encodeURIComponent(tag)}`, authHeader);
if (byTag.ok) {
return byTag.body;
}
const payload = {
tag_name: tag,
target_commitish: "main",
name: tag,
body: notes || `Release ${tag}`,
draft: false,
prerelease: false
};
const created = await apiRequest("POST", `${baseApi}/releases`, authHeader, JSON.stringify(payload));
if (!created.ok) {
throw new Error(`Failed to create release (${created.status}): ${JSON.stringify(created.body)}`);
}
return created.body;
}
async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, files) {
for (const fileName of files) {
const filePath = path.join(releaseDir, fileName);
const fileSize = fs.statSync(filePath).size;
const uploadUrl = `${baseApi}/releases/${releaseId}/assets?name=${encodeURIComponent(fileName)}`;
// Stream large files instead of loading them entirely into memory
const fileStream = fs.createReadStream(filePath);
const response = await fetch(uploadUrl, {
method: "POST",
headers: {
Accept: "application/json",
Authorization: authHeader,
"Content-Type": "application/octet-stream",
"Content-Length": String(fileSize)
},
body: fileStream,
duplex: "half"
});
const text = await response.text();
let parsed;
try {
parsed = text ? JSON.parse(text) : null;
} catch {
parsed = text;
}
if (response.ok) {
process.stdout.write(`Uploaded: ${fileName}\n`);
continue;
}
if (response.status === 409 || response.status === 422) {
process.stdout.write(`Skipped existing asset: ${fileName}\n`);
continue;
}
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(parsed)}`);
}
}
async function main() {
const rootDir = process.cwd();
const args = parseArgs(process.argv);
if (args.help) {
process.stdout.write("Usage: npm run release:gitea -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Env: GITEA_BASE_URL, GITEA_REMOTE, GITEA_TOKEN\n");
process.stdout.write("Compatibility envs still supported: FORGEJO_BASE_URL, FORGEJO_REMOTE, FORGEJO_TOKEN\n");
process.stdout.write("Example: npm run release:gitea -- 1.6.31 \"- Bugfixes\"\n");
return;
}
const version = ensureVersionString(args.version);
const tag = `v${version}`;
const releaseNotes = args.notes || `- Release ${tag}`;
const repo = getGiteaRepo();
ensureNoTrackedChanges();
ensureTagMissing(tag);
if (args.dryRun) {
process.stdout.write(`Dry run: would release ${tag}. No changes made.\n`);
return;
}
updatePackageVersion(rootDir, version);
process.stdout.write(`Building release artifacts for ${tag}...\n`);
run(NPM_EXECUTABLE, ["run", "release:win"]);
const assets = ensureAssetsExist(rootDir, version);
run("git", ["add", "package.json"]);
run("git", ["commit", "-m", `Release ${tag}`]);
run("git", ["push", repo.remote, "main"]);
run("git", ["tag", tag]);
run("git", ["push", repo.remote, tag]);
const authHeader = getAuthHeader(repo.host);
const baseApi = `${repo.baseUrl}/api/v1/repos/${repo.owner}/${repo.repo}`;
const release = await createOrGetRelease(baseApi, tag, authHeader, releaseNotes);
await uploadReleaseAssets(baseApi, release.id, authHeader, assets.releaseDir, assets.files);
process.stdout.write(`Release published: ${release.html_url || `${repo.baseUrl}/${repo.owner}/${repo.repo}/releases/tag/${tag}`}\n`);
}
main().catch((error) => {
process.stderr.write(`${String(error?.message || error)}\n`);
process.exit(1);
});
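`parseRemoteUrl` accepts three remote shapes: HTTPS, scp-style SSH, and `ssh://` with an optional port, each with an optional `.git` suffix. A standalone copy of the same patterns, exercised against made-up hosts and repo names:

```javascript
// Standalone copy of the three URL patterns from parseRemoteUrl.
// Hosts and owners below are invented examples.
function parseRemote(raw) {
  const patterns = [
    /^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i,            // HTTPS
    /^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i,                    // scp-style SSH
    /^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i  // ssh:// with optional port
  ];
  for (const pattern of patterns) {
    const match = raw.match(pattern);
    if (match) {
      return { host: match[1], owner: match[2], repo: match[3] };
    }
  }
  return null;
}

console.log(parseRemote("git@git.example.org:alice/tool.git"));
// → { host: 'git.example.org', owner: 'alice', repo: 'tool' }
console.log(parseRemote("ssh://git@git.example.org:2222/alice/tool"));
// → { host: 'git.example.org', owner: 'alice', repo: 'tool' }
```

The lazy `([^/]+?)` before the optional `(?:\.git)?$` is what keeps the `.git` suffix out of the repo name, and `([^/:]+)` in the `ssh://` form stops the host capture before a port.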


@@ -1,305 +0,0 @@
from __future__ import annotations
import json
import sys
import tempfile
import threading
import time
import zipfile
from pathlib import Path
from tkinter import messagebox
ROOT = Path(__file__).resolve().parents[1]
if str(ROOT) not in sys.path:
sys.path.insert(0, str(ROOT))
import real_debrid_downloader_gui as appmod
def assert_true(condition: bool, message: str) -> None:
if not condition:
raise AssertionError(message)
def run() -> None:
temp_root = Path(tempfile.mkdtemp(prefix="rd_self_check_"))
original_config = appmod.CONFIG_FILE
original_manifest = appmod.MANIFEST_FILE
appmod.CONFIG_FILE = temp_root / "rd_downloader_config.json"
appmod.MANIFEST_FILE = temp_root / "rd_download_manifest.json"
message_calls: list[tuple[str, str, str]] = []
original_showerror = messagebox.showerror
original_showwarning = messagebox.showwarning
original_showinfo = messagebox.showinfo
original_askyesno = messagebox.askyesno
def fake_message(kind: str):
def _inner(title: str, text: str):
message_calls.append((kind, str(title), str(text)))
return None
return _inner
messagebox.showerror = fake_message("error")
messagebox.showwarning = fake_message("warning")
messagebox.showinfo = fake_message("info")
app = appmod.DownloaderApp()
app.withdraw()
try:
app.token_var.set("demo-token")
app.output_dir_var.set(str(temp_root / "downloads"))
app.links_text.delete("1.0", "end")
app.links_text.insert("1.0", "not_a_link")
app.start_downloads()
assert_true(
any("Ungültige Links" in text for kind, _, text in message_calls if kind == "error"),
"Link-Validierung hat ungültige Eingabe nicht blockiert",
)
app.cleanup_mode_var.set("delete")
app.extract_conflict_mode_var.set("rename")
app.remove_link_files_after_extract_var.set(True)
app.remove_samples_var.set(True)
app.remember_token_var.set(True)
app.token_var.set("token-123")
original_can_secure = app._can_store_token_securely
original_store_keyring = app._store_token_in_keyring
app._can_store_token_securely = lambda: True
app._store_token_in_keyring = lambda token: False
app._save_config()
config_data = json.loads(appmod.CONFIG_FILE.read_text(encoding="utf-8"))
assert_true(config_data.get("token") == "token-123", "Token-Fallback in Config bei Keyring-Fehler fehlt")
app.cleanup_mode_var.set("none")
app.extract_conflict_mode_var.set("overwrite")
app.remove_link_files_after_extract_var.set(False)
app.remove_samples_var.set(False)
app.token_var.set("")
app._load_config()
assert_true(app.cleanup_mode_var.get() == "delete", "cleanup_mode wurde nicht aus Config geladen")
assert_true(app.extract_conflict_mode_var.get() == "rename", "extract_conflict_mode wurde nicht geladen")
assert_true(app.remove_link_files_after_extract_var.get() is True, "remove_link_files_after_extract fehlt")
assert_true(app.remove_samples_var.get() is True, "remove_samples_after_extract fehlt")
app._can_store_token_securely = original_can_secure
app._store_token_in_keyring = original_store_keyring
class DummyWorker:
@staticmethod
def is_alive() -> bool:
return True
app.worker_thread = DummyWorker()
app.pause_event.clear()
app.toggle_pause_downloads()
assert_true(app.pause_event.is_set(), "Pause wurde nicht aktiviert")
app.toggle_pause_downloads()
assert_true(not app.pause_event.is_set(), "Resume wurde nicht aktiviert")
app.pause_event.set()
started = time.monotonic()
def _unpause() -> None:
time.sleep(0.25)
app.pause_event.clear()
threading.Thread(target=_unpause, daemon=True).start()
app._wait_if_paused()
waited = time.monotonic() - started
assert_true(waited >= 0.2, "Pause-Wait hat nicht geblockt")
messagebox.askyesno = lambda *args, **kwargs: True
cancel_package_dir = temp_root / "cancel_pkg"
cancel_package_dir.mkdir(parents=True, exist_ok=True)
(cancel_package_dir / "release.part1.rar").write_bytes(b"x")
(cancel_package_dir / "keep_movie.mkv").write_bytes(b"x")
cancel_row = "package-cancel"
child_row = "package-cancel-link-1"
app.table.insert("", "end", iid=cancel_row, text="cancelpkg", values=("-", "Wartet", "0/1", "0 B/s", "0"), open=True)
app.table.insert(cancel_row, "end", iid=child_row, text="https://example.com/cancel", values=("-", "Wartet", "0%", "0 B/s", "0"))
app.links_text.delete("1.0", "end")
app.links_text.insert("1.0", "https://example.com/cancel\n")
app.package_contexts = [
{
"package_row_id": cancel_row,
"row_map": {1: child_row},
"job": {
"name": "cancelpkg",
"links": ["https://example.com/cancel"],
"package_dir": cancel_package_dir,
"extract_target_dir": None,
"completed_indices": [],
},
}
]
app.worker_thread = DummyWorker()
app.table.selection_set(child_row)
app._remove_selected_progress_rows()
assert_true(app._is_package_cancelled(cancel_row), "Paket-Abbruch wurde nicht markiert")
assert_true(not app.table.exists(cancel_row), "Paketzeile wurde nicht entfernt")
remaining_links = app.links_text.get("1.0", "end").strip()
assert_true(not remaining_links, "Link wurde bei Paketentfernung nicht aus Liste entfernt")
removed_cancel_files = app._cleanup_cancelled_package_artifacts(cancel_package_dir)
assert_true(removed_cancel_files >= 1, "Archiv-Cleanup bei Paketabbruch hat nichts gelöscht")
assert_true(not (cancel_package_dir / "release.part1.rar").exists(), "RAR-Teil wurde nicht entfernt")
assert_true((cancel_package_dir / "keep_movie.mkv").exists(), "Nicht-Archivdatei wurde fälschlich gelöscht")
status_events: list[tuple[float, str]] = []
extract_times: dict[str, float] = {}
download_starts: dict[str, float] = {}
original_queue_status = app._queue_status
original_download_single = app._download_single_link
original_extract_archive = app._extract_archive
def fake_queue_status(message: str) -> None:
status_events.append((time.monotonic(), message))
original_queue_status(message)
def fake_download_single(
token: str,
package_dir: Path,
index: int,
link: str,
package_row_id: str | None = None,
) -> appmod.DownloadResult:
package_name = package_dir.name
download_starts.setdefault(package_name, time.monotonic())
archive_path = package_dir / f"{package_name}_{index}.zip"
archive_path.parent.mkdir(parents=True, exist_ok=True)
with zipfile.ZipFile(archive_path, "w") as archive:
archive.writestr("movie.mkv", b"movie-data")
archive.writestr(f"Samples/{package_name}-sample.mkv", b"sample-data")
archive.writestr("download_links.txt", "https://example.com/file")
time.sleep(0.18)
return appmod.DownloadResult(path=archive_path, bytes_written=archive_path.stat().st_size)
def fake_extract_archive(archive_path: Path, extract_target_dir: Path, conflict_mode: str):
package_name = archive_path.parent.name
if package_name == "pkg1":
extract_times["pkg1_start"] = time.monotonic()
time.sleep(0.8)
else:
time.sleep(0.25)
with zipfile.ZipFile(archive_path) as archive:
archive.extractall(extract_target_dir)
if package_name == "pkg1":
extract_times["pkg1_end"] = time.monotonic()
return None
app._queue_status = fake_queue_status
app._download_single_link = fake_download_single
app._extract_archive = fake_extract_archive
app.table.delete(*app.table.get_children())
app.package_contexts = []
package_specs: list[tuple[str, Path, Path]] = []
for idx in (1, 2):
package_name = f"pkg{idx}"
package_dir = temp_root / package_name
extract_dir = temp_root / f"extract_{package_name}"
package_dir.mkdir(parents=True, exist_ok=True)
extract_dir.mkdir(parents=True, exist_ok=True)
package_row_id = f"package-{idx}"
app.table.insert("", "end", iid=package_row_id, text=package_name, values=("-", "Wartet", "0/1", "0 B/s", "0"), open=True)
row_id = f"{package_row_id}-link-1"
app.table.insert(package_row_id, "end", iid=row_id, text="https://example.com/file", values=("-", "Wartet", "0%", "0 B/s", "0"))
app.package_contexts.append(
{
"package_row_id": package_row_id,
"row_map": {1: row_id},
"job": {
"name": package_name,
"links": ["https://example.com/file"],
"package_dir": package_dir,
"extract_target_dir": extract_dir,
"completed_indices": [],
},
}
)
package_specs.append((package_name, package_dir, extract_dir))
app.run_started_at = time.monotonic()
app.total_downloaded_bytes = 0
app.stop_event.clear()
app.pause_event.clear()
app._set_manifest_for_run(
[
{"name": name, "links": ["https://example.com/file"]}
for name, _package_dir, _extract_dir in package_specs
],
temp_root / "downloads",
"self-check-signature",
resume_map={},
)
app._download_queue_worker(
token="demo-token",
max_parallel=1,
hybrid_extract=True,
cleanup_mode="none",
extract_conflict_mode="overwrite",
overall_total_links=2,
remove_link_files_after_extract=True,
remove_samples_after_extract=True,
)
app._process_ui_queue()
pkg1_extract_dir = temp_root / "extract_pkg1"
pkg2_extract_dir = temp_root / "extract_pkg2"
assert_true((pkg1_extract_dir / "movie.mkv").exists(), "Entpacken pkg1 fehlgeschlagen")
assert_true((pkg2_extract_dir / "movie.mkv").exists(), "Entpacken pkg2 fehlgeschlagen")
assert_true(not (pkg1_extract_dir / "download_links.txt").exists(), "Link-Artefakte wurden nicht entfernt")
assert_true(not (pkg2_extract_dir / "download_links.txt").exists(), "Link-Artefakte pkg2 wurden nicht entfernt")
assert_true(not (pkg1_extract_dir / "Samples").exists(), "Sample-Ordner pkg1 wurde nicht entfernt")
assert_true(not (pkg2_extract_dir / "Samples").exists(), "Sample-Ordner pkg2 wurde nicht entfernt")
assert_true("pkg1_start" in extract_times and "pkg1_end" in extract_times, "Entpack-Zeiten für pkg1 fehlen")
assert_true("pkg2" in download_starts, "Downloadstart für pkg2 fehlt")
assert_true(
download_starts["pkg2"] < extract_times["pkg1_end"],
"Paket 2 startete nicht parallel zum Entpacken von Paket 1",
)
manifest_data = json.loads(appmod.MANIFEST_FILE.read_text(encoding="utf-8"))
assert_true(bool(manifest_data.get("finished")), "Manifest wurde nach Lauf nicht abgeschlossen")
with app.path_lock:
app.reserved_target_keys.add("dummy-key")
app.ui_queue.put(("controls", False))
app._process_ui_queue()
with app.path_lock:
assert_true(len(app.reserved_target_keys) == 0, "reserved_target_keys wurden nicht bereinigt")
app._queue_status = original_queue_status
app._download_single_link = original_download_single
app._extract_archive = original_extract_archive
assert_true(any("Entpacken läuft parallel" in text for _, text in status_events), "Kein Parallel-Entpacken-Status geloggt")
print("Self-check erfolgreich")
finally:
try:
app.destroy()
except Exception:
pass
messagebox.showerror = original_showerror
messagebox.showwarning = original_showwarning
messagebox.showinfo = original_showinfo
messagebox.askyesno = original_askyesno
appmod.CONFIG_FILE = original_config
appmod.MANIFEST_FILE = original_manifest
if __name__ == "__main__":
run()


@@ -1,35 +0,0 @@
import re
import sys
from pathlib import Path
def main() -> int:
if len(sys.argv) < 2:
print("Usage: python scripts/set_version.py <version>")
return 1
version = sys.argv[1].strip().lstrip("v")
if not re.fullmatch(r"\d+(?:\.\d+){1,3}", version):
print(f"Invalid version: {version}")
return 1
target = Path(__file__).resolve().parents[1] / "real_debrid_downloader_gui.py"
content = target.read_text(encoding="utf-8")
updated, count = re.subn(
r'^APP_VERSION\s*=\s*"[^"]+"\s*$',
f'APP_VERSION = "{version}"',
content,
count=1,
flags=re.MULTILINE,
)
if count != 1:
print("APP_VERSION marker not found")
return 1
target.write_text(updated, encoding="utf-8")
print(f"Set APP_VERSION to {version}")
return 0
if __name__ == "__main__":
raise SystemExit(main())

src/main/app-controller.ts Normal file

@@ -0,0 +1,383 @@
import path from "node:path";
import { app } from "electron";
import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
ParsedPackageInput,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
UpdateCheckResult,
UpdateInstallProgress,
UpdateInstallResult
} from "../shared/types";
import { importDlcContainers } from "./container";
import { APP_VERSION } from "./constants";
import { DownloadManager } from "./download-manager";
import { parseCollectorInput } from "./link-parser";
import { configureLogger, getLogFilePath, logger } from "./logger";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "./session-log";
import { MegaWebFallback } from "./mega-web-fallback";
import { addHistoryEntry, cancelPendingAsyncSaves, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeLoadedSession, normalizeLoadedSessionTransientFields, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
import { abortActiveUpdateDownload, checkGitHubUpdate, installLatestUpdate } from "./update";
import { startDebugServer, stopDebugServer } from "./debug-server";
function sanitizeSettingsPatch(partial: Partial<AppSettings>): Partial<AppSettings> {
const entries = Object.entries(partial || {}).filter(([, value]) => value !== undefined);
return Object.fromEntries(entries) as Partial<AppSettings>;
}
function settingsFingerprint(settings: AppSettings): string {
return JSON.stringify(normalizeSettings(settings));
}
export class AppController {
private settings: AppSettings;
private manager: DownloadManager;
private megaWebFallback: MegaWebFallback;
private lastUpdateCheck: UpdateCheckResult | null = null;
private lastUpdateCheckAt = 0;
private storagePaths = createStoragePaths(path.join(app.getPath("userData"), "runtime"));
private onStateHandler: ((snapshot: UiSnapshot) => void) | null = null;
private autoResumePending = false;
public constructor() {
configureLogger(this.storagePaths.baseDir);
initSessionLog(this.storagePaths.baseDir);
this.settings = loadSettings(this.storagePaths);
const session = loadSession(this.storagePaths);
this.megaWebFallback = new MegaWebFallback(() => ({
login: this.settings.megaLogin,
password: this.settings.megaPassword
}));
this.manager = new DownloadManager(this.settings, session, this.storagePaths, {
megaWebUnrestrict: (link: string, signal?: AbortSignal) => this.megaWebFallback.unrestrict(link, signal),
invalidateMegaSession: () => this.megaWebFallback.invalidateSession(),
onHistoryEntry: (entry: HistoryEntry) => {
addHistoryEntry(this.storagePaths, entry);
}
});
this.manager.on("state", (snapshot: UiSnapshot) => {
this.onStateHandler?.(snapshot);
});
logger.info(`App gestartet v${APP_VERSION}`);
logger.info(`Log-Datei: ${getLogFilePath()}`);
startDebugServer(this.manager, this.storagePaths.baseDir);
if (this.settings.autoResumeOnStart) {
const snapshot = this.manager.getSnapshot();
const hasPending = Object.values(snapshot.session.items).some((item) => item.status === "queued" || item.status === "reconnect_wait");
if (hasPending) {
void this.manager.getStartConflicts().then((conflicts) => {
const hasConflicts = conflicts.length > 0;
if (this.hasAnyProviderToken(this.settings) && !hasConflicts) {
// If the onState handler is already set (renderer connected), start immediately.
// Otherwise mark as pending so the onState setter triggers the start.
if (this.onStateHandler) {
logger.info("Auto-Resume beim Start aktiviert (nach Konflikt-Check)");
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
} else {
this.autoResumePending = true;
logger.info("Auto-Resume beim Start vorgemerkt");
}
} else if (hasConflicts) {
logger.info("Auto-Resume übersprungen: Start-Konflikte erkannt");
}
}).catch((err) => logger.warn(`getStartConflicts Fehler (constructor): ${String(err)}`));
}
}
}
private hasAnyProviderToken(settings: AppSettings): boolean {
return Boolean(
settings.token.trim()
|| (settings.megaLogin.trim() && settings.megaPassword.trim())
|| settings.bestToken.trim()
|| settings.allDebridToken.trim()
|| (settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim())
);
}
public get onState(): ((snapshot: UiSnapshot) => void) | null {
return this.onStateHandler;
}
public set onState(handler: ((snapshot: UiSnapshot) => void) | null) {
this.onStateHandler = handler;
if (handler) {
handler(this.manager.getSnapshot());
if (this.autoResumePending) {
this.autoResumePending = false;
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
logger.info("Auto-Resume beim Start aktiviert");
} else {
// Trigger pending extractions without starting the session
this.manager.triggerIdleExtractions();
}
}
}
public getSnapshot(): UiSnapshot {
return this.manager.getSnapshot();
}
public getVersion(): string {
return APP_VERSION;
}
public getSettings(): AppSettings {
return this.settings;
}
public updateSettings(partial: Partial<AppSettings>): AppSettings {
const sanitizedPatch = sanitizeSettingsPatch(partial);
const nextSettings = normalizeSettings({
...this.settings,
...sanitizedPatch
});
if (settingsFingerprint(nextSettings) === settingsFingerprint(this.settings)) {
return this.settings;
}
// Preserve the live totalDownloadedAllTime from the download manager
const liveSettings = this.manager.getSettings();
nextSettings.totalDownloadedAllTime = Math.max(nextSettings.totalDownloadedAllTime || 0, liveSettings.totalDownloadedAllTime || 0);
this.settings = nextSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
return this.settings;
}
public async checkUpdates(): Promise<UpdateCheckResult> {
const result = await checkGitHubUpdate(this.settings.updateRepo);
if (!result.error) {
this.lastUpdateCheck = result;
this.lastUpdateCheckAt = Date.now();
}
return result;
}
public async installUpdate(onProgress?: (progress: UpdateInstallProgress) => void): Promise<UpdateInstallResult> {
// Stop active downloads before installing. Extractions may continue briefly
// until prepareForShutdown() is called during app quit.
if (this.manager.isSessionRunning()) {
this.manager.stop();
}
const cacheAgeMs = Date.now() - this.lastUpdateCheckAt;
const cached = this.lastUpdateCheck && !this.lastUpdateCheck.error && cacheAgeMs <= 10 * 60 * 1000
? this.lastUpdateCheck
: undefined;
const result = await installLatestUpdate(this.settings.updateRepo, cached, onProgress);
if (result.started) {
this.lastUpdateCheck = null;
this.lastUpdateCheckAt = 0;
}
return result;
}
public addLinks(payload: AddLinksPayload): { addedPackages: number; addedLinks: number; invalidCount: number } {
const parsed = parseCollectorInput(payload.rawText, payload.packageName || this.settings.packageName);
if (parsed.length === 0) {
return { addedPackages: 0, addedLinks: 0, invalidCount: 1 };
}
const result = this.manager.addPackages(parsed);
return { ...result, invalidCount: 0 };
}
public async addContainers(filePaths: string[]): Promise<{ addedPackages: number; addedLinks: number }> {
const packages = await importDlcContainers(filePaths);
const merged: ParsedPackageInput[] = packages.map((pkg) => ({
name: pkg.name,
links: pkg.links,
...(pkg.fileNames ? { fileNames: pkg.fileNames } : {})
}));
const result = this.manager.addPackages(merged);
return result;
}
public async getStartConflicts(): Promise<StartConflictEntry[]> {
return this.manager.getStartConflicts();
}
public async resolveStartConflict(packageId: string, policy: DuplicatePolicy): Promise<StartConflictResolutionResult> {
return this.manager.resolveStartConflict(packageId, policy);
}
public clearAll(): void {
this.manager.clearAll();
}
public async start(): Promise<void> {
await this.manager.start();
}
public async startPackages(packageIds: string[]): Promise<void> {
await this.manager.startPackages(packageIds);
}
public async startItems(itemIds: string[]): Promise<void> {
await this.manager.startItems(itemIds);
}
public stop(): void {
this.manager.stop();
}
public togglePause(): boolean {
return this.manager.togglePause();
}
public retryExtraction(packageId: string): void {
this.manager.retryExtraction(packageId);
}
public extractNow(packageId: string): void {
this.manager.extractNow(packageId);
}
public resetPackage(packageId: string): void {
this.manager.resetPackage(packageId);
}
public cancelPackage(packageId: string): void {
this.manager.cancelPackage(packageId);
}
public renamePackage(packageId: string, newName: string): void {
this.manager.renamePackage(packageId, newName);
}
public reorderPackages(packageIds: string[]): void {
this.manager.reorderPackages(packageIds);
}
public removeItem(itemId: string): void {
this.manager.removeItem(itemId);
}
public togglePackage(packageId: string): void {
this.manager.togglePackage(packageId);
}
public exportQueue(): string {
return this.manager.exportQueue();
}
public importQueue(json: string): { addedPackages: number; addedLinks: number } {
return this.manager.importQueue(json);
}
public getSessionStats(): SessionStats {
return this.manager.getSessionStats();
}
public exportBackup(): string {
const settings = { ...this.settings };
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword"];
for (const key of SENSITIVE_KEYS) {
const val = settings[key];
if (typeof val === "string" && val.length > 0) {
(settings as Record<string, unknown>)[key] = `***${val.slice(-4)}`;
}
}
const session = this.manager.getSession();
return JSON.stringify({ version: 1, settings, session }, null, 2);
}
public importBackup(json: string): { restored: boolean; message: string } {
let parsed: Record<string, unknown>;
try {
parsed = JSON.parse(json) as Record<string, unknown>;
} catch {
return { restored: false, message: "Ungültiges JSON" };
}
if (!parsed || typeof parsed !== "object" || !parsed.settings || !parsed.session) {
return { restored: false, message: "Kein gültiges Backup (settings/session fehlen)" };
}
const importedSettings = parsed.settings as AppSettings;
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword"];
for (const key of SENSITIVE_KEYS) {
const val = (importedSettings as Record<string, unknown>)[key];
if (typeof val === "string" && val.startsWith("***")) {
(importedSettings as Record<string, unknown>)[key] = (this.settings as Record<string, unknown>)[key];
}
}
const restoredSettings = normalizeSettings(importedSettings);
this.settings = restoredSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
// Full stop including extraction abort — the old session is being replaced,
// so no extraction tasks from it should keep running.
this.manager.stop();
this.manager.abortAllPostProcessing();
// Cancel any deferred persist timer and queued async writes so the old
// in-memory session does not overwrite the restored session file on disk.
this.manager.clearPersistTimer();
cancelPendingAsyncSaves();
const restoredSession = normalizeLoadedSessionTransientFields(
normalizeLoadedSession(parsed.session)
);
saveSession(this.storagePaths, restoredSession);
// Prevent prepareForShutdown from overwriting the restored session file
// with the old in-memory session when the app quits after backup restore.
this.manager.skipShutdownPersist = true;
// Block all persistence (including persistSoon from any IPC operations
// the user might trigger before restarting) to protect the restored backup.
this.manager.blockAllPersistence = true;
return { restored: true, message: "Backup wiederhergestellt. Bitte App neustarten." };
}
public getSessionLogPath(): string | null {
return getSessionLogPath();
}
public shutdown(): void {
stopDebugServer();
abortActiveUpdateDownload();
this.manager.prepareForShutdown();
this.megaWebFallback.dispose();
shutdownSessionLog();
logger.info("App beendet");
}
public getHistory(): HistoryEntry[] {
return loadHistory(this.storagePaths);
}
public clearHistory(): void {
clearHistory(this.storagePaths);
}
public setPackagePriority(packageId: string, priority: PackagePriority): void {
this.manager.setPackagePriority(packageId, priority);
}
public skipItems(itemIds: string[]): void {
this.manager.skipItems(itemIds);
}
public resetItems(itemIds: string[]): void {
this.manager.resetItems(itemIds);
}
public removeHistoryEntry(entryId: string): void {
removeHistoryEntry(this.storagePaths, entryId);
}
public addToHistory(entry: HistoryEntry): void {
addHistoryEntry(this.storagePaths, entry);
}
}
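Reviewer note on `exportBackup`/`importBackup` above: sensitive fields are masked on export (only the last 4 characters survive behind `***`) and, on import, any still-masked value is replaced with the secret currently stored in settings. A standalone sketch of that round trip (not part of the diff; `mask`/`restore` are illustrative names):

```typescript
// Export side: keep only the last 4 chars of a non-empty secret.
function mask(val: string): string {
  return val.length > 0 ? `***${val.slice(-4)}` : val;
}

// Import side: a masked value means "keep the current secret";
// anything else is treated as a deliberately changed value.
function restore(imported: string, current: string): string {
  return imported.startsWith("***") ? current : imported;
}

console.log(mask("supersecrettoken"));                 // "***oken"
console.log(restore("***oken", "supersecrettoken"));   // "supersecrettoken"
console.log(restore("newtoken", "supersecrettoken"));  // "newtoken"
```

One caveat this sketch makes visible: a legitimate secret that itself starts with `***` would be silently swallowed on import, which the real code shares.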

src/main/backup-crypto.ts (new file, +66 lines)
import crypto from "node:crypto";
export const SENSITIVE_KEYS = [
"token",
"megaLogin",
"megaPassword",
"bestToken",
"allDebridToken",
"archivePasswordList"
] as const;
export type SensitiveKey = (typeof SENSITIVE_KEYS)[number];
export interface EncryptedCredentials {
salt: string;
iv: string;
tag: string;
data: string;
}
const PBKDF2_ITERATIONS = 100_000;
const KEY_LENGTH = 32; // 256 bit
const IV_LENGTH = 12; // 96 bit for GCM
const SALT_LENGTH = 16;
function deriveKey(username: string, salt: Buffer): Buffer {
return crypto.pbkdf2Sync(username, salt, PBKDF2_ITERATIONS, KEY_LENGTH, "sha256");
}
export function encryptCredentials(
fields: Record<string, string>,
username: string
): EncryptedCredentials {
const salt = crypto.randomBytes(SALT_LENGTH);
const iv = crypto.randomBytes(IV_LENGTH);
const key = deriveKey(username, salt);
const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
const plaintext = JSON.stringify(fields);
const encrypted = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
const tag = cipher.getAuthTag();
return {
salt: salt.toString("hex"),
iv: iv.toString("hex"),
tag: tag.toString("hex"),
data: encrypted.toString("hex")
};
}
export function decryptCredentials(
encrypted: EncryptedCredentials,
username: string
): Record<string, string> {
const salt = Buffer.from(encrypted.salt, "hex");
const iv = Buffer.from(encrypted.iv, "hex");
const tag = Buffer.from(encrypted.tag, "hex");
const data = Buffer.from(encrypted.data, "hex");
const key = deriveKey(username, salt);
const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
decipher.setAuthTag(tag);
const decrypted = Buffer.concat([decipher.update(data), decipher.final()]);
return JSON.parse(decrypted.toString("utf8")) as Record<string, string>;
}
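A self-contained round-trip sketch of the scheme used by `encryptCredentials`/`decryptCredentials` above (PBKDF2-SHA256 with 100k iterations deriving a 256-bit key from the username, then AES-256-GCM). This inlines the same Node `crypto` calls rather than importing the module, purely for illustration:

```typescript
import crypto from "node:crypto";

// Encrypt a field map with a key derived from `username`, then decrypt it
// again with the same derived key. GCM verifies the auth tag on decrypt.
function roundTrip(fields: Record<string, string>, username: string): Record<string, string> {
  const salt = crypto.randomBytes(16);           // SALT_LENGTH
  const iv = crypto.randomBytes(12);             // IV_LENGTH (96 bit for GCM)
  const key = crypto.pbkdf2Sync(username, salt, 100_000, 32, "sha256");

  const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(JSON.stringify(fields), "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();

  const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  const plain = Buffer.concat([decipher.update(data), decipher.final()]);
  return JSON.parse(plain.toString("utf8")) as Record<string, string>;
}

console.log(roundTrip({ token: "abc123" }, "user@example.com").token); // "abc123"
```

Note that the username is the only key material, so this protects exported backups against casual inspection, not against an attacker who knows the account name.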

src/main/cleanup.ts (new file, +241 lines)
import fs from "node:fs";
import path from "node:path";
import { ARCHIVE_TEMP_EXTENSIONS, LINK_ARTIFACT_EXTENSIONS, MAX_LINK_ARTIFACT_BYTES, RAR_SPLIT_RE, SAMPLE_DIR_NAMES, SAMPLE_TOKEN_RE, SAMPLE_VIDEO_EXTENSIONS } from "./constants";
async function yieldToLoop(): Promise<void> {
await new Promise<void>((resolve) => {
setTimeout(resolve, 0);
});
}
export function isArchiveOrTempFile(filePath: string): boolean {
const lowerName = path.basename(filePath).toLowerCase();
const ext = path.extname(lowerName);
if (ARCHIVE_TEMP_EXTENSIONS.has(ext)) {
return true;
}
if (lowerName.includes(".part") && lowerName.endsWith(".rar")) {
return true;
}
return RAR_SPLIT_RE.test(lowerName);
}
export function cleanupCancelledPackageArtifacts(packageDir: string): number {
if (!fs.existsSync(packageDir)) {
return 0;
}
let removed = 0;
const stack = [packageDir];
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = fs.readdirSync(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() && !entry.isSymbolicLink()) {
stack.push(full);
} else if (entry.isFile() && isArchiveOrTempFile(full)) {
try {
fs.rmSync(full, { force: true });
removed += 1;
} catch {
// ignore
}
}
}
}
return removed;
}
export async function cleanupCancelledPackageArtifactsAsync(packageDir: string): Promise<number> {
try {
await fs.promises.access(packageDir, fs.constants.F_OK);
} catch {
return 0;
}
let removed = 0;
let touched = 0;
const stack = [packageDir];
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try {
entries = await fs.promises.readdir(current, { withFileTypes: true });
} catch {
continue;
}
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() && !entry.isSymbolicLink()) {
stack.push(full);
} else if (entry.isFile() && isArchiveOrTempFile(full)) {
try {
await fs.promises.rm(full, { force: true });
removed += 1;
} catch {
// ignore
}
}
touched += 1;
if (touched % 80 === 0) {
await yieldToLoop();
}
}
}
return removed;
}
export async function removeDownloadLinkArtifacts(extractDir: string): Promise<number> {
try {
await fs.promises.access(extractDir);
} catch {
return 0;
}
let removed = 0;
const stack = [extractDir];
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = await fs.promises.readdir(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() && !entry.isSymbolicLink()) {
stack.push(full);
continue;
}
if (!entry.isFile()) {
continue;
}
const ext = path.extname(entry.name).toLowerCase();
const name = entry.name.toLowerCase();
let shouldDelete = LINK_ARTIFACT_EXTENSIONS.has(ext);
if (!shouldDelete && [".txt", ".html", ".htm", ".nfo"].includes(ext)) {
if (/[._\- ](links?|downloads?|urls?|dlc)([._\- ]|$)/i.test(name)) {
try {
const stat = await fs.promises.stat(full);
if (stat.size <= MAX_LINK_ARTIFACT_BYTES) {
const text = await fs.promises.readFile(full, "utf8");
shouldDelete = /https?:\/\//i.test(text);
}
} catch {
shouldDelete = false;
}
}
}
if (shouldDelete) {
try {
await fs.promises.rm(full, { force: true });
removed += 1;
} catch {
// ignore
}
}
}
}
return removed;
}
export async function removeSampleArtifacts(extractDir: string): Promise<{ files: number; dirs: number }> {
try {
await fs.promises.access(extractDir);
} catch {
return { files: 0, dirs: 0 };
}
let removedFiles = 0;
let removedDirs = 0;
const sampleDirs: string[] = [];
const stack = [extractDir];
const countFilesRecursive = async (rootDir: string): Promise<number> => {
let count = 0;
const dirs = [rootDir];
while (dirs.length > 0) {
const current = dirs.pop() as string;
let entries: fs.Dirent[] = [];
try {
entries = await fs.promises.readdir(current, { withFileTypes: true });
} catch {
continue;
}
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory()) {
try {
const stat = await fs.promises.lstat(full);
if (stat.isSymbolicLink()) {
continue;
}
} catch {
continue;
}
dirs.push(full);
} else if (entry.isFile()) {
count += 1;
}
}
}
return count;
};
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = await fs.promises.readdir(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() || entry.isSymbolicLink()) {
const base = entry.name.toLowerCase();
if (SAMPLE_DIR_NAMES.has(base)) {
sampleDirs.push(full);
continue;
}
if (entry.isDirectory()) {
stack.push(full);
}
continue;
}
if (!entry.isFile()) {
continue;
}
const stem = path.parse(entry.name).name.toLowerCase();
const ext = path.extname(entry.name).toLowerCase();
const isSampleVideo = SAMPLE_VIDEO_EXTENSIONS.has(ext) && SAMPLE_TOKEN_RE.test(stem);
if (isSampleVideo) {
try {
await fs.promises.rm(full, { force: true });
removedFiles += 1;
} catch {
// ignore
}
}
}
}
sampleDirs.sort((a, b) => b.length - a.length);
for (const dir of sampleDirs) {
try {
const stat = await fs.promises.lstat(dir);
if (stat.isSymbolicLink()) {
await fs.promises.rm(dir, { force: true });
removedDirs += 1;
continue;
}
const filesInDir = await countFilesRecursive(dir);
await fs.promises.rm(dir, { recursive: true, force: true });
removedFiles += filesInDir;
removedDirs += 1;
} catch {
// ignore
}
}
return { files: removedFiles, dirs: removedDirs };
}
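The classification done by `isArchiveOrTempFile` above can be exercised in isolation. This sketch inlines `ARCHIVE_TEMP_EXTENSIONS` and `RAR_SPLIT_RE` from constants.ts (values copied from that file below) so it runs standalone:

```typescript
import path from "node:path";

// Copied from constants.ts for a self-contained demo.
const ARCHIVE_TEMP_EXTENSIONS = new Set([".rar", ".zip", ".7z", ".tmp", ".part", ".tar", ".gz", ".bz2", ".xz", ".rev"]);
const RAR_SPLIT_RE = /\.r\d{2,3}$/i;

function isArchiveOrTempFile(filePath: string): boolean {
  const lowerName = path.basename(filePath).toLowerCase();
  if (ARCHIVE_TEMP_EXTENSIONS.has(path.extname(lowerName))) {
    return true;                                          // plain archive/temp extension
  }
  if (lowerName.includes(".part") && lowerName.endsWith(".rar")) {
    return true;                                          // multipart RAR: name.part01.rar
  }
  return RAR_SPLIT_RE.test(lowerName);                    // old split volumes: .r00, .r01, ...
}

console.log(isArchiveOrTempFile("movie.part01.rar")); // true
console.log(isArchiveOrTempFile("archive.r00"));      // true
console.log(isArchiveOrTempFile("movie.mkv"));        // false
```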

src/main/constants.ts (new file, +94 lines)
import path from "node:path";
import os from "node:os";
import { AppSettings } from "../shared/types";
import packageJson from "../../package.json";
export const APP_NAME = "Multi Debrid Downloader";
export const APP_VERSION: string = packageJson.version;
export const API_BASE_URL = "https://api.real-debrid.com/rest/1.0";
export const DCRYPT_UPLOAD_URL = "https://dcrypt.it/decrypt/upload";
export const DCRYPT_PASTE_URL = "https://dcrypt.it/decrypt/paste";
export const DLC_SERVICE_URL = "https://service.jdownloader.org/dlcrypt/service.php?srcType=dlc&destType=pylo&data={KEY}";
export const DLC_AES_KEY = Buffer.from("cb99b5cbc24db398", "utf8");
export const DLC_AES_IV = Buffer.from("9bc24cb995cb8db3", "utf8");
export const REQUEST_RETRIES = 3;
export const CHUNK_SIZE = 512 * 1024;
export const WRITE_BUFFER_SIZE = 512 * 1024; // 512 KB write buffer (JDownloader: 500 KB)
export const WRITE_FLUSH_TIMEOUT_MS = 2000; // 2s flush timeout
export const ALLOCATION_UNIT_SIZE = 4096; // 4 KB NTFS alignment
export const STREAM_HIGH_WATER_MARK = 512 * 1024; // 512 KB stream buffer — lower than before (2 MB) so backpressure triggers sooner when disk is slow
export const DISK_BUSY_THRESHOLD_MS = 300; // Show "Warte auf Festplatte" if writableLength > 0 for this long
export const SAMPLE_DIR_NAMES = new Set(["sample", "samples"]);
export const SAMPLE_VIDEO_EXTENSIONS = new Set([".mkv", ".mp4", ".avi", ".mov", ".wmv", ".m4v", ".ts", ".m2ts", ".webm"]);
export const LINK_ARTIFACT_EXTENSIONS = new Set([".url", ".webloc", ".dlc", ".rsdf", ".ccf"]);
export const SAMPLE_TOKEN_RE = /(^|[._\-\s])sample([._\-\s]|$)/i;
export const ARCHIVE_TEMP_EXTENSIONS = new Set([".rar", ".zip", ".7z", ".tmp", ".part", ".tar", ".gz", ".bz2", ".xz", ".rev"]);
export const RAR_SPLIT_RE = /\.r\d{2,3}$/i;
export const MAX_MANIFEST_FILE_BYTES = 5 * 1024 * 1024;
export const MAX_LINK_ARTIFACT_BYTES = 256 * 1024;
export const SPEED_WINDOW_SECONDS = 1;
export const CLIPBOARD_POLL_INTERVAL_MS = 2000;
export const DEFAULT_UPDATE_REPO = "Administrator/real-debrid-downloader";
export function defaultSettings(): AppSettings {
const baseDir = path.join(os.homedir(), "Downloads", "RealDebrid");
return {
token: "",
megaLogin: "",
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
archivePasswordList: "",
rememberToken: true,
providerPrimary: "realdebrid",
providerSecondary: "megadebrid",
providerTertiary: "bestdebrid",
autoProviderFallback: true,
outputDir: baseDir,
packageName: "",
autoExtract: true,
autoRename4sf4sj: false,
extractDir: path.join(baseDir, "_entpackt"),
collectMkvToLibrary: false,
mkvLibraryDir: path.join(baseDir, "_mkv"),
createExtractSubfolder: true,
hybridExtract: true,
cleanupMode: "none",
extractConflictMode: "overwrite",
removeLinkFilesAfterExtract: false,
removeSamplesAfterExtract: false,
enableIntegrityCheck: true,
autoResumeOnStart: true,
autoReconnect: false,
reconnectWaitSeconds: 45,
completedCleanupPolicy: "never",
maxParallel: 4,
maxParallelExtract: 2,
retryLimit: 0,
speedLimitEnabled: false,
speedLimitKbps: 0,
speedLimitMode: "global",
updateRepo: DEFAULT_UPDATE_REPO,
autoUpdateCheck: true,
clipboardWatch: false,
minimizeToTray: false,
theme: "dark" as const,
collapseNewPackages: true,
autoSkipExtracted: false,
confirmDeleteSelection: true,
totalDownloadedAllTime: 0,
bandwidthSchedules: [],
columnOrder: ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"],
extractCpuPriority: "high",
autoExtractWhenStopped: true
};
}

src/main/container.ts (new file, +319 lines)
import fs from "node:fs";
import path from "node:path";
import crypto from "node:crypto";
import { DCRYPT_PASTE_URL, DCRYPT_UPLOAD_URL, DLC_AES_IV, DLC_AES_KEY, DLC_SERVICE_URL } from "./constants";
import { compactErrorText, inferPackageNameFromLinks, isHttpLink, sanitizeFilename, uniquePreserveOrder } from "./utils";
import { ParsedPackageInput } from "../shared/types";
const MAX_DLC_FILE_BYTES = 8 * 1024 * 1024;
function isContainerSizeValidationError(error: unknown): boolean {
const text = compactErrorText(error);
return /zu groß/i.test(text) || /DLC-Datei ungültig oder zu groß/i.test(text);
}
function decodeDcryptPayload(responseText: string): unknown {
let text = String(responseText || "").trim();
const m = text.match(/<textarea[^>]*>([\s\S]*?)<\/textarea>/i);
if (m) {
text = m[1].replace(/&quot;/g, '"').replace(/&amp;/g, "&").trim();
}
if (!text) {
return "";
}
try {
return JSON.parse(text);
} catch {
return text;
}
}
function extractUrlsRecursive(data: unknown): string[] {
if (typeof data === "string") {
const found = data.match(/https?:\/\/[^\s"'<>]+/gi) ?? [];
return uniquePreserveOrder(found.filter((url) => isHttpLink(url)));
}
if (Array.isArray(data)) {
return uniquePreserveOrder(data.flatMap((item) => extractUrlsRecursive(item)));
}
if (data && typeof data === "object") {
return uniquePreserveOrder(Object.values(data as Record<string, unknown>).flatMap((value) => extractUrlsRecursive(value)));
}
return [];
}
function groupLinksByName(links: string[]): ParsedPackageInput[] {
const unique = uniquePreserveOrder(links.filter((link) => isHttpLink(link)));
const grouped = new Map<string, string[]>();
for (const link of unique) {
const name = sanitizeFilename(inferPackageNameFromLinks([link]) || "Paket");
const current = grouped.get(name) ?? [];
current.push(link);
grouped.set(name, current);
}
return Array.from(grouped.entries()).map(([name, packageLinks]) => ({ name, links: packageLinks }));
}
function extractPackagesFromPayload(payload: unknown): ParsedPackageInput[] {
const urls = extractUrlsRecursive(payload);
if (urls.length === 0) {
return [];
}
return groupLinksByName(urls);
}
function decryptRcPayload(base64Rc: string): Buffer {
const rcBytes = Buffer.from(base64Rc, "base64");
const decipher = crypto.createDecipheriv("aes-128-cbc", DLC_AES_KEY, DLC_AES_IV);
decipher.setAutoPadding(false);
return Buffer.concat([decipher.update(rcBytes), decipher.final()]);
}
function readDlcFileWithLimit(filePath: string): Buffer {
const stat = fs.statSync(filePath);
if (stat.size <= 0 || stat.size > MAX_DLC_FILE_BYTES) {
throw new Error(`DLC-Datei ungültig oder zu groß (${Math.floor(stat.size)} B)`);
}
return fs.readFileSync(filePath);
}
function parsePackagesFromDlcXml(xml: string): ParsedPackageInput[] {
const packages: ParsedPackageInput[] = [];
const packageRegex = /<package\s+[^>]*name="([^"]*)"[^>]*>([\s\S]*?)<\/package>/gi;
for (let m = packageRegex.exec(xml); m; m = packageRegex.exec(xml)) {
const encodedName = m[1] || "";
const packageBody = m[2] || "";
let packageName = "";
if (encodedName) {
try {
packageName = Buffer.from(encodedName, "base64").toString("utf8");
} catch {
packageName = encodedName;
}
}
const links: string[] = [];
const fileNames: string[] = [];
const fileRegex = /<file>([\s\S]*?)<\/file>/gi;
for (let fm = fileRegex.exec(packageBody); fm; fm = fileRegex.exec(packageBody)) {
const fileBody = fm[1] || "";
const urlMatch = fileBody.match(/<url>(.*?)<\/url>/i);
if (!urlMatch) {
continue;
}
try {
const url = Buffer.from((urlMatch[1] || "").trim(), "base64").toString("utf8").trim();
if (!isHttpLink(url)) {
continue;
}
let fileName = "";
const fnMatch = fileBody.match(/<filename>(.*?)<\/filename>/i);
if (fnMatch?.[1]) {
try {
fileName = Buffer.from(fnMatch[1].trim(), "base64").toString("utf8").trim();
} catch {
// ignore
}
}
links.push(url);
fileNames.push(sanitizeFilename(fileName));
} catch {
// skip broken entries
}
}
if (links.length === 0) {
const urlRegex = /<url>(.*?)<\/url>/gi;
for (let um = urlRegex.exec(packageBody); um; um = urlRegex.exec(packageBody)) {
try {
const url = Buffer.from((um[1] || "").trim(), "base64").toString("utf8").trim();
if (isHttpLink(url)) {
links.push(url);
}
} catch {
// skip broken entries
}
}
}
const uniqueLinks = uniquePreserveOrder(links);
const hasFileNames = fileNames.some((fn) => fn.length > 0);
if (uniqueLinks.length > 0) {
const pkg: ParsedPackageInput = {
name: sanitizeFilename(packageName || inferPackageNameFromLinks(uniqueLinks) || `Paket-${packages.length + 1}`),
links: uniqueLinks
};
if (hasFileNames) {
pkg.fileNames = fileNames;
}
packages.push(pkg);
}
}
return packages;
}
async function decryptDlcLocal(filePath: string): Promise<ParsedPackageInput[]> {
const content = readDlcFileWithLimit(filePath).toString("ascii").trim();
if (content.length < 89) {
return [];
}
const dlcKey = content.slice(-88);
const dlcData = content.slice(0, -88);
const rcUrl = DLC_SERVICE_URL.replace("{KEY}", encodeURIComponent(dlcKey));
const rcResponse = await fetch(rcUrl, { method: "GET", signal: AbortSignal.timeout(30000) });
if (!rcResponse.ok) {
return [];
}
const rcText = await rcResponse.text();
const rcMatch = rcText.match(/<rc>(.*?)<\/rc>/i);
if (!rcMatch) {
return [];
}
const realKey = decryptRcPayload(rcMatch[1]).subarray(0, 16);
const encrypted = Buffer.from(dlcData, "base64");
const decipher = crypto.createDecipheriv("aes-128-cbc", realKey, realKey);
decipher.setAutoPadding(false);
let decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]);
if (decrypted.length === 0) {
return [];
}
const pad = decrypted[decrypted.length - 1];
if (pad > 0 && pad <= 16 && pad <= decrypted.length) {
let validPad = true;
for (let index = 1; index <= pad; index += 1) {
if (decrypted[decrypted.length - index] !== pad) {
validPad = false;
break;
}
}
if (validPad) {
decrypted = decrypted.subarray(0, decrypted.length - pad);
}
}
const xmlData = Buffer.from(decrypted.toString("utf8"), "base64").toString("utf8");
return parsePackagesFromDlcXml(xmlData);
}
function extractLinksFromResponse(text: string): string[] {
const payload = decodeDcryptPayload(text);
let links = extractUrlsRecursive(payload);
if (links.length === 0) {
links = extractUrlsRecursive(text);
}
return uniquePreserveOrder(links.filter((l) => isHttpLink(l)));
}
async function tryDcryptUpload(fileContent: Buffer, fileName: string): Promise<string[] | null> {
const blob = new Blob([new Uint8Array(fileContent)]);
const form = new FormData();
form.set("dlcfile", blob, fileName);
const response = await fetch(DCRYPT_UPLOAD_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
});
if (response.status === 413) {
return null;
}
const text = await response.text();
if (!response.ok) {
throw new Error(compactErrorText(text));
}
return extractLinksFromResponse(text);
}
async function tryDcryptPaste(fileContent: Buffer): Promise<string[] | null> {
const form = new FormData();
form.set("content", fileContent.toString("ascii"));
const response = await fetch(DCRYPT_PASTE_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
});
if (response.status === 413) {
return null;
}
const text = await response.text();
if (!response.ok) {
throw new Error(compactErrorText(text));
}
return extractLinksFromResponse(text);
}
async function decryptDlcViaDcrypt(filePath: string): Promise<ParsedPackageInput[]> {
const fileContent = readDlcFileWithLimit(filePath);
const fileName = path.basename(filePath);
const packageName = sanitizeFilename(path.basename(filePath, ".dlc")) || "Paket";
let links = await tryDcryptUpload(fileContent, fileName);
if (links === null) {
links = await tryDcryptPaste(fileContent);
}
if (links === null) {
throw new Error("DLC-Datei zu groß für dcrypt.it");
}
if (links.length === 0) {
return [];
}
return [{ name: packageName, links }];
}
export async function importDlcContainers(filePaths: string[]): Promise<ParsedPackageInput[]> {
const out: ParsedPackageInput[] = [];
const failures: string[] = [];
let sawDlc = false;
for (const filePath of filePaths) {
if (path.extname(filePath).toLowerCase() !== ".dlc") {
continue;
}
sawDlc = true;
let packages: ParsedPackageInput[] = [];
let fileFailed = false;
const fileFailureReasons: string[] = [];
try {
packages = await decryptDlcLocal(filePath);
} catch (error) {
if (isContainerSizeValidationError(error)) {
failures.push(`${path.basename(filePath)}: ${compactErrorText(error)}`);
continue;
}
fileFailed = true;
fileFailureReasons.push(`lokal: ${compactErrorText(error)}`);
packages = [];
}
if (packages.length === 0) {
try {
packages = await decryptDlcViaDcrypt(filePath);
} catch (error) {
if (isContainerSizeValidationError(error)) {
failures.push(`${path.basename(filePath)}: ${compactErrorText(error)}`);
continue;
}
fileFailed = true;
fileFailureReasons.push(`dcrypt: ${compactErrorText(error)}`);
packages = [];
}
}
if (packages.length === 0 && fileFailed) {
failures.push(`${path.basename(filePath)}: ${fileFailureReasons.join("; ")}`);
}
out.push(...packages);
}
if (out.length === 0 && sawDlc && failures.length > 0) {
const details = failures.slice(0, 2).join(" | ");
const suffix = failures.length > 2 ? ` (+${failures.length - 2} weitere)` : "";
throw new Error(`DLC konnte nicht importiert werden: ${details}${suffix}`);
}
return out;
}
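The inline padding check in `decryptDlcLocal` above (DLC payloads are AES-CBC with auto-padding disabled, so the PKCS#7-style pad is validated and stripped by hand) can be factored out as a small pure function. A sketch, with `stripPkcs7` as an illustrative name:

```typescript
// Last byte gives the pad length; strip only if every pad byte matches it,
// otherwise return the buffer unchanged (as decryptDlcLocal does).
function stripPkcs7(buf: Buffer): Buffer {
  if (buf.length === 0) {
    return buf;
  }
  const pad = buf[buf.length - 1];
  if (pad === 0 || pad > 16 || pad > buf.length) {
    return buf;                                  // not a plausible pad length
  }
  for (let index = 1; index <= pad; index += 1) {
    if (buf[buf.length - index] !== pad) {
      return buf;                                // inconsistent padding: keep as-is
    }
  }
  return buf.subarray(0, buf.length - pad);
}

const padded = Buffer.concat([Buffer.from("data"), Buffer.alloc(12, 12)]);
console.log(stripPkcs7(padded).toString("utf8")); // "data"
```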

src/main/debrid.ts (new file, +1358 lines; diff suppressed because it is too large)

src/main/debug-server.ts (new file, +279 lines)
import http from "node:http";
import fs from "node:fs";
import path from "node:path";
import { logger, getLogFilePath } from "./logger";
import type { DownloadManager } from "./download-manager";
const DEFAULT_PORT = 9868;
const MAX_LOG_LINES = 10000;
let server: http.Server | null = null;
let manager: DownloadManager | null = null;
let authToken = "";
function loadToken(baseDir: string): string {
const tokenPath = path.join(baseDir, "debug_token.txt");
try {
return fs.readFileSync(tokenPath, "utf8").trim();
} catch {
return "";
}
}
function getPort(baseDir: string): number {
const portPath = path.join(baseDir, "debug_port.txt");
try {
const n = Number(fs.readFileSync(portPath, "utf8").trim());
if (Number.isFinite(n) && n >= 1024 && n <= 65535) {
return n;
}
} catch {
// ignore
}
return DEFAULT_PORT;
}
function checkAuth(req: http.IncomingMessage): boolean {
if (!authToken) {
return false;
}
const header = req.headers.authorization || "";
if (header === `Bearer ${authToken}`) {
return true;
}
const url = new URL(req.url || "/", "http://localhost");
return url.searchParams.get("token") === authToken;
}
function jsonResponse(res: http.ServerResponse, status: number, data: unknown): void {
const body = JSON.stringify(data, null, 2);
res.writeHead(status, {
"Content-Type": "application/json; charset=utf-8",
"Access-Control-Allow-Origin": "*",
"Cache-Control": "no-cache"
});
res.end(body);
}
function readLogTail(lines: number): string[] {
const logPath = getLogFilePath();
try {
const content = fs.readFileSync(logPath, "utf8");
const allLines = content.split("\n").filter((l) => l.trim().length > 0);
return allLines.slice(-Math.min(lines, MAX_LOG_LINES));
} catch {
return ["(Log-Datei nicht lesbar)"];
}
}
function handleRequest(req: http.IncomingMessage, res: http.ServerResponse): void {
if (req.method === "OPTIONS") {
res.writeHead(204, {
"Access-Control-Allow-Origin": "*",
"Access-Control-Allow-Headers": "Authorization"
});
res.end();
return;
}
if (!checkAuth(req)) {
jsonResponse(res, 401, { error: "Unauthorized" });
return;
}
const url = new URL(req.url || "/", "http://localhost");
const pathname = url.pathname;
if (pathname === "/health") {
jsonResponse(res, 200, {
status: "ok",
uptime: Math.floor(process.uptime()),
memoryMB: Math.round(process.memoryUsage().rss / 1024 / 1024)
});
return;
}
if (pathname === "/log") {
const requested = Number(url.searchParams.get("lines") || "100");
const count = Number.isFinite(requested) ? Math.max(1, Math.min(requested, MAX_LOG_LINES)) : 100;
const grep = url.searchParams.get("grep") || "";
let lines = readLogTail(count);
if (grep) {
const pattern = grep.toLowerCase();
lines = lines.filter((l) => l.toLowerCase().includes(pattern));
}
jsonResponse(res, 200, { lines, count: lines.length });
return;
}
if (pathname === "/status") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const items = Object.values(snapshot.session.items);
const packages = Object.values(snapshot.session.packages);
const byStatus: Record<string, number> = {};
for (const item of items) {
byStatus[item.status] = (byStatus[item.status] || 0) + 1;
}
const activeItems = items
.filter((i) => i.status === "downloading" || i.status === "validating")
.map((i) => ({
id: i.id,
fileName: i.fileName,
status: i.status,
fullStatus: i.fullStatus,
provider: i.provider,
progress: i.progressPercent,
speedMBs: +(i.speedBps / 1024 / 1024).toFixed(2),
downloadedMB: +(i.downloadedBytes / 1024 / 1024).toFixed(1),
totalMB: i.totalBytes ? +(i.totalBytes / 1024 / 1024).toFixed(1) : null,
retries: i.retries,
lastError: i.lastError
}));
const failedItems = items
.filter((i) => i.status === "failed")
.map((i) => ({
fileName: i.fileName,
lastError: i.lastError,
retries: i.retries,
provider: i.provider
}));
jsonResponse(res, 200, {
running: snapshot.session.running,
paused: snapshot.session.paused,
speed: snapshot.speedText,
eta: snapshot.etaText,
itemCounts: byStatus,
totalItems: items.length,
packages: packages.map((p) => ({
name: p.name,
status: p.status,
items: p.itemIds.length
})),
activeItems,
failedItems: failedItems.length > 0 ? failedItems : undefined
});
return;
}
if (pathname === "/items") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const filter = url.searchParams.get("status");
const pkg = url.searchParams.get("package");
let items = Object.values(snapshot.session.items);
if (filter) {
items = items.filter((i) => i.status === filter);
}
if (pkg) {
const pkgLower = pkg.toLowerCase();
const matchedPkg = Object.values(snapshot.session.packages)
.find((p) => p.name.toLowerCase().includes(pkgLower));
if (matchedPkg) {
const ids = new Set(matchedPkg.itemIds);
items = items.filter((i) => ids.has(i.id));
}
}
jsonResponse(res, 200, {
count: items.length,
items: items.map((i) => ({
fileName: i.fileName,
status: i.status,
fullStatus: i.fullStatus,
provider: i.provider,
progress: i.progressPercent,
speedMBs: +(i.speedBps / 1024 / 1024).toFixed(2),
downloadedMB: +(i.downloadedBytes / 1024 / 1024).toFixed(1),
totalMB: i.totalBytes ? +(i.totalBytes / 1024 / 1024).toFixed(1) : null,
retries: i.retries,
lastError: i.lastError
}))
});
return;
}
if (pathname === "/session") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const pkg = url.searchParams.get("package");
if (pkg) {
const pkgLower = pkg.toLowerCase();
const matchedPkg = Object.values(snapshot.session.packages)
.find((p) => p.name.toLowerCase().includes(pkgLower));
if (matchedPkg) {
const ids = new Set(matchedPkg.itemIds);
const pkgItems = Object.values(snapshot.session.items)
.filter((i) => ids.has(i.id));
jsonResponse(res, 200, {
package: matchedPkg,
items: pkgItems
});
return;
}
}
jsonResponse(res, 200, {
running: snapshot.session.running,
paused: snapshot.session.paused,
packageCount: Object.keys(snapshot.session.packages).length,
itemCount: Object.keys(snapshot.session.items).length,
packages: Object.values(snapshot.session.packages).map((p) => ({
id: p.id,
name: p.name,
status: p.status,
items: p.itemIds.length
}))
});
return;
}
jsonResponse(res, 404, {
error: "Not found",
endpoints: [
"GET /health",
"GET /log?lines=100&grep=keyword",
"GET /status",
"GET /items?status=downloading&package=Bloodline",
"GET /session?package=Criminal"
]
});
}
export function startDebugServer(mgr: DownloadManager, baseDir: string): void {
authToken = loadToken(baseDir);
if (!authToken) {
logger.info("Debug server: no token in debug_token.txt, server will not be started");
return;
}
manager = mgr;
const port = getPort(baseDir);
server = http.createServer(handleRequest);
server.listen(port, "127.0.0.1", () => {
logger.info(`Debug server started on port ${port}`);
});
server.on("error", (err) => {
logger.warn(`Debug server error: ${String(err)}`);
server = null;
});
}
export function stopDebugServer(): void {
if (server) {
server.close();
server = null;
logger.info("Debug server stopped");
}
}
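The debug server's `checkAuth` accepts the token either as a `Bearer` header or as a `?token=` query parameter. A minimal sketch of a URL builder for the query-parameter path, using only the routes and parameters shown above (port and token values are placeholders):

```typescript
// Builds a debug-server URL that authenticates via the ?token= query
// parameter, the fallback path accepted by checkAuth().
function buildDebugUrl(
  port: number,
  token: string,
  pathname: string,
  params: Record<string, string> = {}
): string {
  const url = new URL(pathname, `http://127.0.0.1:${port}`);
  url.searchParams.set("token", token);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}

// Example: the last 50 log lines containing "extract"
const logUrl = buildDebugUrl(9868, "secret", "/log", { lines: "50", grep: "extract" });
// → http://127.0.0.1:9868/log?token=secret&lines=50&grep=extract
```

The server binds to 127.0.0.1 only, so these URLs work from the local machine (or through an SSH tunnel).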

src/main/download-manager.ts (new file, 7365 lines; diff suppressed: too large)

src/main/extractor.ts (new file, 2637 lines; diff suppressed: too large)

src/main/integrity.ts (new file, 163 lines)
import fs from "node:fs";
import path from "node:path";
import crypto from "node:crypto";
import { ParsedHashEntry } from "../shared/types";
import { MAX_MANIFEST_FILE_BYTES } from "./constants";
const manifestCache = new Map<string, { at: number; entries: Map<string, ParsedHashEntry> }>();
const MANIFEST_CACHE_TTL_MS = 15000;
function normalizeManifestKey(value: string): string {
return String(value || "")
.replace(/\\/g, "/")
.replace(/^\.\//, "")
.trim()
.toLowerCase();
}
export function parseHashLine(line: string): ParsedHashEntry | null {
const text = String(line || "").trim();
if (!text || text.startsWith(";")) {
return null;
}
const md = text.match(/^([0-9a-fA-F]{32}|[0-9a-fA-F]{40})\s+\*?(.+)$/);
if (md) {
const digest = md[1].toLowerCase();
return {
fileName: md[2].trim(),
algorithm: digest.length === 32 ? "md5" : "sha1",
digest
};
}
const sfv = text.match(/^(.+?)\s+([0-9A-Fa-f]{8})$/);
if (sfv) {
return {
fileName: sfv[1].trim(),
algorithm: "crc32",
digest: sfv[2].toLowerCase()
};
}
return null;
}
export function readHashManifest(packageDir: string): Map<string, ParsedHashEntry> {
const cacheKey = path.resolve(packageDir);
const cached = manifestCache.get(cacheKey);
if (cached && Date.now() - cached.at <= MANIFEST_CACHE_TTL_MS) {
return new Map(cached.entries);
}
const map = new Map<string, ParsedHashEntry>();
const patterns: Array<[string, "crc32" | "md5" | "sha1"]> = [
[".sfv", "crc32"],
[".md5", "md5"],
[".sha1", "sha1"]
];
if (!fs.existsSync(packageDir)) {
return map;
}
const manifestFiles = fs.readdirSync(packageDir, { withFileTypes: true })
.filter((entry) => {
if (!entry.isFile()) {
return false;
}
const ext = path.extname(entry.name).toLowerCase();
return patterns.some(([pattern]) => pattern === ext);
})
.sort((a, b) => a.name.localeCompare(b.name, undefined, { numeric: true, sensitivity: "base" }));
for (const entry of manifestFiles) {
const ext = path.extname(entry.name).toLowerCase();
const hit = patterns.find(([pattern]) => pattern === ext);
if (!hit) {
continue;
}
const filePath = path.join(packageDir, entry.name);
let lines: string[];
try {
const stat = fs.statSync(filePath);
if (stat.size > MAX_MANIFEST_FILE_BYTES) {
continue;
}
lines = fs.readFileSync(filePath, "utf8").split(/\r?\n/);
} catch {
continue;
}
for (const line of lines) {
const parsed = parseHashLine(line);
if (!parsed) {
continue;
}
const normalized: ParsedHashEntry = {
...parsed,
algorithm: hit[1]
};
const key = normalizeManifestKey(parsed.fileName);
if (map.has(key)) {
continue;
}
map.set(key, normalized);
}
}
manifestCache.set(cacheKey, { at: Date.now(), entries: new Map(map) });
return map;
}
const crcTable = new Int32Array(256);
for (let i = 0; i < 256; i++) {
let c = i;
for (let j = 0; j < 8; j++) c = c & 1 ? (0xedb88320 ^ (c >>> 1)) : (c >>> 1);
crcTable[i] = c;
}
function crc32Buffer(data: Buffer, seed = 0): number {
let crc = seed ^ -1;
for (let i = 0; i < data.length; i++) {
crc = (crc >>> 8) ^ crcTable[(crc ^ data[i]) & 0xff];
}
return crc ^ -1;
}
async function hashFile(filePath: string, algorithm: "crc32" | "md5" | "sha1"): Promise<string> {
if (algorithm === "crc32") {
const stream = fs.createReadStream(filePath, { highWaterMark: 1024 * 1024 });
let crc = 0;
for await (const chunk of stream) {
crc = crc32Buffer(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk), crc);
await new Promise(r => setImmediate(r));
}
return (crc >>> 0).toString(16).padStart(8, "0").toLowerCase();
}
const hash = crypto.createHash(algorithm);
const stream = fs.createReadStream(filePath, { highWaterMark: 1024 * 1024 });
return await new Promise<string>((resolve, reject) => {
stream.on("data", (chunk: string | Buffer) => hash.update(typeof chunk === "string" ? Buffer.from(chunk) : chunk));
stream.on("error", reject);
stream.on("end", () => resolve(hash.digest("hex").toLowerCase()));
});
}
export async function validateFileAgainstManifest(filePath: string, packageDir: string): Promise<{ ok: boolean; message: string }> {
const manifest = readHashManifest(packageDir);
if (manifest.size === 0) {
return { ok: true, message: "No hash available" };
}
const keyByBaseName = normalizeManifestKey(path.basename(filePath));
const keyByRelativePath = normalizeManifestKey(path.relative(packageDir, filePath));
const entry = manifest.get(keyByRelativePath) || manifest.get(keyByBaseName);
if (!entry) {
return { ok: true, message: "No hash for this file" };
}
const actual = await hashFile(filePath, entry.algorithm);
if (actual === entry.digest.toLowerCase()) {
return { ok: true, message: `${entry.algorithm.toUpperCase()} ok` };
}
return { ok: false, message: `${entry.algorithm.toUpperCase()} mismatch` };
}
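The table-driven CRC32 above uses the standard reflected polynomial 0xEDB88320 (the zlib/SFV variant), which can be sanity-checked against the well-known check value for the ASCII string "123456789". The same computation reduced to one self-contained function:

```typescript
// Reflected CRC-32 (polynomial 0xEDB88320), matching the table-driven
// implementation above, over a single in-memory Buffer.
const TABLE = new Int32Array(256);
for (let i = 0; i < 256; i++) {
  let c = i;
  for (let j = 0; j < 8; j++) c = c & 1 ? (0xedb88320 ^ (c >>> 1)) : (c >>> 1);
  TABLE[i] = c;
}

function crc32Hex(data: Buffer): string {
  let crc = -1; // initial value 0xFFFFFFFF
  for (let i = 0; i < data.length; i++) {
    crc = (crc >>> 8) ^ TABLE[(crc ^ data[i]) & 0xff];
  }
  // final XOR, then >>> 0 to force an unsigned 32-bit result
  return ((crc ^ -1) >>> 0).toString(16).padStart(8, "0");
}

// Standard check value: CRC-32("123456789") → cbf43926
```

This is the same 8-hex-digit lowercase form the SFV branch of `hashFile` produces, so it compares directly against `.sfv` manifest entries.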

src/main/link-parser.ts (new file, 26 lines)
import { ParsedPackageInput } from "../shared/types";
import { inferPackageNameFromLinks, parsePackagesFromLinksText, sanitizeFilename, uniquePreserveOrder } from "./utils";
export function mergePackageInputs(packages: ParsedPackageInput[]): ParsedPackageInput[] {
const grouped = new Map<string, string[]>();
for (const pkg of packages) {
const name = sanitizeFilename(pkg.name || inferPackageNameFromLinks(pkg.links));
const list = grouped.get(name) ?? [];
for (const link of pkg.links) {
list.push(link);
}
grouped.set(name, list);
}
return Array.from(grouped.entries()).map(([name, links]) => ({
name,
links: uniquePreserveOrder(links)
}));
}
export function parseCollectorInput(rawText: string, packageName = ""): ParsedPackageInput[] {
const parsed = parsePackagesFromLinksText(rawText, packageName);
if (parsed.length === 0) {
return [];
}
return mergePackageInputs(parsed);
}
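`mergePackageInputs` groups links by (sanitized) package name and de-duplicates them while keeping first-seen order. The grouping core can be sketched in isolation; `sanitizeFilename`, `inferPackageNameFromLinks`, and `uniquePreserveOrder` live in utils and are replaced here by a trivial name pass-through and an inline order-preserving dedupe:

```typescript
type PackageInput = { name: string; links: string[] };

// Order-preserving de-duplication (stand-in for uniquePreserveOrder).
function dedupe(links: string[]): string[] {
  const seen = new Set<string>();
  return links.filter((l) => !seen.has(l) && (seen.add(l), true));
}

// Merge packages that share a name; links keep their first-seen order.
function mergeByName(packages: PackageInput[]): PackageInput[] {
  const grouped = new Map<string, string[]>();
  for (const pkg of packages) {
    const list = grouped.get(pkg.name) ?? [];
    list.push(...pkg.links);
    grouped.set(pkg.name, list);
  }
  return Array.from(grouped.entries()).map(([name, links]) => ({ name, links: dedupe(links) }));
}

const merged = mergeByName([
  { name: "Show", links: ["https://a/1", "https://a/2"] },
  { name: "Show", links: ["https://a/2", "https://a/3"] }
]);
// merged → [{ name: "Show", links: ["https://a/1", "https://a/2", "https://a/3"] }]
```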

src/main/logger.ts (new file, 225 lines)
import fs from "node:fs";
import path from "node:path";
let logFilePath = path.resolve(process.cwd(), "rd_downloader.log");
let fallbackLogFilePath: string | null = null;
const LOG_FLUSH_INTERVAL_MS = 120;
const LOG_BUFFER_LIMIT_CHARS = 1_000_000;
const LOG_MAX_FILE_BYTES = 10 * 1024 * 1024;
const rotateCheckAtByFile = new Map<string, number>();
type LogListener = (line: string) => void;
let logListener: LogListener | null = null;
let pendingLines: string[] = [];
let pendingChars = 0;
let flushTimer: NodeJS.Timeout | null = null;
let flushInFlight = false;
let exitHookAttached = false;
export function setLogListener(listener: LogListener | null): void {
logListener = listener;
}
export function configureLogger(baseDir: string): void {
logFilePath = path.join(baseDir, "rd_downloader.log");
const cwdLogPath = path.resolve(process.cwd(), "rd_downloader.log");
fallbackLogFilePath = cwdLogPath === logFilePath ? null : cwdLogPath;
}
function appendLine(filePath: string, line: string): { ok: boolean; errorText: string } {
try {
fs.mkdirSync(path.dirname(filePath), { recursive: true });
fs.appendFileSync(filePath, line, "utf8");
return { ok: true, errorText: "" };
} catch (error) {
return { ok: false, errorText: String(error) };
}
}
async function appendChunk(filePath: string, chunk: string): Promise<{ ok: boolean; errorText: string }> {
try {
await fs.promises.mkdir(path.dirname(filePath), { recursive: true });
await fs.promises.appendFile(filePath, chunk, "utf8");
return { ok: true, errorText: "" };
} catch (error) {
return { ok: false, errorText: String(error) };
}
}
function writeStderr(text: string): void {
try {
process.stderr.write(text);
} catch {
// ignore stderr failures
}
}
function flushSyncPending(): void {
if (pendingLines.length === 0) {
return;
}
const chunk = pendingLines.join("");
pendingLines = [];
pendingChars = 0;
rotateIfNeeded(logFilePath);
const primary = appendLine(logFilePath, chunk);
if (fallbackLogFilePath) {
rotateIfNeeded(fallbackLogFilePath);
const fallback = appendLine(fallbackLogFilePath, chunk);
if (!primary.ok && !fallback.ok) {
writeStderr(`LOGGER write failed (primary+fallback): ${primary.errorText} | ${fallback.errorText}\n`);
}
return;
}
if (!primary.ok) {
writeStderr(`LOGGER write failed: ${primary.errorText}\n`);
}
}
function scheduleFlush(immediate = false): void {
if (flushInFlight) {
return;
}
if (immediate) {
if (flushTimer) {
clearTimeout(flushTimer);
flushTimer = null;
}
void flushAsync();
return;
}
if (flushTimer) {
return;
}
flushTimer = setTimeout(() => {
flushTimer = null;
void flushAsync();
}, LOG_FLUSH_INTERVAL_MS);
}
function rotateIfNeeded(filePath: string): void {
try {
const now = Date.now();
const lastRotateCheckAt = rotateCheckAtByFile.get(filePath) || 0;
if (now - lastRotateCheckAt < 60_000) {
return;
}
rotateCheckAtByFile.set(filePath, now);
const stat = fs.statSync(filePath);
if (stat.size < LOG_MAX_FILE_BYTES) {
return;
}
const backup = `${filePath}.old`;
try {
fs.rmSync(backup, { force: true });
} catch {
// ignore
}
fs.renameSync(filePath, backup);
} catch {
// ignore - file may not exist yet
}
}
async function rotateIfNeededAsync(filePath: string): Promise<void> {
try {
const now = Date.now();
const lastRotateCheckAt = rotateCheckAtByFile.get(filePath) || 0;
if (now - lastRotateCheckAt < 60_000) {
return;
}
rotateCheckAtByFile.set(filePath, now);
const stat = await fs.promises.stat(filePath);
if (stat.size < LOG_MAX_FILE_BYTES) {
return;
}
const backup = `${filePath}.old`;
await fs.promises.rm(backup, { force: true }).catch(() => {});
await fs.promises.rename(filePath, backup);
} catch {
// ignore - file may not exist yet
}
}
async function flushAsync(): Promise<void> {
if (flushInFlight || pendingLines.length === 0) {
return;
}
flushInFlight = true;
const linesSnapshot = pendingLines.slice();
const chunk = linesSnapshot.join("");
try {
await rotateIfNeededAsync(logFilePath);
const primary = await appendChunk(logFilePath, chunk);
let wroteAny = primary.ok;
if (fallbackLogFilePath) {
await rotateIfNeededAsync(fallbackLogFilePath);
const fallback = await appendChunk(fallbackLogFilePath, chunk);
wroteAny = wroteAny || fallback.ok;
if (!primary.ok && !fallback.ok) {
writeStderr(`LOGGER write failed (primary+fallback): ${primary.errorText} | ${fallback.errorText}\n`);
}
} else if (!primary.ok) {
writeStderr(`LOGGER write failed: ${primary.errorText}\n`);
}
if (wroteAny) {
pendingLines = pendingLines.slice(linesSnapshot.length);
pendingChars = Math.max(0, pendingChars - chunk.length);
}
} finally {
flushInFlight = false;
if (pendingLines.length > 0) {
scheduleFlush();
}
}
}
function ensureExitHook(): void {
if (exitHookAttached) {
return;
}
exitHookAttached = true;
process.once("beforeExit", flushSyncPending);
process.once("exit", flushSyncPending);
}
function write(level: "INFO" | "WARN" | "ERROR", message: string): void {
ensureExitHook();
const line = `${new Date().toISOString()} [${level}] ${message}\n`;
pendingLines.push(line);
pendingChars += line.length;
if (logListener) {
try { logListener(line); } catch { /* ignore */ }
}
while (pendingChars > LOG_BUFFER_LIMIT_CHARS && pendingLines.length > 1) {
const removed = pendingLines.shift();
if (!removed) {
break;
}
pendingChars = Math.max(0, pendingChars - removed.length);
}
if (level === "ERROR") {
scheduleFlush(true);
return;
}
scheduleFlush();
}
export const logger = {
info: (msg: string): void => write("INFO", msg),
warn: (msg: string): void => write("WARN", msg),
error: (msg: string): void => write("ERROR", msg)
};
export function getLogFilePath(): string {
return logFilePath;
}
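The buffer cap in `write()` drops the oldest pending lines once `LOG_BUFFER_LIMIT_CHARS` is exceeded, but always keeps at least the newest line so an error is never silently lost. The trimming loop in isolation, under the same invariant:

```typescript
// Drops oldest lines until the character budget is met, but never
// removes the last remaining line (mirrors the while loop in write()).
function trimOldest(lines: string[], limitChars: number): string[] {
  const out = lines.slice();
  let chars = out.reduce((sum, l) => sum + l.length, 0);
  while (chars > limitChars && out.length > 1) {
    const removed = out.shift();
    if (!removed) break;
    chars -= removed.length;
  }
  return out;
}

// With a 10-char budget, the two oldest 6-char lines are dropped:
const kept = trimOldest(["aaaaa\n", "bbbbb\n", "ccccc\n"], 10);
// kept → ["ccccc\n"]
```

Because the newest line survives even when it alone exceeds the budget, a single oversized message still reaches the log on the next flush.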

src/main/main.ts (new file, 524 lines)
import fs from "node:fs";
import path from "node:path";
import { app, BrowserWindow, clipboard, dialog, ipcMain, IpcMainInvokeEvent, Menu, shell, Tray } from "electron";
import { AddLinksPayload, AppSettings, UpdateInstallProgress } from "../shared/types";
import { AppController } from "./app-controller";
import { IPC_CHANNELS } from "../shared/ipc";
import { getLogFilePath, logger } from "./logger";
import { APP_NAME } from "./constants";
import { extractHttpLinksFromText } from "./utils";
import { cleanupStaleSubstDrives, shutdownDaemon } from "./extractor";
/* ── IPC validation helpers ────────────────────────────────────── */
function validateString(value: unknown, name: string): string {
if (typeof value !== "string") {
throw new Error(`${name} must be a string`);
}
return value;
}
function validatePlainObject(value: unknown, name: string): Record<string, unknown> {
if (!value || typeof value !== "object" || Array.isArray(value)) {
throw new Error(`${name} must be an object`);
}
return value as Record<string, unknown>;
}
const IMPORT_QUEUE_MAX_BYTES = 10 * 1024 * 1024;
const RENAME_PACKAGE_MAX_CHARS = 240;
function validateStringArray(value: unknown, name: string): string[] {
if (!Array.isArray(value) || !value.every(v => typeof v === "string")) {
throw new Error(`${name} must be a string array`);
}
return value as string[];
}
/* ── Single Instance Lock ───────────────────────────────────────── */
const gotLock = app.requestSingleInstanceLock();
if (!gotLock) {
app.exit(0);
process.exit(0);
}
/* ── Unhandled error protection ─────────────────────────────────── */
process.on("uncaughtException", (error) => {
logger.error(`Uncaught Exception: ${String(error?.stack || error)}`);
});
process.on("unhandledRejection", (reason) => {
logger.error(`Unhandled Rejection: ${String(reason)}`);
});
let mainWindow: BrowserWindow | null = null;
let tray: Tray | null = null;
let clipboardTimer: ReturnType<typeof setInterval> | null = null;
let updateQuitTimer: ReturnType<typeof setTimeout> | null = null;
let lastClipboardText = "";
const controller = new AppController();
const CLIPBOARD_MAX_TEXT_CHARS = 50_000;
function isDevMode(): boolean {
return process.env.NODE_ENV === "development";
}
function createWindow(): BrowserWindow {
const window = new BrowserWindow({
width: 1440,
height: 940,
minWidth: 1120,
minHeight: 760,
backgroundColor: "#070b14",
title: `${APP_NAME} - v${controller.getVersion()}`,
icon: path.join(app.getAppPath(), "assets", "app_icon.ico"),
webPreferences: {
contextIsolation: true,
nodeIntegration: false,
preload: path.join(__dirname, "../preload/preload.js")
}
});
if (!isDevMode()) {
window.webContents.session.webRequest.onHeadersReceived((details, callback) => {
callback({
responseHeaders: {
...details.responseHeaders,
"Content-Security-Policy": [
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu https://git.24-music.de https://ddownload.com https://ddl.to"
]
}
});
});
}
window.setMenuBarVisibility(false);
window.setAutoHideMenuBar(true);
if (isDevMode()) {
void window.loadURL("http://localhost:5173");
} else {
void window.loadFile(path.join(app.getAppPath(), "build", "renderer", "index.html"));
}
return window;
}
function bindMainWindowLifecycle(window: BrowserWindow): void {
window.on("close", (event) => {
const settings = controller.getSettings();
if (settings.minimizeToTray && tray) {
event.preventDefault();
window.hide();
}
});
window.on("closed", () => {
if (mainWindow === window) {
mainWindow = null;
}
});
}
function createTray(): void {
if (tray) {
return;
}
const iconPath = path.join(app.getAppPath(), "assets", "app_icon.ico");
try {
tray = new Tray(iconPath);
} catch {
return;
}
tray.setToolTip(APP_NAME);
const contextMenu = Menu.buildFromTemplate([
{ label: "Show", click: () => { mainWindow?.show(); mainWindow?.focus(); } },
{ type: "separator" },
{ label: "Start", click: () => { void controller.start().catch((err) => logger.warn(`Tray start error: ${String(err)}`)); } },
{ label: "Stop", click: () => { controller.stop(); } },
{ type: "separator" },
{ label: "Quit", click: () => { app.quit(); } }
]);
tray.setContextMenu(contextMenu);
tray.on("double-click", () => {
mainWindow?.show();
mainWindow?.focus();
});
}
function destroyTray(): void {
if (tray) {
tray.destroy();
tray = null;
}
}
function extractLinksFromText(text: string): string[] {
return extractHttpLinksFromText(text);
}
function normalizeClipboardText(text: string): string {
const truncateUnicodeSafe = (value: string, maxChars: number): string => {
if (value.length <= maxChars) {
return value;
}
const points = Array.from(value);
if (points.length <= maxChars) {
return value;
}
return points.slice(0, maxChars).join("");
};
const normalized = String(text || "");
if (normalized.length <= CLIPBOARD_MAX_TEXT_CHARS) {
return normalized;
}
const truncated = truncateUnicodeSafe(normalized, CLIPBOARD_MAX_TEXT_CHARS);
const lastBreak = Math.max(
truncated.lastIndexOf("\n"),
truncated.lastIndexOf("\r"),
truncated.lastIndexOf("\t"),
truncated.lastIndexOf(" ")
);
if (lastBreak >= Math.floor(CLIPBOARD_MAX_TEXT_CHARS * 0.7)) {
return truncated.slice(0, lastBreak);
}
return truncated;
}
function startClipboardWatcher(): void {
if (clipboardTimer) {
return;
}
lastClipboardText = normalizeClipboardText(clipboard.readText());
clipboardTimer = setInterval(() => {
let text: string;
try {
text = normalizeClipboardText(clipboard.readText());
} catch {
return;
}
if (text === lastClipboardText || !text.trim()) {
return;
}
lastClipboardText = text;
const links = extractLinksFromText(text);
if (links.length > 0 && mainWindow && !mainWindow.isDestroyed()) {
mainWindow.webContents.send(IPC_CHANNELS.CLIPBOARD_DETECTED, links);
}
}, 2000);
}
function stopClipboardWatcher(): void {
if (clipboardTimer) {
clearInterval(clipboardTimer);
clipboardTimer = null;
}
}
function updateClipboardWatcher(): void {
const settings = controller.getSettings();
if (settings.clipboardWatch) {
startClipboardWatcher();
} else {
stopClipboardWatcher();
}
}
function updateTray(): void {
const settings = controller.getSettings();
if (settings.minimizeToTray) {
createTray();
} else {
destroyTray();
}
}
function registerIpcHandlers(): void {
ipcMain.handle(IPC_CHANNELS.GET_SNAPSHOT, () => controller.getSnapshot());
ipcMain.handle(IPC_CHANNELS.GET_VERSION, () => controller.getVersion());
ipcMain.handle(IPC_CHANNELS.CHECK_UPDATES, async () => controller.checkUpdates());
ipcMain.handle(IPC_CHANNELS.INSTALL_UPDATE, async () => {
const result = await controller.installUpdate((progress: UpdateInstallProgress) => {
if (!mainWindow || mainWindow.isDestroyed()) {
return;
}
mainWindow.webContents.send(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, progress);
});
if (result.started) {
updateQuitTimer = setTimeout(() => {
app.quit();
}, 2500);
}
return result;
});
ipcMain.handle(IPC_CHANNELS.OPEN_EXTERNAL, async (_event: IpcMainInvokeEvent, rawUrl: string) => {
try {
const parsed = new URL(String(rawUrl || "").trim());
if (parsed.protocol !== "https:" && parsed.protocol !== "http:") {
return false;
}
await shell.openExternal(parsed.toString());
return true;
} catch {
return false;
}
});
ipcMain.handle(IPC_CHANNELS.UPDATE_SETTINGS, (_event: IpcMainInvokeEvent, partial: Partial<AppSettings>) => {
const validated = validatePlainObject(partial ?? {}, "partial");
const result = controller.updateSettings(validated as Partial<AppSettings>);
updateClipboardWatcher();
updateTray();
return result;
});
ipcMain.handle(IPC_CHANNELS.ADD_LINKS, (_event: IpcMainInvokeEvent, payload: AddLinksPayload) => {
validatePlainObject(payload ?? {}, "payload");
validateString(payload?.rawText, "rawText");
if (payload.packageName !== undefined) {
validateString(payload.packageName, "packageName");
}
if (payload.duplicatePolicy !== undefined && payload.duplicatePolicy !== "keep" && payload.duplicatePolicy !== "skip" && payload.duplicatePolicy !== "overwrite") {
throw new Error("duplicatePolicy must be 'keep', 'skip' or 'overwrite'");
}
return controller.addLinks(payload);
});
ipcMain.handle(IPC_CHANNELS.ADD_CONTAINERS, async (_event: IpcMainInvokeEvent, filePaths: string[]) => {
const validPaths = validateStringArray(filePaths ?? [], "filePaths");
const safePaths = validPaths.filter((p) => path.isAbsolute(p));
return controller.addContainers(safePaths);
});
ipcMain.handle(IPC_CHANNELS.GET_START_CONFLICTS, () => controller.getStartConflicts());
ipcMain.handle(IPC_CHANNELS.RESOLVE_START_CONFLICT, (_event: IpcMainInvokeEvent, packageId: string, policy: "keep" | "skip" | "overwrite") => {
validateString(packageId, "packageId");
validateString(policy, "policy");
if (policy !== "keep" && policy !== "skip" && policy !== "overwrite") {
throw new Error("policy must be 'keep', 'skip' or 'overwrite'");
}
return controller.resolveStartConflict(packageId, policy);
});
ipcMain.handle(IPC_CHANNELS.CLEAR_ALL, () => controller.clearAll());
ipcMain.handle(IPC_CHANNELS.START, () => controller.start());
ipcMain.handle(IPC_CHANNELS.START_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
validateStringArray(packageIds ?? [], "packageIds");
return controller.startPackages(packageIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.START_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.startItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.STOP, () => controller.stop());
ipcMain.handle(IPC_CHANNELS.TOGGLE_PAUSE, () => controller.togglePause());
ipcMain.handle(IPC_CHANNELS.CANCEL_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.cancelPackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.RENAME_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string, newName: string) => {
validateString(packageId, "packageId");
validateString(newName, "newName");
if (newName.length > RENAME_PACKAGE_MAX_CHARS) {
throw new Error(`newName too long (max ${RENAME_PACKAGE_MAX_CHARS} characters)`);
}
return controller.renamePackage(packageId, newName);
});
ipcMain.handle(IPC_CHANNELS.REORDER_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
validateStringArray(packageIds, "packageIds");
return controller.reorderPackages(packageIds);
});
ipcMain.handle(IPC_CHANNELS.REMOVE_ITEM, (_event: IpcMainInvokeEvent, itemId: string) => {
validateString(itemId, "itemId");
return controller.removeItem(itemId);
});
ipcMain.handle(IPC_CHANNELS.TOGGLE_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.togglePackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.RETRY_EXTRACTION, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.retryExtraction(packageId);
});
ipcMain.handle(IPC_CHANNELS.EXTRACT_NOW, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.extractNow(packageId);
});
ipcMain.handle(IPC_CHANNELS.RESET_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.resetPackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.SET_PACKAGE_PRIORITY, (_event: IpcMainInvokeEvent, packageId: string, priority: string) => {
validateString(packageId, "packageId");
validateString(priority, "priority");
if (priority !== "high" && priority !== "normal" && priority !== "low") {
throw new Error("priority must be 'high', 'normal' or 'low'");
}
return controller.setPackagePriority(packageId, priority);
});
ipcMain.handle(IPC_CHANNELS.SKIP_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.skipItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.RESET_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.resetItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.GET_HISTORY, () => controller.getHistory());
ipcMain.handle(IPC_CHANNELS.CLEAR_HISTORY, () => controller.clearHistory());
ipcMain.handle(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, (_event: IpcMainInvokeEvent, entryId: string) => {
validateString(entryId, "entryId");
return controller.removeHistoryEntry(entryId);
});
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, async () => {
const options = {
defaultPath: "rd-queue-export.json",
filters: [{ name: "Queue Export", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportQueue();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.IMPORT_QUEUE, (_event: IpcMainInvokeEvent, json: string) => {
validateString(json, "json");
const bytes = Buffer.byteLength(json, "utf8");
if (bytes > IMPORT_QUEUE_MAX_BYTES) {
throw new Error(`Queue import too large (max ${IMPORT_QUEUE_MAX_BYTES} bytes)`);
}
return controller.importQueue(json);
});
ipcMain.handle(IPC_CHANNELS.TOGGLE_CLIPBOARD, () => {
const settings = controller.getSettings();
const next = !settings.clipboardWatch;
controller.updateSettings({ clipboardWatch: next });
updateClipboardWatcher();
return next;
});
ipcMain.handle(IPC_CHANNELS.PICK_FOLDER, async () => {
const options = {
properties: ["openDirectory", "createDirectory"] as Array<"openDirectory" | "createDirectory">
};
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
return result.canceled ? null : result.filePaths[0] || null;
});
ipcMain.handle(IPC_CHANNELS.PICK_CONTAINERS, async () => {
const options = {
properties: ["openFile", "multiSelections"] as Array<"openFile" | "multiSelections">,
filters: [
{ name: "Container", extensions: ["dlc"] },
{ name: "All files", extensions: ["*"] }
]
};
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
return result.canceled ? [] : result.filePaths;
});
ipcMain.handle(IPC_CHANNELS.GET_SESSION_STATS, () => controller.getSessionStats());
ipcMain.handle(IPC_CHANNELS.RESTART, () => {
app.relaunch();
app.quit();
});
ipcMain.handle(IPC_CHANNELS.QUIT, () => {
app.quit();
});
ipcMain.handle(IPC_CHANNELS.EXPORT_BACKUP, async () => {
const options = {
defaultPath: `mdd-backup-${new Date().toISOString().slice(0, 10)}.json`,
filters: [{ name: "Backup", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportBackup();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.OPEN_LOG, async () => {
const logPath = getLogFilePath();
await shell.openPath(logPath);
});
ipcMain.handle(IPC_CHANNELS.OPEN_SESSION_LOG, async () => {
const logPath = controller.getSessionLogPath();
if (logPath) {
await shell.openPath(logPath);
}
});
ipcMain.handle(IPC_CHANNELS.IMPORT_BACKUP, async () => {
const options = {
properties: ["openFile"] as Array<"openFile">,
filters: [
{ name: "Backup", extensions: ["json"] },
{ name: "All files", extensions: ["*"] }
]
};
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
if (result.canceled || result.filePaths.length === 0) {
return { restored: false, message: "Abgebrochen" };
}
const filePath = result.filePaths[0];
const stat = await fs.promises.stat(filePath);
const BACKUP_MAX_BYTES = 50 * 1024 * 1024;
if (stat.size > BACKUP_MAX_BYTES) {
return { restored: false, message: `Backup-Datei zu groß (max 50 MB, Datei hat ${(stat.size / 1024 / 1024).toFixed(1)} MB)` };
}
const json = await fs.promises.readFile(filePath, "utf8");
return controller.importBackup(json);
});
controller.onState = (snapshot) => {
if (!mainWindow || mainWindow.isDestroyed()) {
return;
}
mainWindow.webContents.send(IPC_CHANNELS.STATE_UPDATE, snapshot);
};
}
app.on("second-instance", () => {
if (mainWindow) {
if (mainWindow.isMinimized()) {
mainWindow.restore();
}
mainWindow.show();
mainWindow.focus();
}
});
app.whenReady().then(() => {
cleanupStaleSubstDrives();
registerIpcHandlers();
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
updateClipboardWatcher();
updateTray();
app.on("activate", () => {
if (BrowserWindow.getAllWindows().length === 0) {
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
}
});
}).catch((error) => {
console.error("App startup failed:", error);
app.quit();
});
app.on("window-all-closed", () => {
if (process.platform !== "darwin") {
app.quit();
}
});
app.on("before-quit", () => {
if (updateQuitTimer) { clearTimeout(updateQuitTimer); updateQuitTimer = null; }
stopClipboardWatcher();
destroyTray();
shutdownDaemon();
try {
controller.shutdown();
} catch (error) {
logger.error(`Fehler beim Shutdown: ${String(error)}`);
}
});


@@ -0,0 +1,424 @@
import { UnrestrictedLink } from "./realdebrid";
import { compactErrorText, filenameFromUrl, sleep } from "./utils";
type MegaCredentials = {
login: string;
password: string;
};
type CodeEntry = {
code: string;
linkHint: string;
};
const LOGIN_URL = "https://www.mega-debrid.eu/index.php?form=login";
const DEBRID_URL = "https://www.mega-debrid.eu/index.php?form=debrid";
const DEBRID_AJAX_URL = "https://www.mega-debrid.eu/index.php?ajax=debrid&json";
const DEBRID_REFERER = "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de";
function normalizeLink(link: string): string {
return link.trim().toLowerCase();
}
function parseSetCookieFromHeaders(headers: Headers): string {
const getSetCookie = (headers as unknown as { getSetCookie?: () => string[] }).getSetCookie;
if (typeof getSetCookie === "function") {
const values = getSetCookie.call(headers)
.map((entry) => entry.split(";")[0].trim())
.filter(Boolean);
if (values.length > 0) {
return values.join("; ");
}
}
const raw = headers.get("set-cookie") || "";
if (!raw) {
return "";
}
// A folded header may contain several cookies; split only on commas that
// precede a new name=value pair (Expires dates also contain commas).
return raw
.split(/,(?=[^;=]+?=)/g)
.map((chunk) => chunk.split(";")[0].trim())
.filter(Boolean)
.join("; ");
}
const PERMANENT_HOSTER_ERRORS = [
"hosternotavailable",
"filenotfound",
"file_unavailable",
"file not found",
"link is dead",
"file has been removed",
"file has been deleted",
"file was deleted",
"file was removed",
"not available",
"file is no longer available"
];
function parsePageErrors(html: string): string[] {
const errors: string[] = [];
const errorRegex = /class=["'][^"']*\berror\b[^"']*["'][^>]*>([^<]+)</gi;
let m: RegExpExecArray | null;
while ((m = errorRegex.exec(html)) !== null) {
const text = m[1].replace(/^Fehler:\s*/i, "").trim();
if (text) {
errors.push(text);
}
}
return errors;
}
function isPermanentHosterError(errors: string[]): string | null {
for (const err of errors) {
const lower = err.toLowerCase();
for (const pattern of PERMANENT_HOSTER_ERRORS) {
if (lower.includes(pattern)) {
return err;
}
}
}
return null;
}
function parseCodes(html: string): CodeEntry[] {
const entries: CodeEntry[] = [];
const cardRegex = /<div[^>]*class=['"][^'"]*acp-box[^'"]*['"][^>]*>[\s\S]*?<\/div>/gi;
let cardMatch: RegExpExecArray | null;
while ((cardMatch = cardRegex.exec(html)) !== null) {
const block = cardMatch[0];
const linkTitle = (block.match(/<h3>\s*Link:\s*([^<]+)<\/h3>/i)?.[1] || "").trim();
const code = block.match(/processDebrid\(\d+,'([^']+)',0\)/i)?.[1] || "";
if (!code) {
continue;
}
entries.push({ code, linkHint: normalizeLink(linkTitle) });
}
if (entries.length === 0) {
const fallbackRegex = /processDebrid\(\d+,'([^']+)',0\)/gi;
let m: RegExpExecArray | null;
while ((m = fallbackRegex.exec(html)) !== null) {
entries.push({ code: m[1], linkHint: "" });
}
}
return entries;
}
function pickCode(entries: CodeEntry[], link: string): string {
if (entries.length === 0) {
return "";
}
const target = normalizeLink(link);
const match = entries.find((entry) => entry.linkHint && entry.linkHint.includes(target));
return (match?.code || entries[0].code || "").trim();
}
function parseDebridJson(text: string): { link: string; text: string } | null {
try {
const parsed = JSON.parse(text) as { link?: string; text?: string };
return {
link: String(parsed.link || ""),
text: String(parsed.text || "")
};
} catch {
return null;
}
}
function abortError(): Error {
return new Error("aborted:mega-web");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
const timeoutSignal = AbortSignal.timeout(timeoutMs);
if (!signal) {
return timeoutSignal;
}
return AbortSignal.any([signal, timeoutSignal]);
}
function throwIfAborted(signal?: AbortSignal): void {
if (signal?.aborted) {
throw abortError();
}
}
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
if (!signal) {
await sleep(ms);
return;
}
if (signal.aborted) {
throw abortError();
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
signal.removeEventListener("abort", onAbort);
resolve();
}, Math.max(0, ms));
const onAbort = (): void => {
if (timer) {
clearTimeout(timer);
timer = null;
}
signal.removeEventListener("abort", onAbort);
reject(abortError());
};
signal.addEventListener("abort", onAbort, { once: true });
});
}
async function raceWithAbort<T>(promise: Promise<T>, signal?: AbortSignal): Promise<T> {
if (!signal) {
return promise;
}
if (signal.aborted) {
throw abortError();
}
return new Promise<T>((resolve, reject) => {
let settled = false;
const onAbort = (): void => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
reject(abortError());
};
signal.addEventListener("abort", onAbort, { once: true });
promise.then((value) => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
resolve(value);
}, (error) => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
reject(error);
});
});
}
export class MegaWebFallback {
private queue: Promise<unknown> = Promise.resolve();
private getCredentials: () => MegaCredentials;
private cookie = "";
private cookieSetAt = 0;
public constructor(getCredentials: () => MegaCredentials) {
this.getCredentials = getCredentials;
}
public async unrestrict(link: string, signal?: AbortSignal): Promise<UnrestrictedLink | null> {
const overallSignal = withTimeoutSignal(signal, 180000);
return this.runExclusive(async () => {
throwIfAborted(overallSignal);
const creds = this.getCredentials();
if (!creds.login.trim() || !creds.password.trim()) {
return null;
}
if (!this.cookie || Date.now() - this.cookieSetAt > 20 * 60 * 1000) {
await this.login(creds.login, creds.password, overallSignal);
}
const generated = await this.generate(link, overallSignal);
if (!generated) {
this.cookie = "";
await this.login(creds.login, creds.password, overallSignal);
const retry = await this.generate(link, overallSignal);
if (!retry) {
return null;
}
return {
directUrl: retry.directUrl,
fileName: retry.fileName || filenameFromUrl(link),
fileSize: null,
retriesUsed: 0
};
}
return {
directUrl: generated.directUrl,
fileName: generated.fileName || filenameFromUrl(link),
fileSize: null,
retriesUsed: 0
};
}, overallSignal);
}
public invalidateSession(): void {
this.cookie = "";
this.cookieSetAt = 0;
}
private async runExclusive<T>(job: () => Promise<T>, signal?: AbortSignal): Promise<T> {
const queuedAt = Date.now();
const QUEUE_WAIT_TIMEOUT_MS = 90000;
const guardedJob = async (): Promise<T> => {
throwIfAborted(signal);
const waited = Date.now() - queuedAt;
if (waited > QUEUE_WAIT_TIMEOUT_MS) {
throw new Error(`Mega-Web Queue-Timeout (${Math.floor(waited / 1000)}s gewartet)`);
}
return job();
};
// Chain behind the previous job regardless of its outcome; the stored
// queue promise is kept always-resolved so one failure cannot wedge it.
const run = this.queue.then(guardedJob, guardedJob);
this.queue = run.then(() => undefined, () => undefined);
return raceWithAbort(run, signal);
}
private async login(login: string, password: string, signal?: AbortSignal): Promise<void> {
throwIfAborted(signal);
const response = await fetch(LOGIN_URL, {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0"
},
body: new URLSearchParams({
login,
password,
remember: "on"
}),
redirect: "manual",
signal: withTimeoutSignal(signal, 30000)
});
const cookie = parseSetCookieFromHeaders(response.headers);
if (!cookie) {
throw new Error("Mega-Web Login liefert kein Session-Cookie");
}
const verify = await fetch(DEBRID_REFERER, {
method: "GET",
headers: {
"User-Agent": "Mozilla/5.0",
Cookie: cookie,
Referer: DEBRID_REFERER
},
signal: withTimeoutSignal(signal, 30000)
});
const verifyHtml = await verify.text();
const hasDebridForm = /id=["']debridForm["']/i.test(verifyHtml) || /name=["']links["']/i.test(verifyHtml);
if (!hasDebridForm) {
throw new Error("Mega-Web Login ungültig oder Session blockiert");
}
this.cookie = cookie;
this.cookieSetAt = Date.now();
}
private async generate(link: string, signal?: AbortSignal): Promise<{ directUrl: string; fileName: string } | null> {
throwIfAborted(signal);
const page = await fetch(DEBRID_URL, {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: this.cookie,
Referer: DEBRID_REFERER
},
body: new URLSearchParams({
links: link,
password: "",
showLinks: "1"
}),
signal: withTimeoutSignal(signal, 30000)
});
const html = await page.text();
// Check for permanent hoster errors before looking for debrid codes
const pageErrors = parsePageErrors(html);
const permanentError = isPermanentHosterError(pageErrors);
if (permanentError) {
throw new Error(`Mega-Web: Link permanent ungültig (${permanentError})`);
}
const code = pickCode(parseCodes(html), link);
if (!code) {
return null;
}
for (let attempt = 1; attempt <= 60; attempt += 1) {
throwIfAborted(signal);
const res = await fetch(DEBRID_AJAX_URL, {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: this.cookie,
Referer: DEBRID_REFERER
},
body: new URLSearchParams({
code,
autodl: "0"
}),
signal: withTimeoutSignal(signal, 15000)
});
const text = (await res.text()).trim();
if (text === "reload") {
await sleepWithSignal(650, signal);
continue;
}
if (text === "false") {
return null;
}
const parsed = parseDebridJson(text);
if (!parsed) {
return null;
}
if (!parsed.link) {
if (/hoster does not respond correctly|could not be done for this moment/i.test(parsed.text || "")) {
await sleepWithSignal(1200, signal);
continue;
}
return null;
}
const fromText = parsed.text
.replace(/<[^>]*>/g, " ")
.replace(/\s+/g, " ")
.trim();
const nameMatch = fromText.match(/([\w .\-\[\]\(\)]+\.(?:rar|r\d{2}|zip|7z|mkv|mp4|avi|mp3|flac))/i);
const fileName = (nameMatch?.[1] || filenameFromUrl(link)).trim();
return {
directUrl: parsed.link,
fileName
};
}
return null;
}
public dispose(): void {
this.cookie = "";
}
}
export function compactMegaWebError(error: unknown): string {
return compactErrorText(error);
}
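The comma-splitting fallback in `parseSetCookieFromHeaders` is the subtle part of this module. The sketch below is a hypothetical standalone replica of that branch (the function name `splitFoldedSetCookie` is illustrative, not from the source): some fetch implementations fold multiple `Set-Cookie` headers into one comma-joined string, so the split must fire only on commas that start a new `name=value` pair, never on the commas inside an `Expires` date.

```typescript
// Hypothetical replica of the fallback branch of parseSetCookieFromHeaders.
function splitFoldedSetCookie(raw: string): string {
  return raw
    .split(/,(?=[^;=]+?=)/g)                    // split only before "name=..."
    .map((chunk) => chunk.split(";")[0].trim()) // keep name=value, drop attributes
    .filter(Boolean)
    .join("; ");
}

const folded = "PHPSESSID=abc123; Path=/; HttpOnly, lang=de; expires=Mon, 01 Jan 2029 00:00:00 GMT";
console.log(splitFoldedSetCookie(folded)); // "PHPSESSID=abc123; lang=de"
```

When `Headers.getSetCookie()` is available it is preferred, since it returns the headers unfolded and makes this regex unnecessary.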

src/main/realdebrid.ts Normal file

@@ -0,0 +1,203 @@
import { API_BASE_URL, APP_VERSION, REQUEST_RETRIES } from "./constants";
import { compactErrorText, sleep } from "./utils";
const DEBRID_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;
export interface UnrestrictedLink {
fileName: string;
directUrl: string;
fileSize: number | null;
retriesUsed: number;
skipTlsVerify?: boolean;
}
function shouldRetryStatus(status: number): boolean {
return status === 429 || status >= 500;
}
function retryDelay(attempt: number): number {
return Math.min(5000, 400 * 2 ** attempt);
}
function parseRetryAfterMs(value: string | null): number {
const text = String(value || "").trim();
if (!text) {
return 0;
}
const asSeconds = Number(text);
if (Number.isFinite(asSeconds) && asSeconds >= 0) {
return Math.min(120000, Math.floor(asSeconds * 1000));
}
const asDate = Date.parse(text);
if (Number.isFinite(asDate)) {
return Math.min(120000, Math.max(0, asDate - Date.now()));
}
return 0;
}
function retryDelayForResponse(response: Response, attempt: number): number {
if (response.status !== 429) {
return retryDelay(attempt);
}
const fromHeader = parseRetryAfterMs(response.headers.get("retry-after"));
return fromHeader > 0 ? fromHeader : retryDelay(attempt);
}
function readHttpStatusFromErrorText(text: string): number {
const match = String(text || "").match(/HTTP\s+(\d{3})/i);
return match ? Number(match[1]) : 0;
}
function isRetryableErrorText(text: string): boolean {
const status = readHttpStatusFromErrorText(text);
if (status === 429 || status >= 500) {
return true;
}
const lower = String(text || "").toLowerCase();
return lower.includes("timeout")
|| lower.includes("network")
|| lower.includes("fetch failed")
|| lower.includes("aborted")
|| lower.includes("econnreset")
|| lower.includes("enotfound")
|| lower.includes("etimedout")
|| lower.includes("html statt json");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
if (!signal) {
return AbortSignal.timeout(timeoutMs);
}
return AbortSignal.any([signal, AbortSignal.timeout(timeoutMs)]);
}
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
if (!signal) {
await sleep(ms);
return;
}
// Check before entering the Promise constructor to avoid a race where the timer
// resolves before the aborted check runs (especially when ms=0).
if (signal.aborted) {
throw new Error("aborted");
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
signal.removeEventListener("abort", onAbort);
resolve();
}, Math.max(0, ms));
const onAbort = (): void => {
if (timer) {
clearTimeout(timer);
timer = null;
}
signal.removeEventListener("abort", onAbort);
reject(new Error("aborted"));
};
signal.addEventListener("abort", onAbort, { once: true });
});
}
function looksLikeHtmlResponse(contentType: string, body: string): boolean {
const type = String(contentType || "").toLowerCase();
if (type.includes("text/html") || type.includes("application/xhtml+xml")) {
return true;
}
return /^\s*<(!doctype\s+html|html\b)/i.test(String(body || ""));
}
function parseErrorBody(status: number, body: string, contentType: string): string {
if (looksLikeHtmlResponse(contentType, body)) {
return `Real-Debrid lieferte HTML statt JSON (HTTP ${status})`;
}
const clean = compactErrorText(body);
return clean || `HTTP ${status}`;
}
export class RealDebridClient {
private token: string;
public constructor(token: string) {
this.token = token;
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
try {
const body = new URLSearchParams({ link });
const response = await fetch(`${API_BASE_URL}/unrestrict/link`, {
method: "POST",
headers: {
Authorization: `Bearer ${this.token}`,
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": DEBRID_USER_AGENT
},
body,
signal: withTimeoutSignal(signal, 30000)
});
const text = await response.text();
const contentType = String(response.headers.get("content-type") || "");
if (!response.ok) {
const parsed = parseErrorBody(response.status, text, contentType);
if (shouldRetryStatus(response.status) && attempt < REQUEST_RETRIES) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
}
throw new Error(parsed);
}
if (looksLikeHtmlResponse(contentType, text)) {
throw new Error("Real-Debrid lieferte HTML statt JSON");
}
let payload: Record<string, unknown>;
try {
payload = JSON.parse(text) as Record<string, unknown>;
} catch {
throw new Error("Ungültige JSON-Antwort von Real-Debrid");
}
const directUrl = String(payload.download || payload.link || "").trim();
if (!directUrl) {
throw new Error("Unrestrict ohne Download-URL");
}
try {
const parsedUrl = new URL(directUrl);
if (parsedUrl.protocol !== "https:" && parsedUrl.protocol !== "http:") {
throw new Error(`Ungültiges Download-URL-Protokoll (${parsedUrl.protocol})`);
}
} catch (urlError) {
if (urlError instanceof Error && urlError.message.includes("Protokoll")) throw urlError;
throw new Error("Real-Debrid Antwort enthält keine gültige Download-URL");
}
const fileName = String(payload.filename || "download.bin").trim() || "download.bin";
const fileSizeRaw = Number(payload.filesize ?? NaN);
return {
fileName,
directUrl,
fileSize: Number.isFinite(fileSizeRaw) && fileSizeRaw > 0 ? Math.floor(fileSizeRaw) : null,
retriesUsed: attempt - 1
};
} catch (error) {
lastError = compactErrorText(error);
// Only treat "aborted" as a user cancel when it is not a timeout abort
// (AbortSignal.timeout rejections also read as "aborted").
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
break;
}
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(String(lastError || "Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}
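The retry timing in `RealDebridClient` combines capped exponential backoff with the server's `Retry-After` hint. The following is a hypothetical standalone replica of the two helpers, showing the concrete delays they produce: backoff doubles from 800 ms and caps at 5 s, while a 429's `Retry-After` (seconds or HTTP-date form) takes precedence when it yields a positive value, capped at 2 minutes.

```typescript
// Hypothetical replica of retryDelay / parseRetryAfterMs from realdebrid.ts.
function retryDelay(attempt: number): number {
  return Math.min(5000, 400 * 2 ** attempt); // 800, 1600, 3200, 5000, 5000, ...
}

function parseRetryAfterMs(value: string | null): number {
  const text = String(value ?? "").trim();
  if (!text) {
    return 0;
  }
  const asSeconds = Number(text);
  if (Number.isFinite(asSeconds) && asSeconds >= 0) {
    return Math.min(120000, Math.floor(asSeconds * 1000)); // capped at 2 minutes
  }
  const asDate = Date.parse(text); // HTTP-date form, e.g. "Wed, 21 Oct 2026 07:28:00 GMT"
  return Number.isFinite(asDate) ? Math.min(120000, Math.max(0, asDate - Date.now())) : 0;
}

console.log(retryDelay(1), retryDelay(4)); // 800 5000
console.log(parseRetryAfterMs("3"));       // 3000
```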

src/main/session-log.ts Normal file

@@ -0,0 +1,128 @@
import fs from "node:fs";
import path from "node:path";
import { setLogListener } from "./logger";
const SESSION_LOG_FLUSH_INTERVAL_MS = 200;
let sessionLogPath: string | null = null;
let sessionLogsDir: string | null = null;
let pendingLines: string[] = [];
let flushTimer: NodeJS.Timeout | null = null;
function formatTimestamp(): string {
const now = new Date();
const y = now.getFullYear();
const mo = String(now.getMonth() + 1).padStart(2, "0");
const d = String(now.getDate()).padStart(2, "0");
const h = String(now.getHours()).padStart(2, "0");
const mi = String(now.getMinutes()).padStart(2, "0");
const s = String(now.getSeconds()).padStart(2, "0");
return `${y}-${mo}-${d}_${h}-${mi}-${s}`;
}
function flushPending(): void {
if (pendingLines.length === 0 || !sessionLogPath) {
return;
}
const chunk = pendingLines.join("");
pendingLines = [];
try {
fs.appendFileSync(sessionLogPath, chunk, "utf8");
} catch {
// ignore write errors
}
}
function scheduleFlush(): void {
if (flushTimer) {
return;
}
flushTimer = setTimeout(() => {
flushTimer = null;
flushPending();
}, SESSION_LOG_FLUSH_INTERVAL_MS);
}
function appendToSessionLog(line: string): void {
if (!sessionLogPath) {
return;
}
pendingLines.push(line);
scheduleFlush();
}
async function cleanupOldSessionLogs(dir: string, maxAgeDays: number): Promise<void> {
try {
const files = await fs.promises.readdir(dir);
const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
for (const file of files) {
if (!file.startsWith("session_") || !file.endsWith(".txt")) {
continue;
}
const filePath = path.join(dir, file);
try {
const stat = await fs.promises.stat(filePath);
if (stat.mtimeMs < cutoff) {
await fs.promises.unlink(filePath);
}
} catch {
// ignore - file may be locked
}
}
} catch {
// ignore - dir may not exist
}
}
export function initSessionLog(baseDir: string): void {
sessionLogsDir = path.join(baseDir, "session-logs");
try {
fs.mkdirSync(sessionLogsDir, { recursive: true });
} catch {
sessionLogsDir = null;
return;
}
const timestamp = formatTimestamp();
sessionLogPath = path.join(sessionLogsDir, `session_${timestamp}.txt`);
const isoTimestamp = new Date().toISOString();
try {
fs.writeFileSync(sessionLogPath, `=== Session gestartet: ${isoTimestamp} ===\n`, "utf8");
} catch {
sessionLogPath = null;
return;
}
setLogListener((line) => appendToSessionLog(line));
void cleanupOldSessionLogs(sessionLogsDir, 7);
}
export function getSessionLogPath(): string | null {
return sessionLogPath;
}
export function shutdownSessionLog(): void {
if (!sessionLogPath) {
return;
}
// Flush any pending lines
if (flushTimer) {
clearTimeout(flushTimer);
flushTimer = null;
}
flushPending();
// Write closing line
const isoTimestamp = new Date().toISOString();
try {
fs.appendFileSync(sessionLogPath, `=== Session beendet: ${isoTimestamp} ===\n`, "utf8");
} catch {
// ignore
}
setLogListener(null);
sessionLogPath = null;
}
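The session log batches writes: lines accumulate in `pendingLines` and a 200 ms timer flushes them as one chunk, so a burst of log output costs one `appendFileSync` per flush window rather than one per line. A minimal synchronous sketch of that coalescing idea (the timer is elided; `writes` stands in for the file):

```typescript
// Hypothetical sketch of the write coalescing in session-log.ts.
let pending: string[] = [];
const writes: string[] = []; // stands in for fs.appendFileSync calls

function append(line: string): void {
  pending.push(line); // cheap: buffer only; the real code schedules a 200 ms flush
}

function flush(): void {
  if (pending.length === 0) {
    return;
  }
  writes.push(pending.join("")); // one write for the whole batch
  pending = [];
}

append("first line\n");
append("second line\n");
flush();
console.log(writes.length);             // 1
console.log(JSON.stringify(writes[0])); // "first line\nsecond line\n"
```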

src/main/storage.ts Normal file

@@ -0,0 +1,747 @@
import fs from "node:fs";
import fsp from "node:fs/promises";
import path from "node:path";
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, HistoryEntry, PackageEntry, PackagePriority, SessionState } from "../shared/types";
import { defaultSettings } from "./constants";
import { logger } from "./logger";
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload"]);
const VALID_CLEANUP_MODES = new Set(["none", "trash", "delete"]);
const VALID_CONFLICT_MODES = new Set(["overwrite", "skip", "rename", "ask"]);
const VALID_FINISHED_POLICIES = new Set(["never", "immediate", "on_start", "package_done"]);
const VALID_SPEED_MODES = new Set(["global", "per_download"]);
const VALID_THEMES = new Set(["dark", "light"]);
const VALID_EXTRACT_CPU_PRIORITIES = new Set(["high", "middle", "low"]);
const VALID_PACKAGE_PRIORITIES = new Set<string>(["high", "normal", "low"]);
const VALID_DOWNLOAD_STATUSES = new Set<DownloadStatus>([
"queued", "validating", "downloading", "paused", "reconnect_wait", "extracting", "integrity_check", "completed", "failed", "cancelled"
]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload"]);
const VALID_ONLINE_STATUSES = new Set(["online", "offline", "checking"]);
function asText(value: unknown): string {
return String(value ?? "").trim();
}
function clampNumber(value: unknown, fallback: number, min: number, max: number): number {
const num = Number(value);
if (!Number.isFinite(num)) {
return fallback;
}
return Math.max(min, Math.min(max, Math.floor(num)));
}
function createScheduleId(index: number): string {
return `sched-${Date.now().toString(36)}-${index.toString(36)}-${Math.random().toString(36).slice(2, 8)}`;
}
function normalizeBandwidthSchedules(raw: unknown): BandwidthScheduleEntry[] {
if (!Array.isArray(raw)) {
return [];
}
const normalized: BandwidthScheduleEntry[] = [];
for (let index = 0; index < raw.length; index += 1) {
const entry = raw[index];
if (!entry || typeof entry !== "object") {
continue;
}
const value = entry as Partial<BandwidthScheduleEntry>;
const rawId = typeof value.id === "string" ? value.id.trim() : "";
normalized.push({
id: rawId || createScheduleId(index),
startHour: clampNumber(value.startHour, 0, 0, 23),
endHour: clampNumber(value.endHour, 8, 0, 23),
speedLimitKbps: clampNumber(value.speedLimitKbps, 0, 0, 500000),
enabled: value.enabled === undefined ? true : Boolean(value.enabled)
});
}
return normalized;
}
function normalizeAbsoluteDir(value: unknown, fallback: string): string {
const text = asText(value);
if (!text || !path.isAbsolute(text)) {
return path.resolve(fallback);
}
return path.resolve(text);
}
const DEFAULT_COLUMN_ORDER = ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"];
const ALL_VALID_COLUMNS = new Set([...DEFAULT_COLUMN_ORDER, "added"]);
function normalizeColumnOrder(raw: unknown): string[] {
if (!Array.isArray(raw) || raw.length === 0) {
return [...DEFAULT_COLUMN_ORDER];
}
const valid = ALL_VALID_COLUMNS;
const seen = new Set<string>();
const result: string[] = [];
for (const col of raw) {
if (typeof col === "string" && valid.has(col) && !seen.has(col)) {
seen.add(col);
result.push(col);
}
}
// "name" is mandatory — ensure it's always present
if (!seen.has("name")) {
result.unshift("name");
}
return result;
}
const DEPRECATED_UPDATE_REPOS = new Set([
"sucukdeluxe/real-debrid-downloader"
]);
function migrateUpdateRepo(raw: string, fallback: string): string {
const trimmed = raw.trim();
if (!trimmed || DEPRECATED_UPDATE_REPOS.has(trimmed.toLowerCase())) {
return fallback;
}
return trimmed;
}
export function normalizeSettings(settings: AppSettings): AppSettings {
const defaults = defaultSettings();
const normalized: AppSettings = {
token: asText(settings.token),
megaLogin: asText(settings.megaLogin),
megaPassword: asText(settings.megaPassword),
bestToken: asText(settings.bestToken),
allDebridToken: asText(settings.allDebridToken),
ddownloadLogin: asText(settings.ddownloadLogin),
ddownloadPassword: asText(settings.ddownloadPassword),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n|\r/g, "\n"),
rememberToken: Boolean(settings.rememberToken),
providerPrimary: settings.providerPrimary,
providerSecondary: settings.providerSecondary,
providerTertiary: settings.providerTertiary,
autoProviderFallback: Boolean(settings.autoProviderFallback),
outputDir: normalizeAbsoluteDir(settings.outputDir, defaults.outputDir),
packageName: asText(settings.packageName),
autoExtract: Boolean(settings.autoExtract),
autoRename4sf4sj: Boolean(settings.autoRename4sf4sj),
extractDir: normalizeAbsoluteDir(settings.extractDir, defaults.extractDir),
collectMkvToLibrary: Boolean(settings.collectMkvToLibrary),
mkvLibraryDir: normalizeAbsoluteDir(settings.mkvLibraryDir, defaults.mkvLibraryDir),
createExtractSubfolder: Boolean(settings.createExtractSubfolder),
hybridExtract: Boolean(settings.hybridExtract),
cleanupMode: settings.cleanupMode,
extractConflictMode: settings.extractConflictMode,
removeLinkFilesAfterExtract: Boolean(settings.removeLinkFilesAfterExtract),
removeSamplesAfterExtract: Boolean(settings.removeSamplesAfterExtract),
enableIntegrityCheck: Boolean(settings.enableIntegrityCheck),
autoResumeOnStart: Boolean(settings.autoResumeOnStart),
autoReconnect: Boolean(settings.autoReconnect),
maxParallel: clampNumber(settings.maxParallel, defaults.maxParallel, 1, 50),
maxParallelExtract: clampNumber(settings.maxParallelExtract, defaults.maxParallelExtract, 1, 8),
retryLimit: clampNumber(settings.retryLimit, defaults.retryLimit, 0, 99),
reconnectWaitSeconds: clampNumber(settings.reconnectWaitSeconds, defaults.reconnectWaitSeconds, 10, 600),
completedCleanupPolicy: settings.completedCleanupPolicy,
speedLimitEnabled: Boolean(settings.speedLimitEnabled),
speedLimitKbps: clampNumber(settings.speedLimitKbps, defaults.speedLimitKbps, 0, 500000),
speedLimitMode: settings.speedLimitMode,
autoUpdateCheck: Boolean(settings.autoUpdateCheck),
updateRepo: migrateUpdateRepo(asText(settings.updateRepo), defaults.updateRepo),
clipboardWatch: Boolean(settings.clipboardWatch),
minimizeToTray: Boolean(settings.minimizeToTray),
collapseNewPackages: settings.collapseNewPackages !== undefined ? Boolean(settings.collapseNewPackages) : defaults.collapseNewPackages,
autoSkipExtracted: settings.autoSkipExtracted !== undefined ? Boolean(settings.autoSkipExtracted) : defaults.autoSkipExtracted,
confirmDeleteSelection: settings.confirmDeleteSelection !== undefined ? Boolean(settings.confirmDeleteSelection) : defaults.confirmDeleteSelection,
totalDownloadedAllTime: typeof settings.totalDownloadedAllTime === "number" && settings.totalDownloadedAllTime >= 0 ? settings.totalDownloadedAllTime : defaults.totalDownloadedAllTime,
theme: VALID_THEMES.has(settings.theme) ? settings.theme : defaults.theme,
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules),
columnOrder: normalizeColumnOrder(settings.columnOrder),
extractCpuPriority: settings.extractCpuPriority,
autoExtractWhenStopped: settings.autoExtractWhenStopped !== undefined ? Boolean(settings.autoExtractWhenStopped) : defaults.autoExtractWhenStopped
};
if (!VALID_PRIMARY_PROVIDERS.has(normalized.providerPrimary)) {
normalized.providerPrimary = defaults.providerPrimary;
}
if (!VALID_FALLBACK_PROVIDERS.has(normalized.providerSecondary)) {
normalized.providerSecondary = "none";
}
if (!VALID_FALLBACK_PROVIDERS.has(normalized.providerTertiary)) {
normalized.providerTertiary = "none";
}
if (normalized.providerSecondary === normalized.providerPrimary) {
normalized.providerSecondary = "none";
}
if (normalized.providerTertiary === normalized.providerPrimary || normalized.providerTertiary === normalized.providerSecondary) {
normalized.providerTertiary = "none";
}
if (!VALID_CLEANUP_MODES.has(normalized.cleanupMode)) {
normalized.cleanupMode = defaults.cleanupMode;
}
if (!VALID_CONFLICT_MODES.has(normalized.extractConflictMode)) {
normalized.extractConflictMode = defaults.extractConflictMode;
}
if (!VALID_FINISHED_POLICIES.has(normalized.completedCleanupPolicy)) {
normalized.completedCleanupPolicy = defaults.completedCleanupPolicy;
}
if (!VALID_SPEED_MODES.has(normalized.speedLimitMode)) {
normalized.speedLimitMode = defaults.speedLimitMode;
}
if (!VALID_EXTRACT_CPU_PRIORITIES.has(normalized.extractCpuPriority)) {
normalized.extractCpuPriority = defaults.extractCpuPriority;
}
return normalized;
}
function sanitizeCredentialPersistence(settings: AppSettings): AppSettings {
if (settings.rememberToken) {
return settings;
}
return {
...settings,
token: "",
megaLogin: "",
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: ""
};
}
export interface StoragePaths {
baseDir: string;
configFile: string;
sessionFile: string;
historyFile: string;
}
export function createStoragePaths(baseDir: string): StoragePaths {
return {
baseDir,
configFile: path.join(baseDir, "rd_downloader_config.json"),
sessionFile: path.join(baseDir, "rd_session_state.json"),
historyFile: path.join(baseDir, "rd_history.json")
};
}
function ensureBaseDir(baseDir: string): void {
fs.mkdirSync(baseDir, { recursive: true });
}
function asRecord(value: unknown): Record<string, unknown> | null {
if (!value || typeof value !== "object" || Array.isArray(value)) {
return null;
}
return value as Record<string, unknown>;
}
function readSettingsFile(filePath: string): AppSettings | null {
try {
const parsed = JSON.parse(fs.readFileSync(filePath, "utf8")) as AppSettings;
const merged = normalizeSettings({
...defaultSettings(),
...parsed
});
return sanitizeCredentialPersistence(merged);
} catch {
return null;
}
}
export function normalizeLoadedSession(raw: unknown): SessionState {
const fallback = emptySession();
const parsed = asRecord(raw);
if (!parsed) {
return fallback;
}
const now = Date.now();
const itemsById: Record<string, DownloadItem> = {};
const rawItems = asRecord(parsed.items) ?? {};
for (const [entryId, rawItem] of Object.entries(rawItems)) {
const item = asRecord(rawItem);
if (!item) {
continue;
}
const id = asText(item.id) || entryId;
const packageId = asText(item.packageId);
const url = asText(item.url);
if (!id || !packageId || !url) {
continue;
}
const statusRaw = asText(item.status) as DownloadStatus;
const status: DownloadStatus = VALID_DOWNLOAD_STATUSES.has(statusRaw) ? statusRaw : "queued";
const providerRaw = asText(item.provider) as DebridProvider;
const onlineStatusRaw = asText(item.onlineStatus);
itemsById[id] = {
id,
packageId,
url,
provider: VALID_ITEM_PROVIDERS.has(providerRaw) ? providerRaw : null,
status,
retries: clampNumber(item.retries, 0, 0, 1_000_000),
speedBps: clampNumber(item.speedBps, 0, 0, 10_000_000_000),
downloadedBytes: clampNumber(item.downloadedBytes, 0, 0, 10_000_000_000_000),
totalBytes: item.totalBytes == null ? null : clampNumber(item.totalBytes, 0, 0, 10_000_000_000_000),
progressPercent: clampNumber(item.progressPercent, 0, 0, 100),
fileName: asText(item.fileName) || "download.bin",
targetPath: asText(item.targetPath),
resumable: item.resumable === undefined ? true : Boolean(item.resumable),
attempts: clampNumber(item.attempts, 0, 0, 10_000),
lastError: asText(item.lastError),
fullStatus: asText(item.fullStatus),
onlineStatus: VALID_ONLINE_STATUSES.has(onlineStatusRaw) ? onlineStatusRaw as "online" | "offline" | "checking" : undefined,
createdAt: clampNumber(item.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(item.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
}
const packagesById: Record<string, PackageEntry> = {};
const rawPackages = asRecord(parsed.packages) ?? {};
for (const [entryId, rawPkg] of Object.entries(rawPackages)) {
const pkg = asRecord(rawPkg);
if (!pkg) {
continue;
}
const id = asText(pkg.id) || entryId;
if (!id) {
continue;
}
const statusRaw = asText(pkg.status) as DownloadStatus;
const status: DownloadStatus = VALID_DOWNLOAD_STATUSES.has(statusRaw) ? statusRaw : "queued";
const rawItemIds = Array.isArray(pkg.itemIds) ? pkg.itemIds : [];
packagesById[id] = {
id,
name: asText(pkg.name) || "Paket",
outputDir: asText(pkg.outputDir),
extractDir: asText(pkg.extractDir),
status,
itemIds: rawItemIds
.map((value) => asText(value))
.filter((value) => value.length > 0),
cancelled: Boolean(pkg.cancelled),
enabled: pkg.enabled === undefined ? true : Boolean(pkg.enabled),
priority: VALID_PACKAGE_PRIORITIES.has(asText(pkg.priority)) ? asText(pkg.priority) as PackagePriority : "normal",
createdAt: clampNumber(pkg.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(pkg.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
}
for (const [itemId, item] of Object.entries(itemsById)) {
if (!packagesById[item.packageId]) {
delete itemsById[itemId];
}
}
for (const pkg of Object.values(packagesById)) {
pkg.itemIds = pkg.itemIds.filter((itemId) => {
const item = itemsById[itemId];
return Boolean(item) && item.packageId === pkg.id;
});
}
const rawOrder = Array.isArray(parsed.packageOrder) ? parsed.packageOrder : [];
const seenOrder = new Set<string>();
const packageOrder = rawOrder
.map((entry) => asText(entry))
.filter((id) => {
if (!(id in packagesById) || seenOrder.has(id)) {
return false;
}
seenOrder.add(id);
return true;
});
for (const packageId of Object.keys(packagesById)) {
if (!seenOrder.has(packageId)) {
seenOrder.add(packageId);
packageOrder.push(packageId);
}
}
return {
...fallback,
version: clampNumber(parsed.version, fallback.version, 1, 10),
packageOrder,
packages: packagesById,
items: itemsById,
runStartedAt: clampNumber(parsed.runStartedAt, 0, 0, Number.MAX_SAFE_INTEGER),
totalDownloadedBytes: clampNumber(parsed.totalDownloadedBytes, 0, 0, Number.MAX_SAFE_INTEGER),
summaryText: asText(parsed.summaryText),
reconnectUntil: clampNumber(parsed.reconnectUntil, 0, 0, Number.MAX_SAFE_INTEGER),
reconnectReason: asText(parsed.reconnectReason),
paused: Boolean(parsed.paused),
running: Boolean(parsed.running),
updatedAt: clampNumber(parsed.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
}
export function loadSettings(paths: StoragePaths): AppSettings {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.configFile)) {
return defaultSettings();
}
const loaded = readSettingsFile(paths.configFile);
if (loaded) {
return loaded;
}
const backupFile = `${paths.configFile}.bak`;
const backupLoaded = fs.existsSync(backupFile) ? readSettingsFile(backupFile) : null;
if (backupLoaded) {
logger.warn("Konfiguration defekt, Backup-Datei wird verwendet");
try {
const payload = JSON.stringify(backupLoaded, null, 2);
const tempPath = `${paths.configFile}.tmp`;
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch {
// ignore restore write failure
}
return backupLoaded;
}
logger.error("Konfiguration konnte nicht geladen werden (auch Backup fehlgeschlagen)");
return defaultSettings();
}
function syncRenameWithExdevFallback(tempPath: string, targetPath: string): void {
try {
fs.renameSync(tempPath, targetPath);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
fs.copyFileSync(tempPath, targetPath);
try { fs.rmSync(tempPath, { force: true }); } catch {}
} else {
throw renameError;
}
}
}
function sessionTempPath(sessionFile: string, kind: "sync" | "async"): string {
return `${sessionFile}.${kind}.tmp`;
}
function sessionBackupPath(sessionFile: string): string {
return `${sessionFile}.bak`;
}
export function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
// Reset transient fields that may be stale from a previous crash
const ACTIVE_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const item of Object.values(session.items)) {
if (ACTIVE_STATUSES.has(item.status)) {
item.status = "queued";
item.lastError = "";
}
// Always clear stale speed values
item.speedBps = 0;
}
// Reset package-level active statuses to queued (same set as the item reset above)
const ACTIVE_PKG_STATUSES = ACTIVE_STATUSES;
for (const pkg of Object.values(session.packages)) {
if (ACTIVE_PKG_STATUSES.has(pkg.status)) {
pkg.status = "queued";
}
pkg.postProcessLabel = undefined;
}
// Clear stale session-level running/paused flags
session.running = false;
session.paused = false;
return session;
}
function readSessionFile(filePath: string): SessionState | null {
try {
const parsed = JSON.parse(fs.readFileSync(filePath, "utf8")) as unknown;
return normalizeLoadedSessionTransientFields(normalizeLoadedSession(parsed));
} catch {
return null;
}
}
export function saveSettings(paths: StoragePaths, settings: AppSettings): void {
ensureBaseDir(paths.baseDir);
// Create a backup of the existing config before overwriting
if (fs.existsSync(paths.configFile)) {
try {
fs.copyFileSync(paths.configFile, `${paths.configFile}.bak`);
} catch {
// Best-effort backup; proceed even if it fails
}
}
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
const tempPath = `${paths.configFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSettingsSaveRunning = false;
let asyncSettingsSaveQueued: { paths: StoragePaths; settings: AppSettings } | null = null;
async function writeSettingsPayload(paths: StoragePaths, payload: string): Promise<void> {
await fsp.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.configFile, `${paths.configFile}.bak`).catch(() => {});
const tempPath = `${paths.configFile}.settings.tmp`;
await fsp.writeFile(tempPath, payload, "utf8");
try {
await fsp.rename(tempPath, paths.configFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
await fsp.copyFile(tempPath, paths.configFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
}
export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettings): Promise<void> {
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
if (asyncSettingsSaveRunning) {
asyncSettingsSaveQueued = { paths, settings };
return;
}
asyncSettingsSaveRunning = true;
try {
await writeSettingsPayload(paths, payload);
} catch (error) {
logger.error(`Async Settings-Save fehlgeschlagen: ${String(error)}`);
} finally {
asyncSettingsSaveRunning = false;
if (asyncSettingsSaveQueued) {
const queued = asyncSettingsSaveQueued;
asyncSettingsSaveQueued = null;
void saveSettingsAsync(queued.paths, queued.settings);
}
}
}
export function emptySession(): SessionState {
return {
version: 2,
packageOrder: [],
packages: {},
items: {},
runStartedAt: 0,
totalDownloadedBytes: 0,
summaryText: "",
reconnectUntil: 0,
reconnectReason: "",
paused: false,
running: false,
updatedAt: Date.now()
};
}
export function loadSession(paths: StoragePaths): SessionState {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.sessionFile)) {
return emptySession();
}
const primary = readSessionFile(paths.sessionFile);
if (primary) {
return primary;
}
const backupFile = sessionBackupPath(paths.sessionFile);
const backup = fs.existsSync(backupFile) ? readSessionFile(backupFile) : null;
if (backup) {
logger.warn("Session defekt, Backup-Datei wird verwendet");
try {
const payload = JSON.stringify({ ...backup, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch {
// ignore restore write failure
}
return backup;
}
logger.error("Session konnte nicht geladen werden (auch Backup fehlgeschlagen)");
return emptySession();
}
export function saveSession(paths: StoragePaths, session: SessionState): void {
syncSaveGeneration += 1;
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.sessionFile)) {
try {
fs.copyFileSync(paths.sessionFile, sessionBackupPath(paths.sessionFile));
} catch {
// Best-effort backup; proceed even if it fails
}
}
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSaveRunning = false;
let asyncSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let syncSaveGeneration = 0;
async function writeSessionPayload(paths: StoragePaths, payload: string, generation: number): Promise<void> {
await fsp.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.sessionFile, sessionBackupPath(paths.sessionFile)).catch(() => {});
const tempPath = sessionTempPath(paths.sessionFile, "async");
await fsp.writeFile(tempPath, payload, "utf8");
// If a synchronous save occurred after this async save started, discard the stale write
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
try {
await fsp.rename(tempPath, paths.sessionFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
await fsp.copyFile(tempPath, paths.sessionFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
}
async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Promise<void> {
if (asyncSaveRunning) {
asyncSaveQueued = { paths, payload };
return;
}
asyncSaveRunning = true;
const gen = syncSaveGeneration;
try {
await writeSessionPayload(paths, payload, gen);
} catch (error) {
logger.error(`Async Session-Save fehlgeschlagen: ${String(error)}`);
} finally {
asyncSaveRunning = false;
if (asyncSaveQueued) {
const queued = asyncSaveQueued;
asyncSaveQueued = null;
void saveSessionPayloadAsync(queued.paths, queued.payload);
}
}
}
export function cancelPendingAsyncSaves(): void {
asyncSaveQueued = null;
asyncSettingsSaveQueued = null;
syncSaveGeneration += 1;
}
export async function saveSessionAsync(paths: StoragePaths, session: SessionState): Promise<void> {
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
await saveSessionPayloadAsync(paths, payload);
}
const MAX_HISTORY_ENTRIES = 500;
function normalizeHistoryEntry(raw: unknown, index: number): HistoryEntry | null {
const entry = asRecord(raw);
if (!entry) return null;
const id = asText(entry.id) || `hist-${Date.now().toString(36)}-${index}`;
const name = asText(entry.name) || "Unbenannt";
const providerRaw = asText(entry.provider);
return {
id,
name,
totalBytes: clampNumber(entry.totalBytes, 0, 0, Number.MAX_SAFE_INTEGER),
downloadedBytes: clampNumber(entry.downloadedBytes, 0, 0, Number.MAX_SAFE_INTEGER),
fileCount: clampNumber(entry.fileCount, 0, 0, 100000),
provider: VALID_ITEM_PROVIDERS.has(providerRaw as DebridProvider) ? providerRaw as DebridProvider : null,
completedAt: clampNumber(entry.completedAt, Date.now(), 0, Number.MAX_SAFE_INTEGER),
durationSeconds: clampNumber(entry.durationSeconds, 0, 0, Number.MAX_SAFE_INTEGER),
status: entry.status === "deleted" ? "deleted" : "completed",
outputDir: asText(entry.outputDir),
urls: Array.isArray(entry.urls) ? (entry.urls as unknown[]).map(String).filter(Boolean) : undefined
};
}
export function loadHistory(paths: StoragePaths): HistoryEntry[] {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.historyFile)) {
return [];
}
try {
const raw = JSON.parse(fs.readFileSync(paths.historyFile, "utf8")) as unknown;
if (!Array.isArray(raw)) return [];
const entries: HistoryEntry[] = [];
for (let i = 0; i < raw.length && entries.length < MAX_HISTORY_ENTRIES; i++) {
const normalized = normalizeHistoryEntry(raw[i], i);
if (normalized) entries.push(normalized);
}
return entries;
} catch {
return [];
}
}
export function saveHistory(paths: StoragePaths, entries: HistoryEntry[]): void {
ensureBaseDir(paths.baseDir);
const trimmed = entries.slice(0, MAX_HISTORY_ENTRIES);
const payload = JSON.stringify(trimmed, null, 2);
const tempPath = `${paths.historyFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
export function addHistoryEntry(paths: StoragePaths, entry: HistoryEntry): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = [entry, ...existing].slice(0, MAX_HISTORY_ENTRIES);
saveHistory(paths, updated);
return updated;
}
export function removeHistoryEntry(paths: StoragePaths, entryId: string): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = existing.filter(e => e.id !== entryId);
saveHistory(paths, updated);
return updated;
}
export function clearHistory(paths: StoragePaths): void {
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.historyFile)) {
try {
fs.unlinkSync(paths.historyFile);
} catch {
// ignore
}
}
}
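The async savers above combine two small patterns worth isolating: a single-flight write with a one-slot queue, and a generation counter that lets a later synchronous save invalidate an in-flight asynchronous one. A minimal standalone sketch of the generation guard (`GuardedSaver` is a hypothetical name; the in-memory `disk` field stands in for the session file):

```typescript
// Sketch of the generation guard used by saveSession / writeSessionPayload:
// a sync save bumps the generation, and an async write that began earlier
// discards its result instead of clobbering the newer file.
class GuardedSaver {
  private generation = 0;
  disk: string | null = null; // stands in for the session file on disk

  saveSync(payload: string): void {
    this.generation += 1; // async writes started before this point are stale
    this.disk = payload;
  }

  // Begin an async save: snapshot the generation now, commit later.
  // The returned function mirrors the check around fsp.rename above.
  beginAsyncSave(payload: string): () => boolean {
    const startGen = this.generation;
    return () => {
      if (startGen < this.generation) {
        return false; // a sync save won the race: discard the temp file
      }
      this.disk = payload;
      return true;
    };
  }
}

const saver = new GuardedSaver();
const commitStale = saver.beginAsyncSave("async-v1");
saver.saveSync("sync-v2");        // bumps the generation mid-flight
const staleCommitted = commitStale();
const commitFresh = saver.beginAsyncSave("async-v3");
const freshCommitted = commitFresh();
```

Without the guard, the async rename could land after the sync one and silently resurrect stale session state.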

src/main/update.ts Normal file

File diff suppressed because it is too large

src/main/utils.ts Normal file

@@ -0,0 +1,268 @@
import path from "node:path";
import { ParsedPackageInput } from "../shared/types";
function safeDecodeURIComponent(value: string): string {
try {
return decodeURIComponent(value);
} catch {
return value;
}
}
const WINDOWS_RESERVED_BASENAMES = new Set([
"con", "prn", "aux", "nul",
"com1", "com2", "com3", "com4", "com5", "com6", "com7", "com8", "com9",
"lpt1", "lpt2", "lpt3", "lpt4", "lpt5", "lpt6", "lpt7", "lpt8", "lpt9"
]);
export function compactErrorText(message: unknown, maxLen = 220): string {
const raw = String(message ?? "").replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim();
if (!raw) {
return "Unbekannter Fehler";
}
const safeMaxLen = Number.isFinite(maxLen) ? Math.max(4, Math.floor(maxLen)) : 220;
if (raw.length <= safeMaxLen) {
return raw;
}
return `${raw.slice(0, safeMaxLen - 3)}...`;
}
export function sanitizeFilename(name: string): string {
const cleaned = String(name || "")
.replace(/\0/g, "")
.replace(/[\\/:*?"<>|]/g, " ")
.replace(/\s+/g, " ")
.trim();
let normalized = cleaned
.replace(/^[.\s]+/g, "")
.replace(/[.\s]+$/g, "")
.trim();
if (!normalized || normalized === "." || normalized === ".." || /^\.+$/.test(normalized)) {
return "Paket";
}
const parsed = path.parse(normalized);
const reservedBase = (parsed.name.split(".")[0] || parsed.name).toLowerCase();
if (WINDOWS_RESERVED_BASENAMES.has(reservedBase)) {
normalized = `${parsed.name.replace(/^([^.]*)/, "$1_")}${parsed.ext}`;
}
return normalized || "Paket";
}
export function isHttpLink(value: string): boolean {
const text = String(value || "").trim();
if (!text) {
return false;
}
try {
const url = new URL(text);
return (url.protocol === "http:" || url.protocol === "https:") && !!url.hostname;
} catch {
return false;
}
}
export function extractHttpLinksFromText(text: string): string[] {
const matches = String(text || "").match(/https?:\/\/[^\s<>"']+/gi) ?? [];
const seen = new Set<string>();
const links: string[] = [];
for (const match of matches) {
let candidate = String(match || "").trim();
let openParen = 0;
let closeParen = 0;
let openBracket = 0;
let closeBracket = 0;
for (const char of candidate) {
if (char === "(") {
openParen += 1;
} else if (char === ")") {
closeParen += 1;
} else if (char === "[") {
openBracket += 1;
} else if (char === "]") {
closeBracket += 1;
}
}
while (candidate.length > 0) {
const lastChar = candidate[candidate.length - 1];
if (![")", "]", ",", ".", "!", "?", ";", ":"].includes(lastChar)) {
break;
}
if (lastChar === ")") {
if (closeParen <= openParen) {
break;
}
}
if (lastChar === "]") {
if (closeBracket <= openBracket) {
break;
}
}
if (lastChar === ")") {
closeParen = Math.max(0, closeParen - 1);
} else if (lastChar === "]") {
closeBracket = Math.max(0, closeBracket - 1);
}
candidate = candidate.slice(0, -1);
}
if (!candidate || !isHttpLink(candidate) || seen.has(candidate)) {
continue;
}
seen.add(candidate);
links.push(candidate);
}
return links;
}
export function humanSize(bytes: number): string {
const value = Number(bytes);
if (!Number.isFinite(value) || value < 0) {
return "0 B";
}
if (value < 1024) {
return `${Math.round(value)} B`;
}
const units = ["KB", "MB", "GB", "TB"];
let size = value / 1024;
let unit = 0;
while (size >= 1024 && unit < units.length - 1) {
size /= 1024;
unit += 1;
}
return `${size.toFixed(size < 10 ? 1 : 0)} ${units[unit]}`;
}
export function filenameFromUrl(url: string): string {
try {
const parsed = new URL(url);
if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
return "download.bin";
}
const queryName = parsed.searchParams.get("filename")
|| parsed.searchParams.get("file")
|| parsed.searchParams.get("name")
|| parsed.searchParams.get("download")
|| parsed.searchParams.get("title")
|| "";
const rawName = queryName || path.basename(parsed.pathname || "");
const decoded = safeDecodeURIComponent(rawName || "").trim();
const normalized = decoded
.replace(/\.(rar|zip|7z|tar|gz|bz2|xz|iso|part\d+\.rar|r\d{2,3})\.html$/i, ".$1")
.replace(/\.(mp4|mkv|avi|mp3|flac|srt)\.html$/i, ".$1");
return sanitizeFilename(normalized || "download.bin");
} catch {
return "download.bin";
}
}
export function looksLikeOpaqueFilename(name: string): boolean {
const cleaned = sanitizeFilename(name || "").toLowerCase();
if (!cleaned || cleaned === "download.bin") {
return true;
}
const parsed = path.parse(cleaned);
return /^[a-f0-9]{24,}$/i.test(parsed.name || cleaned);
}
export function inferPackageNameFromLinks(links: string[]): string {
if (links.length === 0) {
return "Paket";
}
const names = links.map((link) => filenameFromUrl(link).toLowerCase());
const first = names[0];
const match = first.match(/^([a-z0-9._\- ]{3,80}?)(?:\.|-|_)(?:part\d+|r\d{2}|s\d{2}e\d{2})/i);
if (match) {
return sanitizeFilename(match[1]);
}
return sanitizeFilename(path.parse(first).name || "Paket");
}
export function uniquePreserveOrder(items: string[]): string[] {
const seen = new Set<string>();
const out: string[] = [];
for (const item of items) {
const trimmed = item.trim();
if (!trimmed || seen.has(trimmed)) {
continue;
}
seen.add(trimmed);
out.push(trimmed);
}
return out;
}
export function parsePackagesFromLinksText(rawText: string, defaultPackageName: string): ParsedPackageInput[] {
const lines = String(rawText || "").split(/\r?\n/);
const packages: ParsedPackageInput[] = [];
let currentName = String(defaultPackageName || "").trim();
let currentLinks: string[] = [];
const flush = (): void => {
const links = uniquePreserveOrder(currentLinks.filter((line) => isHttpLink(line)));
if (links.length > 0) {
const normalizedCurrentName = String(currentName || "").trim();
packages.push({
name: normalizedCurrentName
? sanitizeFilename(normalizedCurrentName)
: inferPackageNameFromLinks(links),
links
});
}
currentLinks = [];
};
for (const line of lines) {
const text = line.trim();
if (!text) {
continue;
}
const marker = text.match(/^#\s*package\s*:\s*(.+)$/i);
if (marker) {
flush();
currentName = String(marker[1] || "").trim();
continue;
}
currentLinks.push(text);
}
flush();
if (packages.length === 0) {
return [];
}
return packages;
}
export function ensureDirPath(baseDir: string, packageName: string): string {
if (!path.isAbsolute(baseDir)) {
throw new Error("baseDir muss ein absoluter Pfad sein");
}
return path.join(baseDir, sanitizeFilename(packageName));
}
export function nowMs(): number {
return Date.now();
}
export function sleep(ms: number): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, ms));
}
export function formatEta(seconds: number): string {
if (!Number.isFinite(seconds) || seconds < 0) {
return "--";
}
const s = Math.floor(seconds);
const sec = s % 60;
const minTotal = Math.floor(s / 60);
const min = minTotal % 60;
const hr = Math.floor(minTotal / 60);
if (hr > 0) {
return `${String(hr).padStart(2, "0")}:${String(min).padStart(2, "0")}:${String(sec).padStart(2, "0")}`;
}
return `${String(min).padStart(2, "0")}:${String(sec).padStart(2, "0")}`;
}
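The trickiest part of `extractHttpLinksFromText` is the trailing-punctuation trimming: closing `)` and `]` are only stripped while they outnumber their openers inside the candidate, so Wikipedia-style `(...)` path segments survive. A self-contained sketch of just that balancing step (the function name is illustrative, not the exported API):

```typescript
// Strip trailing sentence punctuation from a URL candidate, but keep
// ")" / "]" that are balanced by an opener earlier in the URL.
function trimTrailingPunctuation(candidate: string): string {
  let openParen = 0, closeParen = 0, openBracket = 0, closeBracket = 0;
  for (const ch of candidate) {
    if (ch === "(") openParen += 1;
    else if (ch === ")") closeParen += 1;
    else if (ch === "[") openBracket += 1;
    else if (ch === "]") closeBracket += 1;
  }
  while (candidate.length > 0) {
    const last = candidate[candidate.length - 1];
    if (![")", "]", ",", ".", "!", "?", ";", ":"].includes(last)) break;
    if (last === ")" && closeParen <= openParen) break;     // balanced: keep it
    if (last === "]" && closeBracket <= openBracket) break; // balanced: keep it
    if (last === ")") closeParen -= 1;
    else if (last === "]") closeBracket -= 1;
    candidate = candidate.slice(0, -1); // unbalanced or plain punctuation: strip
  }
  return candidate;
}

const kept = trimTrailingPunctuation("https://en.wikipedia.org/wiki/Rock_(geology)");
const stripped = trimTrailingPunctuation("https://example.com/file).");
```

A plain "strip all trailing punctuation" rule would mangle the first example; the counting pass is what distinguishes a paren that belongs to the URL from one that closes surrounding prose.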

src/preload/preload.ts Normal file

@@ -0,0 +1,87 @@
import { contextBridge, ipcRenderer } from "electron";
import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
UpdateCheckResult,
UpdateInstallProgress
} from "../shared/types";
import { IPC_CHANNELS } from "../shared/ipc";
import { ElectronApi } from "../shared/preload-api";
const api: ElectronApi = {
getSnapshot: (): Promise<UiSnapshot> => ipcRenderer.invoke(IPC_CHANNELS.GET_SNAPSHOT),
getVersion: (): Promise<string> => ipcRenderer.invoke(IPC_CHANNELS.GET_VERSION),
checkUpdates: (): Promise<UpdateCheckResult> => ipcRenderer.invoke(IPC_CHANNELS.CHECK_UPDATES),
installUpdate: () => ipcRenderer.invoke(IPC_CHANNELS.INSTALL_UPDATE),
openExternal: (url: string): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_EXTERNAL, url),
updateSettings: (settings: Partial<AppSettings>): Promise<AppSettings> => ipcRenderer.invoke(IPC_CHANNELS.UPDATE_SETTINGS, settings),
addLinks: (payload: AddLinksPayload): Promise<{ addedPackages: number; addedLinks: number; invalidCount: number }> =>
ipcRenderer.invoke(IPC_CHANNELS.ADD_LINKS, payload),
addContainers: (filePaths: string[]): Promise<{ addedPackages: number; addedLinks: number }> =>
ipcRenderer.invoke(IPC_CHANNELS.ADD_CONTAINERS, filePaths),
getStartConflicts: (): Promise<StartConflictEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_START_CONFLICTS),
resolveStartConflict: (packageId: string, policy: DuplicatePolicy): Promise<StartConflictResolutionResult> =>
ipcRenderer.invoke(IPC_CHANNELS.RESOLVE_START_CONFLICT, packageId, policy),
clearAll: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_ALL),
start: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START),
startPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_PACKAGES, packageIds),
stop: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.STOP),
togglePause: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PAUSE),
cancelPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CANCEL_PACKAGE, packageId),
renamePackage: (packageId: string, newName: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RENAME_PACKAGE, packageId, newName),
reorderPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REORDER_PACKAGES, packageIds),
removeItem: (itemId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_ITEM, itemId),
togglePackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PACKAGE, packageId),
exportQueue: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
importQueue: (json: string): Promise<{ addedPackages: number; addedLinks: number }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_QUEUE, json),
toggleClipboard: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_CLIPBOARD),
pickFolder: (): Promise<string | null> => ipcRenderer.invoke(IPC_CHANNELS.PICK_FOLDER),
pickContainers: (): Promise<string[]> => ipcRenderer.invoke(IPC_CHANNELS.PICK_CONTAINERS),
getSessionStats: (): Promise<SessionStats> => ipcRenderer.invoke(IPC_CHANNELS.GET_SESSION_STATS),
restart: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESTART),
quit: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.QUIT),
exportBackup: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_BACKUP),
importBackup: (): Promise<{ restored: boolean; message: string }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_BACKUP),
openLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_LOG),
openSessionLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_SESSION_LOG),
retryExtraction: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RETRY_EXTRACTION, packageId),
extractNow: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.EXTRACT_NOW, packageId),
resetPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_PACKAGE, packageId),
getHistory: (): Promise<HistoryEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_HISTORY),
clearHistory: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_HISTORY),
removeHistoryEntry: (entryId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, entryId),
setPackagePriority: (packageId: string, priority: PackagePriority): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SET_PACKAGE_PRIORITY, packageId, priority),
skipItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SKIP_ITEMS, itemIds),
resetItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_ITEMS, itemIds),
startItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_ITEMS, itemIds),
onStateUpdate: (callback: (snapshot: UiSnapshot) => void): (() => void) => {
const listener = (_event: unknown, snapshot: UiSnapshot): void => callback(snapshot);
ipcRenderer.on(IPC_CHANNELS.STATE_UPDATE, listener);
return () => {
ipcRenderer.removeListener(IPC_CHANNELS.STATE_UPDATE, listener);
};
},
onClipboardDetected: (callback: (links: string[]) => void): (() => void) => {
const listener = (_event: unknown, links: string[]): void => callback(links);
ipcRenderer.on(IPC_CHANNELS.CLIPBOARD_DETECTED, listener);
return () => {
ipcRenderer.removeListener(IPC_CHANNELS.CLIPBOARD_DETECTED, listener);
};
},
onUpdateInstallProgress: (callback: (progress: UpdateInstallProgress) => void): (() => void) => {
const listener = (_event: unknown, progress: UpdateInstallProgress): void => callback(progress);
ipcRenderer.on(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, listener);
return () => {
ipcRenderer.removeListener(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, listener);
};
}
};
contextBridge.exposeInMainWorld("rd", api);
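Each `on*` method above follows the same shape: register a listener, return a cleanup function that removes it. That lets renderer code unsubscribe (for example from a `useEffect` cleanup) without holding the listener reference itself. The pattern in isolation, with a hypothetical `TinyEmitter` standing in for `ipcRenderer`:

```typescript
// Minimal emitter demonstrating the subscribe-returns-unsubscribe shape
// used by onStateUpdate / onClipboardDetected / onUpdateInstallProgress.
type Listener<T> = (value: T) => void;

class TinyEmitter<T> {
  private listeners = new Set<Listener<T>>();

  on(listener: Listener<T>): () => void {
    this.listeners.add(listener);
    // Cleanup closure, analogous to ipcRenderer.removeListener(channel, listener)
    return () => { this.listeners.delete(listener); };
  }

  emit(value: T): void {
    for (const listener of this.listeners) listener(value);
  }
}

const emitter = new TinyEmitter<number>();
const seen: number[] = [];
const unsubscribe = emitter.on((v) => seen.push(v));
emitter.emit(1);  // delivered
unsubscribe();    // e.g. effect cleanup on unmount
emitter.emit(2);  // no longer delivered
```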

src/renderer/App.tsx Normal file

File diff suppressed because it is too large

src/renderer/index.html Normal file

@@ -0,0 +1,12 @@
<!doctype html>
<html lang="de">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Multi Debrid Downloader</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="./main.tsx"></script>
</body>
</html>

src/renderer/main.tsx Normal file

@@ -0,0 +1,15 @@
import React from "react";
import { createRoot } from "react-dom/client";
import { App } from "./App";
import "./styles.css";
const rootElement = document.getElementById("root");
if (!rootElement) {
throw new Error("Root element fehlt");
}
createRoot(rootElement).render(
<React.StrictMode>
<App />
</React.StrictMode>
);

@@ -0,0 +1,25 @@
import type { PackageEntry } from "../shared/types";
export function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
const fromIndex = order.indexOf(draggedPackageId);
const toIndex = order.indexOf(targetPackageId);
if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
return order;
}
const next = [...order];
const [dragged] = next.splice(fromIndex, 1);
const insertIndex = Math.max(0, Math.min(next.length, toIndex));
next.splice(insertIndex, 0, dragged);
return next;
}
export function sortPackageOrderByName(order: string[], packages: Record<string, PackageEntry>, descending: boolean): string[] {
const sorted = [...order];
sorted.sort((a, b) => {
const nameA = (packages[a]?.name ?? "").toLowerCase();
const nameB = (packages[b]?.name ?? "").toLowerCase();
const cmp = nameA.localeCompare(nameB, undefined, { numeric: true, sensitivity: "base" });
return descending ? -cmp : cmp;
});
return sorted;
}
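One edge case in `reorderPackageOrderByDrop` is easy to miss: `toIndex` is computed before the dragged id is removed, so dropping forward lands the item after the target rather than before it. A dependency-free copy (renamed for this sketch) makes that behavior easy to check:

```typescript
// Standalone copy of the reorderPackageOrderByDrop logic for exercising edge cases.
function reorderByDrop(order: string[], draggedId: string, targetId: string): string[] {
  const fromIndex = order.indexOf(draggedId);
  const toIndex = order.indexOf(targetId);
  if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
    return order; // unknown ids or no-op drop: return the input untouched
  }
  const next = [...order];
  const [dragged] = next.splice(fromIndex, 1);
  // toIndex was taken against the pre-removal array, hence the clamp
  next.splice(Math.max(0, Math.min(next.length, toIndex)), 0, dragged);
  return next;
}

const forward = reorderByDrop(["a", "b", "c"], "a", "c");  // drop "a" onto "c"
const backward = reorderByDrop(["a", "b", "c"], "c", "a"); // drop "c" onto "a"
```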

src/renderer/styles.css Normal file

File diff suppressed because it is too large

src/renderer/vite-env.d.ts vendored Normal file

@@ -0,0 +1,11 @@
/// <reference types="vite/client" />
import type { ElectronApi } from "../shared/preload-api";
declare global {
interface Window {
rd: ElectronApi;
}
}
export {};

src/shared/ipc.ts Normal file

@@ -0,0 +1,47 @@
export const IPC_CHANNELS = {
GET_SNAPSHOT: "app:get-snapshot",
GET_VERSION: "app:get-version",
CHECK_UPDATES: "app:check-updates",
INSTALL_UPDATE: "app:install-update",
UPDATE_INSTALL_PROGRESS: "app:update-install-progress",
OPEN_EXTERNAL: "app:open-external",
UPDATE_SETTINGS: "app:update-settings",
ADD_LINKS: "queue:add-links",
ADD_CONTAINERS: "queue:add-containers",
GET_START_CONFLICTS: "queue:get-start-conflicts",
RESOLVE_START_CONFLICT: "queue:resolve-start-conflict",
CLEAR_ALL: "queue:clear-all",
START: "queue:start",
START_PACKAGES: "queue:start-packages",
STOP: "queue:stop",
TOGGLE_PAUSE: "queue:toggle-pause",
CANCEL_PACKAGE: "queue:cancel-package",
RENAME_PACKAGE: "queue:rename-package",
REORDER_PACKAGES: "queue:reorder-packages",
REMOVE_ITEM: "queue:remove-item",
TOGGLE_PACKAGE: "queue:toggle-package",
EXPORT_QUEUE: "queue:export",
IMPORT_QUEUE: "queue:import",
PICK_FOLDER: "dialog:pick-folder",
PICK_CONTAINERS: "dialog:pick-containers",
STATE_UPDATE: "state:update",
CLIPBOARD_DETECTED: "clipboard:detected",
TOGGLE_CLIPBOARD: "clipboard:toggle",
GET_SESSION_STATS: "stats:get-session-stats",
RESTART: "app:restart",
QUIT: "app:quit",
EXPORT_BACKUP: "app:export-backup",
IMPORT_BACKUP: "app:import-backup",
OPEN_LOG: "app:open-log",
OPEN_SESSION_LOG: "app:open-session-log",
RETRY_EXTRACTION: "queue:retry-extraction",
EXTRACT_NOW: "queue:extract-now",
RESET_PACKAGE: "queue:reset-package",
GET_HISTORY: "history:get",
CLEAR_HISTORY: "history:clear",
REMOVE_HISTORY_ENTRY: "history:remove-entry",
SET_PACKAGE_PRIORITY: "queue:set-package-priority",
SKIP_ITEMS: "queue:skip-items",
RESET_ITEMS: "queue:reset-items",
START_ITEMS: "queue:start-items"
} as const;
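The `as const` on the channel map narrows every value to a string-literal type, which lets a thin wrapper restrict `invoke` to known channels at compile time. A sketch of that idea (the recording `invoke` below is a hypothetical stand-in for `ipcRenderer.invoke`, not part of this codebase):

```typescript
// With `as const`, each value is a literal type, not plain string.
const CHANNELS = {
  GET_VERSION: "app:get-version",
  QUIT: "app:quit"
} as const;

// Union of all channel strings: "app:get-version" | "app:quit"
type Channel = (typeof CHANNELS)[keyof typeof CHANNELS];

const calls: string[] = [];

// Hypothetical stand-in for ipcRenderer.invoke, constrained to known channels.
function invoke(channel: Channel): void {
  calls.push(channel);
}

invoke(CHANNELS.GET_VERSION); // OK
// invoke("app:unknown");     // compile-time error: not assignable to Channel
```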

src/shared/preload-api.ts Normal file

@@ -0,0 +1,62 @@
import type {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
UpdateCheckResult,
UpdateInstallProgress,
UpdateInstallResult
} from "./types";
export interface ElectronApi {
getSnapshot: () => Promise<UiSnapshot>;
getVersion: () => Promise<string>;
checkUpdates: () => Promise<UpdateCheckResult>;
installUpdate: () => Promise<UpdateInstallResult>;
openExternal: (url: string) => Promise<boolean>;
updateSettings: (settings: Partial<AppSettings>) => Promise<AppSettings>;
addLinks: (payload: AddLinksPayload) => Promise<{ addedPackages: number; addedLinks: number; invalidCount: number }>;
addContainers: (filePaths: string[]) => Promise<{ addedPackages: number; addedLinks: number }>;
getStartConflicts: () => Promise<StartConflictEntry[]>;
resolveStartConflict: (packageId: string, policy: DuplicatePolicy) => Promise<StartConflictResolutionResult>;
clearAll: () => Promise<void>;
start: () => Promise<void>;
startPackages: (packageIds: string[]) => Promise<void>;
stop: () => Promise<void>;
togglePause: () => Promise<boolean>;
cancelPackage: (packageId: string) => Promise<void>;
renamePackage: (packageId: string, newName: string) => Promise<void>;
reorderPackages: (packageIds: string[]) => Promise<void>;
removeItem: (itemId: string) => Promise<void>;
togglePackage: (packageId: string) => Promise<void>;
exportQueue: () => Promise<{ saved: boolean }>;
importQueue: (json: string) => Promise<{ addedPackages: number; addedLinks: number }>;
toggleClipboard: () => Promise<boolean>;
pickFolder: () => Promise<string | null>;
pickContainers: () => Promise<string[]>;
getSessionStats: () => Promise<SessionStats>;
restart: () => Promise<void>;
quit: () => Promise<void>;
exportBackup: () => Promise<{ saved: boolean }>;
importBackup: () => Promise<{ restored: boolean; message: string }>;
openLog: () => Promise<void>;
openSessionLog: () => Promise<void>;
retryExtraction: (packageId: string) => Promise<void>;
extractNow: (packageId: string) => Promise<void>;
resetPackage: (packageId: string) => Promise<void>;
getHistory: () => Promise<HistoryEntry[]>;
clearHistory: () => Promise<void>;
removeHistoryEntry: (entryId: string) => Promise<void>;
setPackagePriority: (packageId: string, priority: PackagePriority) => Promise<void>;
skipItems: (itemIds: string[]) => Promise<void>;
resetItems: (itemIds: string[]) => Promise<void>;
startItems: (itemIds: string[]) => Promise<void>;
onStateUpdate: (callback: (snapshot: UiSnapshot) => void) => () => void;
onClipboardDetected: (callback: (links: string[]) => void) => () => void;
onUpdateInstallProgress: (callback: (progress: UpdateInstallProgress) => void) => () => void;
}
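The three `on*` methods at the end of `ElectronApi` each return an unsubscribe function (`() => () => void`). A minimal, self-contained sketch of that subscribe/unsubscribe pattern — the `createEmitter` helper is hypothetical, not the actual preload implementation:

```typescript
// Hypothetical helper illustrating the cleanup-function pattern used by
// onStateUpdate / onClipboardDetected / onUpdateInstallProgress.
type Listener<T> = (value: T) => void;

function createEmitter<T>() {
  const listeners = new Set<Listener<T>>();
  return {
    // subscribe() returns a cleanup function, mirroring `() => () => void`.
    subscribe(cb: Listener<T>): () => void {
      listeners.add(cb);
      return () => {
        listeners.delete(cb);
      };
    },
    emit(value: T): void {
      for (const cb of listeners) cb(value);
    }
  };
}
```

Callers keep the returned function and invoke it on unmount, so stale listeners never outlive the UI that registered them.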

src/shared/types.ts Normal file

@@ -0,0 +1,288 @@
export type DownloadStatus =
| "queued"
| "validating"
| "downloading"
| "paused"
| "reconnect_wait"
| "extracting"
| "integrity_check"
| "completed"
| "failed"
| "cancelled";
export type CleanupMode = "none" | "trash" | "delete";
export type ConflictMode = "overwrite" | "skip" | "rename" | "ask";
export type SpeedMode = "global" | "per_download";
export type FinishedCleanupPolicy = "never" | "immediate" | "on_start" | "package_done";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid" | "ddownload";
export type DebridFallbackProvider = DebridProvider | "none";
export type AppTheme = "dark" | "light";
export type PackagePriority = "high" | "normal" | "low";
export type ExtractCpuPriority = "high" | "middle" | "low";
export interface BandwidthScheduleEntry {
id: string;
startHour: number;
endHour: number;
speedLimitKbps: number;
enabled: boolean;
}
export interface DownloadStats {
totalDownloaded: number;
totalDownloadedAllTime: number;
totalFiles: number;
totalPackages: number;
sessionStartedAt: number;
}
export interface AppSettings {
token: string;
megaLogin: string;
megaPassword: string;
bestToken: string;
allDebridToken: string;
ddownloadLogin: string;
ddownloadPassword: string;
archivePasswordList: string;
rememberToken: boolean;
providerPrimary: DebridProvider;
providerSecondary: DebridFallbackProvider;
providerTertiary: DebridFallbackProvider;
autoProviderFallback: boolean;
outputDir: string;
packageName: string;
autoExtract: boolean;
autoRename4sf4sj: boolean;
extractDir: string;
collectMkvToLibrary: boolean;
mkvLibraryDir: string;
createExtractSubfolder: boolean;
hybridExtract: boolean;
cleanupMode: CleanupMode;
extractConflictMode: ConflictMode;
removeLinkFilesAfterExtract: boolean;
removeSamplesAfterExtract: boolean;
enableIntegrityCheck: boolean;
autoResumeOnStart: boolean;
autoReconnect: boolean;
reconnectWaitSeconds: number;
completedCleanupPolicy: FinishedCleanupPolicy;
maxParallel: number;
maxParallelExtract: number;
retryLimit: number;
speedLimitEnabled: boolean;
speedLimitKbps: number;
speedLimitMode: SpeedMode;
updateRepo: string;
autoUpdateCheck: boolean;
clipboardWatch: boolean;
minimizeToTray: boolean;
theme: AppTheme;
collapseNewPackages: boolean;
autoSkipExtracted: boolean;
confirmDeleteSelection: boolean;
totalDownloadedAllTime: number;
bandwidthSchedules: BandwidthScheduleEntry[];
columnOrder: string[];
extractCpuPriority: ExtractCpuPriority;
autoExtractWhenStopped: boolean;
}
export interface DownloadItem {
id: string;
packageId: string;
url: string;
provider: DebridProvider | null;
status: DownloadStatus;
retries: number;
speedBps: number;
downloadedBytes: number;
totalBytes: number | null;
progressPercent: number;
fileName: string;
targetPath: string;
resumable: boolean;
attempts: number;
lastError: string;
fullStatus: string;
createdAt: number;
updatedAt: number;
onlineStatus?: "online" | "offline" | "checking";
}
export interface PackageEntry {
id: string;
name: string;
outputDir: string;
extractDir: string;
status: DownloadStatus;
itemIds: string[];
cancelled: boolean;
enabled: boolean;
priority: PackagePriority;
postProcessLabel?: string;
createdAt: number;
updatedAt: number;
}
export interface SessionState {
version: number;
packageOrder: string[];
packages: Record<string, PackageEntry>;
items: Record<string, DownloadItem>;
runStartedAt: number;
totalDownloadedBytes: number;
summaryText: string;
reconnectUntil: number;
reconnectReason: string;
paused: boolean;
running: boolean;
updatedAt: number;
}
export interface DownloadSummary {
total: number;
success: number;
failed: number;
cancelled: number;
extracted: number;
durationSeconds: number;
averageSpeedBps: number;
}
export interface ParsedPackageInput {
name: string;
links: string[];
fileNames?: string[];
}
export interface ContainerImportResult {
packages: ParsedPackageInput[];
source: "dlc";
}
export interface UiSnapshot {
settings: AppSettings;
session: SessionState;
summary: DownloadSummary | null;
stats: DownloadStats;
speedText: string;
etaText: string;
canStart: boolean;
canStop: boolean;
canPause: boolean;
clipboardActive: boolean;
reconnectSeconds: number;
packageSpeedBps: Record<string, number>;
}
export interface AddLinksPayload {
rawText: string;
packageName?: string;
duplicatePolicy?: DuplicatePolicy;
}
export interface AddContainerPayload {
filePaths: string[];
}
export type DuplicatePolicy = "keep" | "skip" | "overwrite";
export interface QueueAddResult {
addedPackages: number;
addedLinks: number;
skippedExistingPackages: string[];
overwrittenPackages: string[];
}
export interface ContainerConflictResult {
conflicts: string[];
packageCount: number;
linkCount: number;
}
export interface StartConflictEntry {
packageId: string;
packageName: string;
extractDir: string;
}
export interface StartConflictResolutionResult {
skipped: boolean;
overwritten: boolean;
}
export interface UpdateCheckResult {
updateAvailable: boolean;
currentVersion: string;
latestVersion: string;
latestTag: string;
releaseUrl: string;
setupAssetUrl?: string;
setupAssetName?: string;
setupAssetDigest?: string;
releaseNotes?: string;
error?: string;
}
export interface UpdateInstallResult {
started: boolean;
message: string;
}
export interface UpdateInstallProgress {
stage: "starting" | "downloading" | "verifying" | "launching" | "done" | "error";
percent: number | null;
downloadedBytes: number;
totalBytes: number | null;
message: string;
}
export interface ParsedHashEntry {
fileName: string;
algorithm: "crc32" | "md5" | "sha1";
digest: string;
}
export interface BandwidthSample {
timestamp: number;
speedBps: number;
}
export interface BandwidthStats {
samples: BandwidthSample[];
currentSpeedBps: number;
averageSpeedBps: number;
maxSpeedBps: number;
totalBytesSession: number;
sessionDurationSeconds: number;
}
export interface SessionStats {
bandwidth: BandwidthStats;
totalDownloads: number;
completedDownloads: number;
failedDownloads: number;
activeDownloads: number;
queuedDownloads: number;
}
export interface HistoryEntry {
id: string;
name: string;
totalBytes: number;
downloadedBytes: number;
fileCount: number;
provider: DebridProvider | null;
completedAt: number;
durationSeconds: number;
status: "completed" | "deleted";
outputDir: string;
urls?: string[];
}
export interface HistoryState {
entries: HistoryEntry[];
maxEntries: number;
}
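The `DownloadStatus` union invites small type-safe helpers; a hypothetical sketch (the `isTerminalStatus` name and the choice of terminal states are assumptions, not part of the diff) for checking whether an item can no longer change state:

```typescript
type DownloadStatus =
  | "queued" | "validating" | "downloading" | "paused" | "reconnect_wait"
  | "extracting" | "integrity_check" | "completed" | "failed" | "cancelled";

// Assumed terminal states: once reached, the item is not expected to transition again.
const TERMINAL: ReadonlySet<DownloadStatus> = new Set<DownloadStatus>([
  "completed", "failed", "cancelled"
]);

function isTerminalStatus(status: DownloadStatus): boolean {
  return TERMINAL.has(status);
}
```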

tests/app-order.test.ts Normal file

@@ -0,0 +1,49 @@
import { describe, expect, it } from "vitest";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/package-order";
describe("reorderPackageOrderByDrop", () => {
it("moves adjacent package down by one on drop", () => {
const next = reorderPackageOrderByDrop(["a", "b", "c"], "b", "c");
expect(next).toEqual(["a", "c", "b"]);
});
it("moves package after lower drop target", () => {
const next = reorderPackageOrderByDrop(["a", "b", "c", "d"], "a", "c");
expect(next).toEqual(["b", "c", "a", "d"]);
});
it("returns original order when ids are invalid", () => {
const order = ["a", "b", "c"];
expect(reorderPackageOrderByDrop(order, "x", "b")).toEqual(order);
expect(reorderPackageOrderByDrop(order, "a", "x")).toEqual(order);
expect(reorderPackageOrderByDrop(order, "a", "a")).toEqual(order);
});
});
describe("sortPackageOrderByName", () => {
it("sorts package IDs alphabetically ascending", () => {
const sorted = sortPackageOrderByName(
["pkg3", "pkg1", "pkg2"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
},
false
);
expect(sorted).toEqual(["pkg1", "pkg2", "pkg3"]);
});
it("sorts package IDs alphabetically descending", () => {
const sorted = sortPackageOrderByName(
["pkg1", "pkg2", "pkg3"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
},
true
);
expect(sorted).toEqual(["pkg3", "pkg2", "pkg1"]);
});
});
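A hypothetical implementation consistent with the drop semantics these tests pin down (the dragged id is removed and re-inserted directly after the drop target; any invalid or self-referential id returns the input untouched) — the shipped `reorderPackageOrderByDrop` in `src/renderer/package-order.ts` may differ:

```typescript
function reorderPackageOrderByDrop(order: string[], draggedId: string, targetId: string): string[] {
  if (draggedId === targetId) return order;
  if (!order.includes(draggedId) || !order.includes(targetId)) return order;
  // Remove the dragged id, then splice it back in right after the target.
  const next = order.filter((id) => id !== draggedId);
  next.splice(next.indexOf(targetId) + 1, 0, draggedId);
  return next;
}
```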

tests/auto-rename.test.ts Normal file

@@ -0,0 +1,694 @@
import { describe, it, expect } from "vitest";
import {
extractEpisodeToken,
applyEpisodeTokenToFolderName,
sourceHasRpToken,
ensureRepackToken,
buildAutoRenameBaseName,
buildAutoRenameBaseNameFromFolders,
buildAutoRenameBaseNameFromFoldersWithOptions
} from "../src/main/download-manager";
describe("extractEpisodeToken", () => {
it("extracts S01E01 from standard scene format", () => {
expect(extractEpisodeToken("show.name.s01e01.720p")).toBe("S01E01");
});
it("extracts episode with dot separators", () => {
expect(extractEpisodeToken("Show.S02E15.1080p")).toBe("S02E15");
});
it("extracts episode with dash separators", () => {
expect(extractEpisodeToken("show-s3e5-720p")).toBe("S03E05");
});
it("extracts episode with underscore separators", () => {
expect(extractEpisodeToken("show_s10e100_hdtv")).toBe("S10E100");
});
it("extracts episode with space separators", () => {
expect(extractEpisodeToken("Show Name s1e2 720p")).toBe("S01E02");
});
it("pads single-digit season and episode to 2 digits", () => {
expect(extractEpisodeToken("show.s1e3.720p")).toBe("S01E03");
});
it("handles 3-digit episode numbers", () => {
expect(extractEpisodeToken("show.s01e123")).toBe("S01E123");
});
it("returns null for no episode token", () => {
expect(extractEpisodeToken("some.random.file.720p")).toBeNull();
});
it("returns null for season-only pattern (no episode)", () => {
expect(extractEpisodeToken("show.s01.720p")).toBeNull();
});
it("returns null for empty string", () => {
expect(extractEpisodeToken("")).toBeNull();
});
it("is case-insensitive", () => {
expect(extractEpisodeToken("Show.S05E10.1080p")).toBe("S05E10");
expect(extractEpisodeToken("show.s05e10.1080p")).toBe("S05E10");
});
it("extracts from episode token at start of string", () => {
expect(extractEpisodeToken("s01e01.720p")).toBe("S01E01");
});
it("extracts from episode token at end of string", () => {
expect(extractEpisodeToken("show.s02e03")).toBe("S02E03");
});
it("extracts double episode token s01e01e02", () => {
expect(extractEpisodeToken("tvr-mammon-s01e01e02-720p")).toBe("S01E01E02");
});
it("extracts double episode with dot separators", () => {
expect(extractEpisodeToken("Show.S01E03E04.720p")).toBe("S01E03E04");
});
it("extracts double episode at end of string", () => {
expect(extractEpisodeToken("show.s02e05e06")).toBe("S02E05E06");
});
it("extracts double episode with single-digit numbers", () => {
expect(extractEpisodeToken("show-s1e1e2-720p")).toBe("S01E01E02");
});
});
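A hypothetical regex-based sketch matching the behaviour the cases above describe (a separator-delimited `sNNeNNN` token with an optional second episode, zero-padded to two digits); the real `SCENE_EPISODE_RE` in `download-manager.ts` may differ:

```typescript
// Assumed pattern: season 1-2 digits, episode 1-3 digits, optional double
// episode, delimited by string edges or . - _ space. Case-insensitive.
const EPISODE_RE = /(?:^|[.\-_ ])s(\d{1,2})e(\d{1,3})(?:e(\d{1,3}))?(?=$|[.\-_ ])/i;

function extractEpisodeToken(name: string): string | null {
  const m = EPISODE_RE.exec(name);
  if (!m) return null;
  const pad = (n: string) => n.padStart(2, "0");
  const second = m[3] ? `E${pad(m[3])}` : "";
  return `S${pad(m[1])}E${pad(m[2])}${second}`;
}
```

The trailing lookahead is what keeps codec strings like `h.265` from being misread: the match must end at a separator or at the end of the string.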
describe("applyEpisodeTokenToFolderName", () => {
it("replaces existing episode token in folder name", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01.720p-4sf", "S02E05")).toBe("Show.S02E05.720p-4sf");
});
it("replaces season-only token when no episode in folder", () => {
expect(applyEpisodeTokenToFolderName("Show.S01.720p-4sf", "S01E03")).toBe("Show.S01E03.720p-4sf");
});
it("inserts before -4sf suffix when no season/episode in folder", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4sf", "S01E05")).toBe("Show.720p.S01E05-4sf");
});
it("inserts before -4sj suffix", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4sj", "S01E05")).toBe("Show.720p.S01E05-4sj");
});
it("appends episode token when no recognized pattern", () => {
expect(applyEpisodeTokenToFolderName("SomeFolder", "S01E01")).toBe("SomeFolder.S01E01");
});
it("returns episode token when folder name is empty", () => {
expect(applyEpisodeTokenToFolderName("", "S01E01")).toBe("S01E01");
});
it("handles folder with existing multi-digit episode", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E99.720p-4sf", "S01E05")).toBe("Show.S01E05.720p-4sf");
});
it("is case-insensitive for -4SF/-4SJ suffix", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4SF", "S01E01")).toBe("Show.720p.S01E01-4SF");
});
it("applies double episode token to season-only folder", () => {
expect(applyEpisodeTokenToFolderName("Mammon.S01.German.1080P.Bluray.x264-SMAHD", "S01E01E02"))
.toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("replaces existing double episode in folder with new token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01E02.720p-4sf", "S01E03E04"))
.toBe("Show.S01E03E04.720p-4sf");
});
it("replaces existing single episode in folder with double episode token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01.720p-4sf", "S01E01E02"))
.toBe("Show.S01E01E02.720p-4sf");
});
});
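The replacement order these tests establish (existing episode token first, then a season-only token, then insertion before a `-4sf`/`-4sj` suffix, then plain append) can be sketched as follows — a hypothetical reconstruction, with regexes that may differ from the shipped ones:

```typescript
// Hypothetical regexes; the actual patterns in download-manager.ts may differ.
const EPISODE_IN_FOLDER = /s\d{1,2}e\d{1,3}(?:e\d{1,3})?/i;
const SEASON_ONLY = /s\d{1,2}(?=[.\-_ ])/i;
const GROUP_SUFFIX = /-4s[fj]$/i;

function applyEpisodeTokenToFolderName(folderName: string, token: string): string {
  if (!folderName) return token;
  // 1. Replace an existing (possibly double) episode token in place.
  if (EPISODE_IN_FOLDER.test(folderName)) return folderName.replace(EPISODE_IN_FOLDER, token);
  // 2. Upgrade a season-only token (e.g. "S01") to the full episode token.
  if (SEASON_ONLY.test(folderName)) return folderName.replace(SEASON_ONLY, token);
  // 3. Insert before a trailing -4sf/-4sj group suffix, preserving its case.
  const suffix = GROUP_SUFFIX.exec(folderName);
  if (suffix) return `${folderName.slice(0, suffix.index)}.${token}${suffix[0]}`;
  // 4. No recognized pattern: append.
  return `${folderName}.${token}`;
}
```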
describe("sourceHasRpToken", () => {
it("detects .rp. in filename", () => {
expect(sourceHasRpToken("show.s01e01.rp.720p")).toBe(true);
});
it("detects -rp- in filename", () => {
expect(sourceHasRpToken("show-s01e01-rp-720p")).toBe(true);
});
it("detects _rp_ in filename", () => {
expect(sourceHasRpToken("show_s01e01_rp_720p")).toBe(true);
});
it("detects rp at end of string", () => {
expect(sourceHasRpToken("show.s01e01.rp")).toBe(true);
});
it("does not match rp inside a word", () => {
expect(sourceHasRpToken("enterprise.s01e01")).toBe(false);
});
it("returns false for empty string", () => {
expect(sourceHasRpToken("")).toBe(false);
});
it("is case-insensitive", () => {
expect(sourceHasRpToken("show.RP.720p")).toBe(true);
});
});
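The word-boundary behaviour above (matching `.rp.`, `-rp-`, `_rp_`, and a trailing `rp`, but not `rp` inside a word like "enterprise") suggests a separator-anchored pattern; a hypothetical sketch:

```typescript
// "rp" must stand alone between separators (or string edges) to count as a repack hint.
const RP_RE = /(?:^|[.\-_ ])rp(?=$|[.\-_ ])/i;

function sourceHasRpToken(fileName: string): boolean {
  return RP_RE.test(fileName);
}
```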
describe("ensureRepackToken", () => {
it("inserts REPACK before quality token", () => {
expect(ensureRepackToken("Show.S01E01.1080p-4sf")).toBe("Show.S01E01.REPACK.1080p-4sf");
});
it("inserts REPACK before 720p", () => {
expect(ensureRepackToken("Show.S01E01.720p-4sf")).toBe("Show.S01E01.REPACK.720p-4sf");
});
it("inserts REPACK before 2160p", () => {
expect(ensureRepackToken("Show.S01E01.2160p-4sf")).toBe("Show.S01E01.REPACK.2160p-4sf");
});
it("inserts REPACK before -4sf when no quality token", () => {
expect(ensureRepackToken("Show.S01E01-4sf")).toBe("Show.S01E01.REPACK-4sf");
});
it("inserts REPACK before -4sj when no quality token", () => {
expect(ensureRepackToken("Show.S01E01-4sj")).toBe("Show.S01E01.REPACK-4sj");
});
it("appends REPACK when no recognized insertion point", () => {
expect(ensureRepackToken("Show.S01E01")).toBe("Show.S01E01.REPACK");
});
it("does not double-add REPACK if already present", () => {
expect(ensureRepackToken("Show.S01E01.REPACK.1080p-4sf")).toBe("Show.S01E01.REPACK.1080p-4sf");
});
it("does not double-add repack (case-insensitive)", () => {
expect(ensureRepackToken("Show.s01e01.repack.720p-4sf")).toBe("Show.s01e01.repack.720p-4sf");
});
});
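The insertion-point priority in these cases (before a quality token if present, else before the `-4sf`/`-4sj` suffix, else appended; never added twice) can be sketched as — a hypothetical reconstruction, not the shipped code:

```typescript
// Hypothetical sketch consistent with the ensureRepackToken cases above.
const QUALITY_RE = /\b(480p|720p|1080p|2160p)\b/i;
const SUFFIX_RE = /-4s[fj]$/i;

function ensureRepackToken(name: string): string {
  if (/repack/i.test(name)) return name; // never double-add
  const quality = QUALITY_RE.exec(name);
  if (quality) return `${name.slice(0, quality.index)}REPACK.${name.slice(quality.index)}`;
  const suffix = SUFFIX_RE.exec(name);
  if (suffix) return `${name.slice(0, suffix.index)}.REPACK${suffix[0]}`;
  return `${name}.REPACK`;
}
```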
describe("buildAutoRenameBaseName", () => {
it("renames with episode token from source file", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sf", "show.s01e05.720p.mkv");
expect(result).toBe("Show.S01E05.720p-4sf");
});
it("works with -4sj suffix", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sj", "show.s01e03.720p.mkv");
expect(result).toBe("Show.S01E03.720p-4sj");
});
it("renames generic scene folder with group suffix", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-GROUP", "show.s01e05.720p.mkv");
expect(result).toBe("Show.S01.720p-GROUP");
});
it("returns null when source has no episode token", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sf", "random.file.720p.mkv");
expect(result).toBeNull();
});
it("adds REPACK when source has rp token", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sf", "show.s01e05.rp.720p.mkv");
expect(result).toBe("Show.S01E05.REPACK.720p-4sf");
});
it("handles folder with existing episode that gets replaced", () => {
const result = buildAutoRenameBaseName("Show.S01E01.720p-4sf", "show.s01e10.720p.mkv");
expect(result).toBe("Show.S01E10.720p-4sf");
});
it("inserts episode before -4sf when folder has no season/episode", () => {
const result = buildAutoRenameBaseName("Show.720p-4sf", "show.s01e05.720p.mkv");
expect(result).toBe("Show.720p.S01E05-4sf");
});
it("handles case-insensitive 4SF suffix", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4SF", "show.s01e02.720p.mkv");
expect(result).toBe("Show.S01E02.720p-4SF");
});
it("handles rp + no quality token in folder", () => {
const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e05.rp.mkv");
expect(result).toBe("Show.S01E05.REPACK-4sf");
});
it("returns null for empty folder name", () => {
const result = buildAutoRenameBaseName("", "show.s01e01.mkv");
expect(result).toBeNull();
});
it("returns null for empty source file name", () => {
const result = buildAutoRenameBaseName("Show.S01-4sf", "");
expect(result).toBeNull();
});
// Edge cases
it("handles 2160p quality token", () => {
const result = buildAutoRenameBaseName("Show.S01.2160p-4sf", "show.s01e01.rp.2160p.mkv");
expect(result).toBe("Show.S01E01.REPACK.2160p-4sf");
});
it("handles 480p quality token", () => {
const result = buildAutoRenameBaseName("Show.S01.480p-4sf", "show.s01e07.480p.mkv");
expect(result).toBe("Show.S01E07.480p-4sf");
});
it("does not trigger on folders ending with similar but wrong suffix", () => {
expect(buildAutoRenameBaseName("Show.S01-4sfx", "show.s01e01.mkv")).toBeNull();
expect(buildAutoRenameBaseName("Show.S01-x4sf", "show.s01e01.mkv")).toBeNull();
});
it("handles high season and episode numbers", () => {
const result = buildAutoRenameBaseName("Show.S99.720p-4sf", "show.s99e999.720p.mkv");
// SCENE_EPISODE_RE allows up to 3-digit episodes and 2-digit seasons
expect(result).not.toBeNull();
expect(result!).toContain("S99E999");
});
// Real-world scene release patterns
it("real-world: German series with dots", () => {
const result = buildAutoRenameBaseName(
"Der.Bergdoktor.S18.German.720p.WEB.x264-4SJ",
"der.bergdoktor.s18e01.german.720p.web.x264"
);
expect(result).toBe("Der.Bergdoktor.S18E01.German.720p.WEB.x264-4SJ");
});
it("real-world: English series with rp token", () => {
const result = buildAutoRenameBaseName(
"The.Last.of.Us.S02.1080p.WEB-4SF",
"the.last.of.us.s02e03.rp.1080p.web"
);
expect(result).toBe("The.Last.of.Us.S02E03.REPACK.1080p.WEB-4SF");
});
it("real-world: multiple dots in name", () => {
const result = buildAutoRenameBaseName(
"Grey.s.Anatomy.S21.German.DL.720p.WEB.x264-4SJ",
"grey.s.anatomy.s21e08.german.dl.720p.web.x264"
);
expect(result).toBe("Grey.s.Anatomy.S21E08.German.DL.720p.WEB.x264-4SJ");
});
it("real-world: 4K content", () => {
const result = buildAutoRenameBaseName(
"Severance.S02.2160p.ATVP.WEB-DL.DDP5.1.DV.H.265-4SF",
"severance.s02e07.2160p.atvp.web-dl.ddp5.1.dv.h.265"
);
expect(result).toBe("Severance.S02E07.2160p.ATVP.WEB-DL.DDP5.1.DV.H.265-4SF");
});
it("real-world: Britannia release keeps folder base name", () => {
const result = buildAutoRenameBaseName(
"Britannia.S02.GERMAN.720p.WEBRiP.x264-LAW",
"law-britannia.s02e01.720p.webrip"
);
expect(result).toBe("Britannia.S02.GERMAN.720p.WEBRiP.x264-LAW");
});
it("real-world: Britannia repack injects REPACK", () => {
const result = buildAutoRenameBaseName(
"Britannia.S02.GERMAN.720p.WEBRiP.x264-LAW",
"law-britannia.s02e09.720p.webrip.repack"
);
expect(result).toBe("Britannia.S02.GERMAN.REPACK.720p.WEBRiP.x264-LAW");
});
it("adds REPACK when folder name carries RP hint", () => {
const result = buildAutoRenameBaseName(
"Banshee.S02E01.German.RP.720p.BluRay.x264-RIPLEY",
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02E01.German.REPACK.720p.BluRay.x264-RIPLEY");
});
it("real-world: folder already has wrong episode", () => {
const result = buildAutoRenameBaseName(
"Cobra.Kai.S06E01.720p.NF.WEB-DL.DDP5.1.x264-4SF",
"cobra.kai.s06e14.720p.nf.web-dl.ddp5.1.x264"
);
expect(result).toBe("Cobra.Kai.S06E14.720p.NF.WEB-DL.DDP5.1.x264-4SF");
});
// Bug-hunting edge cases
it("source filename extension is not included in episode detection", () => {
// The sourceFileName passed to buildAutoRenameBaseName is the basename without extension
// so .mkv should not interfere, but let's verify with an actual extension
const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e01.mkv");
// "mkv" should not be treated as part of the filename match
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
});
it("does not match episode-like patterns in codec strings", () => {
// h.265 has digits but should not be confused with episode tokens
const token = extractEpisodeToken("show.s01e01.h.265");
expect(token).toBe("S01E01");
});
it("handles folder with dash separators throughout", () => {
const result = buildAutoRenameBaseName(
"Show-Name-S01-720p-4sf",
"show-name-s01e05-720p"
);
expect(result).toBe("Show-Name-S01E05-720p-4sf");
});
it("does not duplicate episode when folder already has the same episode", () => {
const result = buildAutoRenameBaseName(
"Show.S01E05.720p-4sf",
"show.s01e05.720p"
);
// Must NOT produce "Show.S01E05.720p.S01E05-4sf" (double episode bug)
expect(result).toBe("Show.S01E05.720p-4sf");
});
it("handles folder with only -4sf suffix (edge case)", () => {
const result = buildAutoRenameBaseName("-4sf", "show.s01e01.mkv");
// Extreme edge case - sanitizeFilename trims leading dots
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
expect(result!).toContain("-4sf");
expect(result!).not.toContain(".S01E01.S01E01"); // no duplication
});
it("sanitizes special characters from result", () => {
// sanitizeFilename should strip dangerous chars
const result = buildAutoRenameBaseName("Show:Name.S01-4sf", "show.s01e01.mkv");
// The colon should be sanitized away
expect(result).not.toBeNull();
expect(result!).not.toContain(":");
});
});
describe("buildAutoRenameBaseNameFromFolders", () => {
it("uses parent folder when current folder is not a scene template", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Episode 01",
"Banshee.S02.German.720p.BluRay.x264-RIPLEY"
],
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02.German.720p.BluRay.x264-RIPLEY");
});
it("uses nested scene subfolder directly", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Banshee.S02E01.German.720p.BluRay.x264-RIPLEY",
"Banshee.S02.German.720p.BluRay.x264-RIPLEY"
],
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02E01.German.720p.BluRay.x264-RIPLEY");
});
it("injects REPACK when parent folder carries repack hint", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Banshee.S02E01.German.720p.BluRay.x264-RIPLEY",
"Banshee.S02.German.RP.720p.BluRay.x264-RIPLEY"
],
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02E01.German.REPACK.720p.BluRay.x264-RIPLEY");
});
it("uses nested Arrow episode folder with title", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Arrow.S04E01.Green.Arrow.German.DL.720p.BluRay.x264-RSG",
"Arrow.S04.German.DL.720p.BluRay.x264-RSG"
],
"rsg-arrow-s04e01-720p"
);
expect(result).toBe("Arrow.S04E01.Green.Arrow.German.DL.720p.BluRay.x264-RSG");
});
it("adds REPACK for Arrow when source contains rp token", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Arrow.S04E01.Green.Arrow.German.DL.720p.BluRay.x264-RSG",
"Arrow.S04.German.DL.720p.BluRay.x264-RSG"
],
"rsg-arrow-s04e01.rp.720p"
);
expect(result).toBe("Arrow.S04E01.Green.Arrow.German.DL.REPACK.720p.BluRay.x264-RSG");
});
it("converts Teil token to episode using parent season", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Last.Impact.Der.Einschlag.Teil1.GERMAN.DL.720p.WEB.H264-SunDry",
"Last.Impact.Der.Einschlag.S01.GERMAN.DL.720p.WEB.H264-SunDry"
],
"sundry-last.impact.der.einschlag.teil1.720p.web.h264"
);
expect(result).toBe("Last.Impact.Der.Einschlag.S01E01.GERMAN.DL.720p.WEB.H264-SunDry");
});
it("converts Teil token to episode with REPACK", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Last.Impact.Der.Einschlag.Teil1.GERMAN.DL.720p.WEB.H264-SunDry",
"Last.Impact.Der.Einschlag.S01.GERMAN.DL.720p.WEB.H264-SunDry"
],
"sundry-last.impact.der.einschlag.teil1.rp.720p.web.h264"
);
expect(result).toBe("Last.Impact.Der.Einschlag.S01E01.GERMAN.DL.REPACK.720p.WEB.H264-SunDry");
});
it("forces episode insertion for flat season folder when many files share directory", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Arrow.S08.GERMAN.DUBBED.DL.720p.BluRay.x264-TMSF"
],
"tmsf-arrow-s08e03-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Arrow.S08E03.GERMAN.DUBBED.DL.720p.BluRay.x264-TMSF");
});
it("forces episode insertion plus REPACK for flat season folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Gotham.S05.GERMAN.DUBBED.720p.BLURAY.x264-ZZGtv"
],
"zzgtv-gotham-s05e02.rp",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Gotham.S05E02.GERMAN.DUBBED.REPACK.720p.BLURAY.x264-ZZGtv");
});
it("uses nested episode title folder for Gotham TvR style", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Gotham.S04E01.Pax.Penguina.GERMAN.5.1.DL.AC3.720p.BDRiP.x264-TvR",
"Gotham.S04.GERMAN.5.1.DL.AC3.720p.BDRiP.x264-TvR"
],
"tvr-gotham-s04e01-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Gotham.S04E01.Pax.Penguina.GERMAN.5.1.DL.AC3.720p.BDRiP.x264-TvR");
});
it("uses nested title folder for Britannia TV4A style", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Britannia.S01E01.Die.Landung.German.DL.720p.BluRay.x264-TV4A",
"Britannia.S01.German.DL.720p.BluRay.x264-TV4A"
],
"tv4a-britannia.s01e01-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Britannia.S01E01.Die.Landung.German.DL.720p.BluRay.x264-TV4A");
});
it("handles odd source token style 101 by using nested Agent X folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Agent.X.S01E01.Pilot.German.DD51.Dubbed.DL.720p.iTunesHD.x264-TVS",
"Agent.X.S01.German.DD51.Dubbed.DL.720p.iTunesHD.x264-TVS"
],
"tvs-agent-x-dd51-ded-dl-7p-ithd-x264-101",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Agent.X.S01E01.Pilot.German.DD51.Dubbed.DL.720p.iTunesHD.x264-TVS");
});
it("maps compact code 301 to S03E01 for nested Legion folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Legion.S03E01.Kapitel.20.German.DD51.Dubbed.DL.720p.AmazonHD.AVC-TVS",
"Legion.S03.German.DD51.Dubbed.DL.720p.AmazonHD.AVC-TVS"
],
"tvs-legion-dd51-ded-dl-7p-azhd-avc-301",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Legion.S03E01.Kapitel.20.German.DD51.Dubbed.DL.720p.AmazonHD.AVC-TVS");
});
it("maps compact code 211 in flat season folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Lethal.Weapon.S02.German.DD51.Dubbed.DL.720p.AmazonHD.x264-TVS"
],
"tvs-lethal-weapon-dd51-ded-dl-7p-azhd-x264-211",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Lethal.Weapon.S02E11.German.DD51.Dubbed.DL.720p.AmazonHD.x264-TVS");
});
it("maps episode-only token e01 via season folder hint and keeps REPACK", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"tmsf-cheatderbetrug-e01-720p-repack",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E01.GERMAN.REPACK.720p.WEB.h264-TMSF");
});
it("maps episode-only token e02 via season folder hint", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"tmsf-cheatderbetrug-e02-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E02.GERMAN.720p.WEB.h264-TMSF");
});
it("keeps renaming for odd source order like 4sf-bs-720p-s01e05", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"4sf-bs-720p-s01e05",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E05.GERMAN.720p.WEB.h264-TMSF");
});
it("accepts lowercase scene group suffixes", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-tmsf"
],
"tmsf-cheatderbetrug-e01-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E01.GERMAN.720p.WEB.h264-tmsf");
});
it("renames double episode file into season folder (Mammon style)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e01e02-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("renames second double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e03e04-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E03E04.German.1080P.Bluray.x264-SMAHD");
});
it("renames third double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e05e06-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E05E06.German.1080P.Bluray.x264-SMAHD");
});
// Last-resort fallback: folder has season but no scene group suffix (user-renamed packages)
it("renames when folder has season but no scene group suffix (Mystery Road case)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mystery Road S02E05");
});
it("renames with season-only folder and custom name without dots", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Meine Serie S03"],
"meine-serie-s03e10-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Meine Serie S03E10");
});
it("prefers scene-group folder over season-only fallback", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mystery Road S02",
"Mystery.Road.S02.GERMAN.DL.AC3.720p.HDTV.x264-hrs"
],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
// Should use the scene-group folder (hrs), not the custom one
expect(result).toBe("Mystery.Road.S02E05.GERMAN.DL.AC3.720p.HDTV.x264-hrs");
});
it("does not use season-only fallback when forceEpisodeForSeasonFolder is false", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: false }
);
expect(result).toBeNull();
});
it("renames Riviera S02 with single-digit episode s02e2", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Riviera.S02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP"],
"tvp-riviera-s02e2-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Riviera.S02E02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP");
});
it("renames Room 104 abbreviated source r104.de.dl.web.7p-s04e02", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"r104.de.dl.web.7p-s04e02",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E02.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
it("renames Room 104 wayne source with episode", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"room.104.s04e01.german.dl.720p.web.h264-wayne",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E01.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
});
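The season-folder renaming behavior these tests pin down can be sketched roughly as below. This is an illustrative reconstruction, not the real `buildAutoRenameBaseNameFromFoldersWithOptions`; the function name and regexes are assumptions. It only shows the core idea: lift the episode tag (including double episodes like `s01e01e02` and single-digit forms like `s02e2`) out of the source name and splice it into the folder's bare season token.

```typescript
// Hypothetical sketch of the renaming idea the tests above exercise.
function spliceEpisodeIntoSeasonName(folderName: string, sourceName: string): string | null {
  // Find an episode tag like "s01e05" or "s01e01e02" in the source (case-insensitive).
  const epMatch = sourceName.match(/s(\d{1,2})((?:e\d{1,3})+)/i);
  if (!epMatch) return null;
  const season = epMatch[1].padStart(2, "0");
  // Normalize each episode number to two digits (handles "s02e2" -> "E02").
  const episodes = (epMatch[2].match(/e\d{1,3}/gi) ?? [])
    .map((e) => "E" + e.slice(1).padStart(2, "0"))
    .join("");
  // Replace the bare season token (e.g. "S01" not already followed by "E")
  // in the folder name, keeping language/resolution/group tags untouched.
  const seasonToken = new RegExp(`S${season}(?!E)`, "i");
  if (!seasonToken.test(folderName)) return null;
  return folderName.replace(seasonToken, `S${season}${episodes}`);
}
```

A sketch under stated assumptions; the real implementation also scores multiple candidate folders, which this omits.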

tests/cleanup.test.ts Normal file

@@ -0,0 +1,109 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { cleanupCancelledPackageArtifacts, removeDownloadLinkArtifacts, removeSampleArtifacts } from "../src/main/cleanup";
const tempDirs: string[] = [];
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("cleanup", () => {
it("removes archive artifacts but keeps media", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
fs.writeFileSync(path.join(dir, "release.part1.rar"), "x");
fs.writeFileSync(path.join(dir, "movie.mkv"), "x");
const removed = cleanupCancelledPackageArtifacts(dir);
expect(removed).toBeGreaterThan(0);
expect(fs.existsSync(path.join(dir, "release.part1.rar"))).toBe(false);
expect(fs.existsSync(path.join(dir, "movie.mkv"))).toBe(true);
});
it("removes sample artifacts and link files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
fs.mkdirSync(path.join(dir, "Samples"), { recursive: true });
fs.writeFileSync(path.join(dir, "Samples", "demo-sample.mkv"), "x");
fs.writeFileSync(path.join(dir, "download_links.txt"), "https://example.com/a\n");
const links = await removeDownloadLinkArtifacts(dir);
const samples = await removeSampleArtifacts(dir);
expect(links).toBeGreaterThan(0);
expect(samples.files + samples.dirs).toBeGreaterThan(0);
});
it("cleans up archive files in nested directories", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
// Create nested directory structure with archive files
const sub1 = path.join(dir, "season1");
const sub2 = path.join(dir, "season1", "extras");
fs.mkdirSync(sub2, { recursive: true });
fs.writeFileSync(path.join(sub1, "episode.part1.rar"), "x");
fs.writeFileSync(path.join(sub1, "episode.part2.rar"), "x");
fs.writeFileSync(path.join(sub2, "bonus.zip"), "x");
fs.writeFileSync(path.join(sub2, "bonus.7z"), "x");
// Non-archive files should be kept
fs.writeFileSync(path.join(sub1, "video.mkv"), "real content");
fs.writeFileSync(path.join(sub2, "subtitle.srt"), "subtitle content");
const removed = cleanupCancelledPackageArtifacts(dir);
expect(removed).toBe(4); // 2 rar parts + zip + 7z
expect(fs.existsSync(path.join(sub1, "episode.part1.rar"))).toBe(false);
expect(fs.existsSync(path.join(sub1, "episode.part2.rar"))).toBe(false);
expect(fs.existsSync(path.join(sub2, "bonus.zip"))).toBe(false);
expect(fs.existsSync(path.join(sub2, "bonus.7z"))).toBe(false);
// Non-archives kept
expect(fs.existsSync(path.join(sub1, "video.mkv"))).toBe(true);
expect(fs.existsSync(path.join(sub2, "subtitle.srt"))).toBe(true);
});
it("detects link artifacts by URL content in text files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
// File with link-like name containing URLs should be removed
fs.writeFileSync(path.join(dir, "download_links.txt"), "https://rapidgator.net/file/abc123\nhttps://uploaded.net/file/def456\n");
// File with link-like name but no URLs should be kept
fs.writeFileSync(path.join(dir, "my_downloads.txt"), "Just some random text without URLs");
// Regular text file that doesn't match the link pattern should be kept
fs.writeFileSync(path.join(dir, "readme.txt"), "https://example.com");
// .url files should always be removed
fs.writeFileSync(path.join(dir, "bookmark.url"), "[InternetShortcut]\nURL=https://example.com");
// .dlc files should always be removed
fs.writeFileSync(path.join(dir, "container.dlc"), "encrypted-data");
const removed = await removeDownloadLinkArtifacts(dir);
expect(removed).toBeGreaterThanOrEqual(3); // download_links.txt + bookmark.url + container.dlc
expect(fs.existsSync(path.join(dir, "download_links.txt"))).toBe(false);
expect(fs.existsSync(path.join(dir, "bookmark.url"))).toBe(false);
expect(fs.existsSync(path.join(dir, "container.dlc"))).toBe(false);
// Non-matching files should be kept
expect(fs.existsSync(path.join(dir, "readme.txt"))).toBe(true);
});
it("does not recurse into sample symlink or junction targets", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
const external = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-ext-"));
tempDirs.push(dir, external);
const outsideFile = path.join(external, "outside-sample.mkv");
fs.writeFileSync(outsideFile, "keep", "utf8");
const linkedSampleDir = path.join(dir, "sample");
const linkType: fs.symlink.Type = process.platform === "win32" ? "junction" : "dir";
fs.symlinkSync(external, linkedSampleDir, linkType);
const result = await removeSampleArtifacts(dir);
expect(result.files).toBe(0);
expect(fs.existsSync(outsideFile)).toBe(true);
});
});
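A minimal sketch of the recursive cleanup behavior the tests above describe. The function name and extension set here are assumptions, not the app's actual `cleanupCancelledPackageArtifacts`: walk the tree, remove archives by extension, keep everything else, and never descend into symlinks or junctions so linked targets stay untouched.

```typescript
import fs from "node:fs";
import path from "node:path";

// Extensions treated as archive artifacts in this sketch (assumed set).
const ARCHIVE_EXTENSIONS = new Set([".rar", ".zip", ".7z"]);

// Recursively delete archive files under root; returns the removal count.
function removeArchiveArtifacts(root: string): number {
  let removed = 0;
  for (const entry of fs.readdirSync(root, { withFileTypes: true })) {
    const full = path.join(root, entry.name);
    if (entry.isSymbolicLink()) continue; // never recurse into link targets
    if (entry.isDirectory()) {
      removed += removeArchiveArtifacts(full);
    } else if (ARCHIVE_EXTENSIONS.has(path.extname(entry.name).toLowerCase())) {
      fs.rmSync(full);
      removed += 1;
    }
  }
  return removed;
}
```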

tests/container.test.ts Normal file

@@ -0,0 +1,210 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import { importDlcContainers } from "../src/main/container";
const tempDirs: string[] = [];
const originalFetch = globalThis.fetch;
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("container", () => {
it("skips oversized DLC files without throwing and blocking other files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const oversizedFilePath = path.join(dir, "oversized.dlc");
fs.writeFileSync(oversizedFilePath, Buffer.alloc((8 * 1024 * 1024) + 1, 1));
// Create a mock DLC that would also be skipped if the oversized file threw
const validFilePath = path.join(dir, "valid.dlc");
// Just needs to be short enough to pass file limits but fail parsing, triggering dcrypt fallback
fs.writeFileSync(validFilePath, Buffer.from("Valid but not real DLC content..."));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("http://example.com/file1.rar\nhttp://example.com/file2.rar", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([oversizedFilePath, validFilePath]);
// Expect the oversized file to be silently skipped and the valid one to be parsed into 1 package named after the DLC file
expect(result).toHaveLength(1);
expect(result[0].name).toBe("valid");
expect(result[0].links).toEqual(["http://example.com/file1.rar", "http://example.com/file2.rar"]);
expect(fetchSpy).toHaveBeenCalledTimes(1);
});
it("skips non-dlc files completely", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-non-"));
tempDirs.push(dir);
const txtPath = path.join(dir, "links.txt");
fs.writeFileSync(txtPath, "http://link.com/1");
const result = await importDlcContainers([txtPath]);
expect(result).toEqual([]);
});
it("falls back to dcrypt if local decryption returns empty", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "fallback.dlc");
// A file large enough to trigger the local decryption attempt (needs > 89 bytes to pass the slice check)
fs.writeFileSync(filePath, Buffer.alloc(100, 1).toString("base64"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
// Mock local RC service failure (returning 404)
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
// Mock dcrypt fallback success
return new Response("http://fallback.com/1", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([filePath]);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("fallback");
expect(result[0].links).toEqual(["http://fallback.com/1"]);
// Should have tried both!
expect(fetchSpy).toHaveBeenCalledTimes(2);
});
it("falls back to dcrypt when local decryption throws invalid padding", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "invalid-local.dlc");
fs.writeFileSync(filePath, "X".repeat(120));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response(`<rc>${Buffer.alloc(16).toString("base64")}</rc>`, { status: 200 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("http://example.com/fallback1", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([filePath]);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("invalid-local");
expect(result[0].links).toEqual(["http://example.com/fallback1"]);
expect(fetchSpy).toHaveBeenCalledTimes(2);
});
it("falls back to paste endpoint when upload returns 413", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "big-dlc.dlc");
fs.writeFileSync(filePath, Buffer.alloc(100, 1).toString("base64"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("Request Entity Too Large", { status: 413 });
}
if (urlStr.includes("dcrypt.it/decrypt/paste")) {
return new Response("http://paste-fallback.com/file1.rar\nhttp://paste-fallback.com/file2.rar", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([filePath]);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("big-dlc");
expect(result[0].links).toEqual(["http://paste-fallback.com/file1.rar", "http://paste-fallback.com/file2.rar"]);
// local RC + upload + paste = 3 calls
expect(fetchSpy).toHaveBeenCalledTimes(3);
});
it("throws when both dcrypt endpoints return 413", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "huge.dlc");
fs.writeFileSync(filePath, Buffer.alloc(100, 1).toString("base64"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("Request Entity Too Large", { status: 413 });
}
if (urlStr.includes("dcrypt.it/decrypt/paste")) {
return new Response("Request Entity Too Large", { status: 413 });
}
return new Response("", { status: 500 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
await expect(importDlcContainers([filePath])).rejects.toThrow(/zu groß für dcrypt/i);
});
it("throws when upload returns 413 and paste returns 500", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "doomed.dlc");
fs.writeFileSync(filePath, Buffer.from("not a valid dlc payload at all"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("Request Entity Too Large", { status: 413 });
}
if (urlStr.includes("dcrypt.it/decrypt/paste")) {
return new Response("paste failure", { status: 500 });
}
return new Response("", { status: 500 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
await expect(importDlcContainers([filePath])).rejects.toThrow(/DLC konnte nicht importiert werden/i);
});
it("throws clear error when all dlc imports fail", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "broken.dlc");
fs.writeFileSync(filePath, Buffer.from("not a valid dlc payload at all"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("upstream failure", { status: 500 });
}
return new Response("", { status: 500 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
await expect(importDlcContainers([filePath])).rejects.toThrow(/DLC konnte nicht importiert werden/i);
});
});
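The dcrypt fallback chain the last four tests exercise can be sketched as follows. Everything here is illustrative and hedged; the real `importDlcContainers` also tries the local jdownloader RC-key service first, which this sketch omits. The shape the tests fix is: try the upload endpoint, fall back to the paste endpoint on 413, raise the "zu groß für dcrypt" error when both return 413, and the generic import error otherwise.

```typescript
// Hypothetical sketch of the dcrypt upload -> paste fallback; `post` is an
// injected HTTP helper so the chain is testable without network access.
async function decryptDlcSketch(
  payload: string,
  post: (url: string, body: string) => Promise<Response>
): Promise<string[]> {
  const upload = await post("https://dcrypt.it/decrypt/upload", payload);
  if (upload.status === 200) {
    return (await upload.text()).split("\n").filter(Boolean);
  }
  if (upload.status === 413) {
    // Payload too large for upload: retry via the paste endpoint.
    const paste = await post("https://dcrypt.it/decrypt/paste", payload);
    if (paste.status === 200) {
      return (await paste.text()).split("\n").filter(Boolean);
    }
    if (paste.status === 413) throw new Error("DLC zu groß für dcrypt");
  }
  throw new Error("DLC konnte nicht importiert werden");
}
```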

tests/debrid.test.ts Normal file

@@ -0,0 +1,792 @@
import { afterEach, describe, expect, it, vi } from "vitest";
import { defaultSettings, REQUEST_RETRIES } from "../src/main/constants";
import { DebridService, extractRapidgatorFilenameFromHtml, filenameFromRapidgatorUrlPath, normalizeResolvedFilename } from "../src/main/debrid";
const originalFetch = globalThis.fetch;
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
});
describe("debrid service", () => {
it("falls back to Mega web when Real-Debrid fails", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
megaLogin: "user",
megaPassword: "pass",
bestToken: "",
providerPrimary: "realdebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "bestdebrid" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response(JSON.stringify({ error: "traffic_limit" }), {
status: 403,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "file.bin",
directUrl: "https://mega-web.example/file.bin",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
const result = await service.unrestrictLink("https://rapidgator.net/file/example.part1.rar.html");
expect(result.provider).toBe("megadebrid");
expect(result.directUrl).toBe("https://mega-web.example/file.bin");
expect(megaWeb).toHaveBeenCalledTimes(1);
});
it("does not fallback when auto fallback is disabled", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "realdebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "bestdebrid" as const,
autoProviderFallback: false
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response("traffic exhausted", { status: 429 });
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "unused.bin",
directUrl: "https://unused",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part2.rar.html")).rejects.toThrow();
expect(megaWeb).toHaveBeenCalledTimes(0);
});
it("uses BestDebrid auth header without token query fallback", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "best-token",
providerPrimary: "bestdebrid" as const,
providerSecondary: "realdebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
const calledUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
calledUrls.push(url);
if (url.includes("/api/v1/generateLink?link=")) {
return new Response(JSON.stringify({ download: "https://best.example/file.bin", filename: "file.bin", filesize: 2048 }), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://rapidgator.net/file/example.part3.rar.html");
expect(result.provider).toBe("bestdebrid");
expect(result.fileSize).toBe(2048);
expect(calledUrls.some((url) => url.includes("auth="))).toBe(false);
});
it("sends Bearer auth header to BestDebrid", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "best-token",
providerPrimary: "bestdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
let authHeader = "";
globalThis.fetch = (async (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("/api/v1/generateLink?link=")) {
const headers = init?.headers;
if (headers instanceof Headers) {
authHeader = headers.get("Authorization") || "";
} else if (Array.isArray(headers)) {
const tuple = headers.find(([key]) => key.toLowerCase() === "authorization");
authHeader = tuple?.[1] || "";
} else {
authHeader = String((headers as Record<string, unknown> | undefined)?.Authorization || "");
}
return new Response(JSON.stringify({ download: "https://best.example/file.bin", filename: "file.bin", filesize: 42 }), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://hoster.example/file/abc");
expect(result.provider).toBe("bestdebrid");
expect(authHeader).toBe("Bearer best-token");
});
it("does not retry BestDebrid auth failures (401)", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "best-token",
providerPrimary: "bestdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
let calls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("/api/v1/generateLink?link=")) {
calls += 1;
return new Response(JSON.stringify({ message: "Unauthorized" }), {
status: 401,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
await expect(service.unrestrictLink("https://hoster.example/file/no-retry")).rejects.toThrow();
expect(calls).toBe(1);
});
it("does not retry AllDebrid auth failures (403)", async () => {
const settings = {
...defaultSettings(),
allDebridToken: "ad-token",
providerPrimary: "alldebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
let calls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/unlock")) {
calls += 1;
return new Response(JSON.stringify({ status: "error", error: { message: "forbidden" } }), {
status: 403,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
await expect(service.unrestrictLink("https://hoster.example/file/no-retry-ad")).rejects.toThrow();
expect(calls).toBe(1);
});
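The two no-retry tests above imply a retry policy along these lines. This is a hedged sketch with hypothetical names (the real service presumably derives the attempt count from `REQUEST_RETRIES`): transient failures are retried, but 401/403 auth failures are treated as permanent and abort after the first attempt.

```typescript
// Illustrative retry loop: `doFetch` is injected so no real network is needed.
async function fetchWithRetrySketch(
  doFetch: () => Promise<Response>,
  retries: number
): Promise<Response> {
  let last: Response | null = null;
  for (let attempt = 0; attempt <= retries; attempt += 1) {
    last = await doFetch();
    if (last.ok) return last;
    if (last.status === 401 || last.status === 403) break; // auth error: no retry
  }
  throw new Error(`request failed with status ${last?.status ?? "unknown"}`);
}
```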
it("supports AllDebrid unlock", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "ad-token",
providerPrimary: "alldebrid" as const,
providerSecondary: "realdebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/unlock")) {
return new Response(JSON.stringify({
status: "success",
data: {
link: "https://alldebrid.example/file.bin",
filename: "file.bin",
filesize: 4096
}
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://rapidgator.net/file/example.part4.rar.html");
expect(result.provider).toBe("alldebrid");
expect(result.directUrl).toBe("https://alldebrid.example/file.bin");
expect(result.fileSize).toBe(4096);
});
it("treats MegaDebrid as not configured when web fallback callback is unavailable", async () => {
const settings = {
...defaultSettings(),
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: false
};
const service = new DebridService(settings);
await expect(service.unrestrictLink("https://rapidgator.net/file/missing-mega-web")).rejects.toThrow(/nicht konfiguriert/i);
});
it("uses Mega web path exclusively", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
const fetchSpy = vi.fn(async () => new Response("not-found", { status: 404 }));
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "from-web.rar",
directUrl: "https://www11.unrestrict.link/download/file/abc/from-web.rar",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
const result = await service.unrestrictLink("https://rapidgator.net/file/abc/from-web.rar.html");
expect(result.provider).toBe("megadebrid");
expect(result.directUrl).toContain("unrestrict.link/download/file/");
expect(megaWeb).toHaveBeenCalledTimes(1);
expect(fetchSpy).toHaveBeenCalledTimes(0);
});
it("aborts Mega web unrestrict when caller signal is cancelled", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: false
};
const megaWeb = vi.fn((_link: string, signal?: AbortSignal): Promise<never> => new Promise((_, reject) => {
const onAbort = (): void => reject(new Error("aborted:mega-web-test"));
if (signal?.aborted) {
onAbort();
return;
}
signal?.addEventListener("abort", onAbort, { once: true });
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
const controller = new AbortController();
const abortTimer = setTimeout(() => {
controller.abort("test");
}, 200);
try {
await expect(service.unrestrictLink("https://rapidgator.net/file/abort-mega-web", controller.signal)).rejects.toThrow(/aborted/i);
expect(megaWeb).toHaveBeenCalledTimes(1);
expect(megaWeb.mock.calls[0]?.[1]).toBe(controller.signal);
} finally {
clearTimeout(abortTimer);
}
});
it("respects provider selection and does not append hidden providers", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "ad-token",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
let allDebridCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/unlock")) {
allDebridCalls += 1;
return new Response(JSON.stringify({ status: "success", data: { link: "https://alldebrid.example/file.bin" } }), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => null);
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part5.rar.html")).rejects.toThrow();
expect(allDebridCalls).toBe(0);
});
it("does not use secondary provider when fallback is disabled and primary is missing", async () => {
const settings = {
...defaultSettings(),
token: "",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "realdebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "none" as const,
autoProviderFallback: false
};
const megaWeb = vi.fn(async () => ({
fileName: "should-not-run.bin",
directUrl: "https://unused",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part5.rar.html")).rejects.toThrow(/nicht konfiguriert/i);
expect(megaWeb).toHaveBeenCalledTimes(0);
});
it("allows disabling secondary and tertiary providers", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response(JSON.stringify({ error: "traffic_limit" }), {
status: 403,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "unused.bin",
directUrl: "https://unused",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part6.rar.html")).rejects.toThrow();
expect(megaWeb).toHaveBeenCalledTimes(0);
});
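A minimal sketch, with hypothetical names, of the provider-chain selection the debrid tests keep asserting: providers are tried in the configured primary, secondary, tertiary order, "none" slots are skipped, and when `autoProviderFallback` is off only the first configured provider is attempted.

```typescript
type Provider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid" | "none";

// Illustrative chain walker; `tryProvider` returns a direct URL or null.
async function unrestrictViaChain(
  chain: Provider[],
  autoFallback: boolean,
  tryProvider: (p: Provider) => Promise<string | null>
): Promise<{ provider: Provider; directUrl: string }> {
  const candidates = chain.filter((p) => p !== "none");
  // Without auto fallback, only the first configured provider may run.
  const attempts = autoFallback ? candidates : candidates.slice(0, 1);
  for (const provider of attempts) {
    const directUrl = await tryProvider(provider);
    if (directUrl) return { provider, directUrl };
  }
  throw new Error("no provider could unrestrict the link");
}
```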
it("resolves rapidgator filename from page when provider returns hash", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response(JSON.stringify({
download: "https://cdn.example/file.bin",
filename: "6f09df2984fe01378537c7cd8d7fa7ce",
filesize: 2048
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("rapidgator.net/file/6f09df2984fe01378537c7cd8d7fa7ce")) {
return new Response("<html><head><title>download file Banshee.S04E01.German.DL.720p.part01.rar - Rapidgator</title></head></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://rapidgator.net/file/6f09df2984fe01378537c7cd8d7fa7ce");
expect(result.provider).toBe("realdebrid");
expect(result.fileName).toBe("Banshee.S04E01.German.DL.720p.part01.rar");
});
it("resolves filenames for rg.to links", async () => {
const settings = {
...defaultSettings(),
allDebridToken: ""
};
const link = "https://rg.to/file/685cec6dcc1837dc725755fc9c726dd9";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url === link) {
return new Response("<html><head><title>Download file Bulletproof.S01E01.German.DL.DD20.Synced.720p.AmazonHD.h264-GDR.part01.rar</title></head></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.get(link)).toBe("Bulletproof.S01E01.German.DL.DD20.Synced.720p.AmazonHD.h264-GDR.part01.rar");
});
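Both filename-resolution tests above rely on scraping the `<title>` of the hoster page, which embeds the filename after "Download file". A sketch of that idea (not the app's actual `extractRapidgatorFilenameFromHtml`, just the same approach):

```typescript
// Pull the filename out of a hoster page title like
// "Download file Foo.part01.rar - Rapidgator"; returns null on no match.
function filenameFromTitleSketch(html: string): string | null {
  const title = html.match(/<title>([^<]*)<\/title>/i)?.[1] ?? "";
  const match = title.match(/download file\s+(\S+)/i);
  return match ? match[1] : null;
}
```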
it("does not unrestrict non-rapidgator links during filename scan", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true,
allDebridToken: ""
};
const linkFromPage = "https://rapidgator.net/file/11111111111111111111111111111111";
const linkFromProvider = "https://hoster.example/file/22222222222222222222222222222222";
let unrestrictCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url === linkFromPage) {
return new Response("<html><head><title>Download file from-page.part1.rar</title></head></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
unrestrictCalls += 1;
const body = init?.body;
const bodyText = body instanceof URLSearchParams ? body.toString() : String(body || "");
const linkValue = new URLSearchParams(bodyText).get("link") || "";
if (linkValue === linkFromProvider) {
return new Response(JSON.stringify({
download: "https://cdn.example/from-provider",
filename: "from-provider.part2.rar",
filesize: 1024
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const events: Array<{ link: string; fileName: string }> = [];
const resolved = await service.resolveFilenames([linkFromPage, linkFromProvider], (link, fileName) => {
events.push({ link, fileName });
});
expect(resolved.get(linkFromPage)).toBe("from-page.part1.rar");
expect(resolved.has(linkFromProvider)).toBe(false);
expect(unrestrictCalls).toBe(0);
expect(events).toEqual(expect.arrayContaining([
{ link: linkFromPage, fileName: "from-page.part1.rar" }
]));
});
it("does not unrestrict rapidgator links during filename scan after page lookup miss", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
allDebridToken: ""
};
const link = "https://rapidgator.net/file/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
let unrestrictCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
unrestrictCalls += 1;
return new Response(JSON.stringify({ error: "should-not-be-called" }), {
status: 500,
headers: { "Content-Type": "application/json" }
});
}
if (url === link) {
return new Response("not found", { status: 404 });
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.size).toBe(0);
expect(unrestrictCalls).toBe(0);
});
it("maps AllDebrid filename infos by index when response link is missing", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "ad-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
const linkA = "https://rapidgator.net/file/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
const linkB = "https://rapidgator.net/file/bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/infos")) {
return new Response(JSON.stringify({
status: "success",
data: {
infos: [
{ filename: "wrong-a.mkv" },
{ filename: "wrong-b.mkv" }
]
}
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url === linkA || url === linkB) {
return new Response("no title", { status: 404 });
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([linkA, linkB]);
expect(resolved.get(linkA)).toBe("wrong-a.mkv");
expect(resolved.get(linkB)).toBe("wrong-b.mkv");
expect(resolved.size).toBe(2);
});
it("retries AllDebrid filename infos after transient server error", async () => {
const settings = {
...defaultSettings(),
allDebridToken: "ad-token"
};
const link = "https://rapidgator.net/file/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
let infoCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/infos")) {
infoCalls += 1;
if (infoCalls === 1) {
return new Response("temporary error", { status: 500 });
}
return new Response(JSON.stringify({
status: "success",
data: {
infos: [
{ link, filename: "resolved-from-infos.mkv" }
]
}
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.get(link)).toBe("resolved-from-infos.mkv");
expect(infoCalls).toBe(2);
});
it("retries AllDebrid filename infos when HTML challenge is returned", async () => {
const settings = {
...defaultSettings(),
allDebridToken: "ad-token"
};
const link = "https://rapidgator.net/file/bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb";
let infoCalls = 0;
let pageCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/infos")) {
infoCalls += 1;
return new Response("<html><title>cf challenge</title></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
if (url === link) {
pageCalls += 1;
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.size).toBe(0);
expect(infoCalls).toBe(REQUEST_RETRIES);
expect(pageCalls).toBe(1);
});
});
describe("normalizeResolvedFilename", () => {
it("strips HTML entities", () => {
expect(normalizeResolvedFilename("Show.S01E01.German.DL.720p.part01.rar")).toBe("Show.S01E01.German.DL.720p.part01.rar");
expect(normalizeResolvedFilename("File&amp;Name.part1.rar")).toBe("File&Name.part1.rar");
expect(normalizeResolvedFilename("File&quot;Name&quot;.part1.rar")).toBe('File"Name".part1.rar');
});
it("strips HTML tags and collapses whitespace", () => {
// Tags are replaced by spaces, then multiple spaces collapsed
const result = normalizeResolvedFilename("<b>Show.S01E01</b>.part01.rar");
expect(result).toBe("Show.S01E01 .part01.rar");
// Entity decoding happens before tag removal, so &lt;...&gt; becomes <...> then gets stripped
const entityTagResult = normalizeResolvedFilename("File&lt;Tag&gt;.part1.rar");
expect(entityTagResult).toBe("File .part1.rar");
});
it("strips 'download file' prefix", () => {
expect(normalizeResolvedFilename("Download file Show.S01E01.part01.rar")).toBe("Show.S01E01.part01.rar");
expect(normalizeResolvedFilename("download file Movie.2024.mkv")).toBe("Movie.2024.mkv");
});
it("strips Rapidgator suffix", () => {
expect(normalizeResolvedFilename("Show.S01E01.part01.rar - Rapidgator")).toBe("Show.S01E01.part01.rar");
expect(normalizeResolvedFilename("Movie.mkv | Rapidgator.net")).toBe("Movie.mkv");
});
it("returns empty for opaque or non-filename values", () => {
expect(normalizeResolvedFilename("")).toBe("");
expect(normalizeResolvedFilename("just some text")).toBe("");
expect(normalizeResolvedFilename("e51f6809bb6ca615601f5ac5db433737")).toBe("");
expect(normalizeResolvedFilename("download.bin")).toBe("");
});
it("handles combined transforms", () => {
// "Download file" prefix stripped, &amp; decoded to &, "- Rapidgator" suffix stripped
expect(normalizeResolvedFilename("Download file Show.S01E01.part01.rar - Rapidgator"))
.toBe("Show.S01E01.part01.rar");
});
});
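The tests above pin down the whole normalization pipeline: entity decoding, tag stripping, prefix/suffix removal, and rejection of opaque values. A minimal sketch of that pipeline, under the assumption that these tests fully describe the behavior (the helper name `normalizeFilenameSketch` is illustrative; the real `normalizeResolvedFilename` internals may differ):

```typescript
// Hypothetical sketch of the normalization pipeline the tests above exercise.
function normalizeFilenameSketch(raw: string): string {
  // 1. Decode the handful of HTML entities seen in hoster page titles.
  let value = raw
    .replace(/&amp;/g, "&")
    .replace(/&quot;/g, '"')
    .replace(/&lt;/g, "<")
    .replace(/&gt;/g, ">");
  // 2. Replace tags with spaces, then collapse whitespace runs.
  value = value.replace(/<[^>]*>/g, " ").replace(/\s+/g, " ").trim();
  // 3. Strip the "Download file" prefix and Rapidgator suffixes.
  value = value.replace(/^download file\s+/i, "");
  value = value.replace(/\s*[-|]\s*Rapidgator(\.net)?\s*$/i, "");
  // 4. Reject opaque values: no extension, hash-only stems, generic names.
  const extMatch = /\.[a-z0-9]{2,4}$/i.exec(value);
  if (!extMatch) return "";
  const stem = value.slice(0, extMatch.index);
  if (/^[a-f0-9]{32,}$/i.test(stem) || /^download$/i.test(stem)) return "";
  return value;
}
```

Note that entity decoding runs before tag removal, which is exactly why `&lt;Tag&gt;` ends up stripped in the test above.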
describe("filenameFromRapidgatorUrlPath", () => {
it("extracts filename from standard rapidgator URL", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/abc123/Show.S01E01.part01.rar.html"))
.toBe("Show.S01E01.part01.rar");
});
it("extracts filename without .html suffix", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/abc123/Movie.2024.mkv"))
.toBe("Movie.2024.mkv");
});
it("returns empty for hash-only URL paths", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/e51f6809bb6ca615601f5ac5db433737"))
.toBe("");
});
it("returns empty for invalid URLs", () => {
expect(filenameFromRapidgatorUrlPath("not-a-url")).toBe("");
expect(filenameFromRapidgatorUrlPath("")).toBe("");
});
it("handles URL-encoded path segments", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/id/Show%20Name.S01E01.part01.rar.html"))
.toBe("Show Name.S01E01.part01.rar");
});
});
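A sketch of the URL-path extraction these tests describe, assuming the behavior shown is complete (the name `filenameFromUrlPathSketch` and the 16-hex threshold are assumptions; the real `filenameFromRapidgatorUrlPath` may apply stricter checks):

```typescript
// Hypothetical sketch: pull the filename out of a rapidgator file URL path.
function filenameFromUrlPathSketch(rawUrl: string): string {
  try {
    const parsed = new URL(rawUrl);
    const segments = parsed.pathname.split("/").filter(Boolean);
    // Paths look like /file/<id>/<name>[.html]; a trailing id alone is no filename.
    const last = segments[segments.length - 1] ?? "";
    const candidate = decodeURIComponent(last).replace(/\.html$/i, "");
    if (/^[a-f0-9]{16,}$/i.test(candidate)) return ""; // hash-only segment
    if (!candidate.includes(".")) return "";
    return candidate;
  } catch {
    return ""; // not a parseable absolute URL
  }
}
```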
describe("extractRapidgatorFilenameFromHtml", () => {
it("extracts filename from title tag", () => {
const html = "<html><head><title>Download file Show.S01E01.German.DL.720p.part01.rar - Rapidgator</title></head></html>";
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Show.S01E01.German.DL.720p.part01.rar");
});
it("extracts filename from og:title meta tag", () => {
const html = '<html><head><meta property="og:title" content="Movie.2024.German.DL.1080p.mkv"></head></html>';
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Movie.2024.German.DL.1080p.mkv");
});
it("extracts filename from reversed og:title attribute order", () => {
const html = '<html><head><meta content="Movie.2024.German.DL.1080p.mkv" property="og:title"></head></html>';
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Movie.2024.German.DL.1080p.mkv");
});
it("returns empty for HTML without recognizable filenames", () => {
const html = "<html><head><title>Rapidgator: Fast, Pair and Unlimited</title></head><body>No file here</body></html>";
expect(extractRapidgatorFilenameFromHtml(html)).toBe("");
});
it("returns empty for empty HTML", () => {
expect(extractRapidgatorFilenameFromHtml("")).toBe("");
});
it("ignores broad body text that is not a labeled filename", () => {
const html = "<html><body>Please download file now from mirror.mkv</body></html>";
expect(extractRapidgatorFilenameFromHtml(html)).toBe("");
});
it("extracts from File name label in page body", () => {
const html = '<html><body>File name: <b>Show.S02E03.720p.part01.rar</b></body></html>';
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Show.S02E03.720p.part01.rar");
});
});
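The first three tests above cover the `<title>` and `og:title` paths. A hedged sketch of just those two paths (the real `extractRapidgatorFilenameFromHtml` additionally handles "File name:" labels in the page body, which this sketch omits; all names here are illustrative):

```typescript
// Hypothetical sketch covering the <title> and og:title extraction paths.
function filenameFromHtmlSketch(html: string): string {
  const title = (/<title>([^<]*)<\/title>/i.exec(html)?.[1] ?? "").trim();
  // "Download file <name> - Rapidgator" titles carry the filename directly.
  const fromTitle = /^download file\s+(.+?)(?:\s*[-|]\s*Rapidgator.*)?$/i.exec(title)?.[1];
  if (fromTitle) return fromTitle;
  // og:title meta tag, in either attribute order.
  const og =
    /<meta[^>]*property=["']og:title["'][^>]*content=["']([^"']+)["']/i.exec(html) ??
    /<meta[^>]*content=["']([^"']+)["'][^>]*property=["']og:title["']/i.exec(html);
  const candidate = og?.[1] ?? "";
  // Only accept values that plausibly end in a file extension.
  return /\.[a-z0-9]{2,4}$/i.test(candidate) ? candidate : "";
}
```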

File diff suppressed because it is too large

tests/extractor-jvm.test.ts (new file, 204 lines)

@@ -0,0 +1,204 @@

import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { spawnSync } from "node:child_process";
import AdmZip from "adm-zip";
import { afterEach, describe, expect, it } from "vitest";
import { extractPackageArchives } from "../src/main/extractor";
const tempDirs: string[] = [];
const originalBackend = process.env.RD_EXTRACT_BACKEND;
function hasJavaRuntime(): boolean {
const result = spawnSync("java", ["-version"], { stdio: "ignore" });
return result.status === 0;
}
function hasJvmExtractorRuntime(): boolean {
const root = path.join(process.cwd(), "resources", "extractor-jvm");
const classesMain = path.join(root, "classes", "com", "sucukdeluxe", "extractor", "JBindExtractorMain.class");
const requiredLibs = [
path.join(root, "lib", "sevenzipjbinding.jar"),
path.join(root, "lib", "sevenzipjbinding-all-platforms.jar"),
path.join(root, "lib", "zip4j.jar")
];
return fs.existsSync(classesMain) && requiredLibs.every((libPath) => fs.existsSync(libPath));
}
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalBackend;
}
});
describe.skipIf(!hasJavaRuntime() || !hasJvmExtractorRuntime())("extractor jvm backend", () => {
it("extracts zip archives through SevenZipJBinding backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
const zipPath = path.join(packageDir, "release.zip");
const zip = new AdmZip();
zip.addFile("episode.txt", Buffer.from("ok"));
zip.writeZip(zipPath);
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.existsSync(path.join(targetDir, "episode.txt"))).toBe(true);
});
it("emits progress callbacks with archiveName and percent", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-progress-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create a ZIP with some content to trigger progress
const zipPath = path.join(packageDir, "progress-test.zip");
const zip = new AdmZip();
zip.addFile("file1.txt", Buffer.from("Hello World ".repeat(100)));
zip.addFile("file2.txt", Buffer.from("Another file ".repeat(100)));
zip.writeZip(zipPath);
const progressUpdates: Array<{
archiveName: string;
percent: number;
phase: string;
archivePercent?: number;
}> = [];
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
progressUpdates.push({
archiveName: update.archiveName,
percent: update.percent,
phase: update.phase,
archivePercent: update.archivePercent,
});
},
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
// Should include at least the preparing and extracting phases
const phases = new Set(progressUpdates.map((u) => u.phase));
expect(phases.has("preparing")).toBe(true);
expect(phases.has("extracting")).toBe(true);
// Extracting phase should include the archive name
const extracting = progressUpdates.filter((u) => u.phase === "extracting" && u.archiveName === "progress-test.zip");
expect(extracting.length).toBeGreaterThan(0);
// Should end at 100%
const lastExtracting = extracting[extracting.length - 1];
expect(lastExtracting.archivePercent).toBe(100);
// Files should exist
expect(fs.existsSync(path.join(targetDir, "file1.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "file2.txt"))).toBe(true);
});
it("extracts multiple archives sequentially with progress for each", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-multi-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create two separate ZIP archives
const zip1 = new AdmZip();
zip1.addFile("episode01.txt", Buffer.from("ep1 content"));
zip1.writeZip(path.join(packageDir, "archive1.zip"));
const zip2 = new AdmZip();
zip2.addFile("episode02.txt", Buffer.from("ep2 content"));
zip2.writeZip(path.join(packageDir, "archive2.zip"));
const archiveNames = new Set<string>();
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
if (update.phase === "extracting" && update.archiveName) {
archiveNames.add(update.archiveName);
}
},
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
// Both archive names should have appeared in progress
expect(archiveNames.has("archive1.zip")).toBe(true);
expect(archiveNames.has("archive2.zip")).toBe(true);
// Both files extracted
expect(fs.existsSync(path.join(targetDir, "episode01.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "episode02.txt"))).toBe(true);
});
it("respects ask/skip conflict mode in jvm backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zipPath = path.join(packageDir, "conflict.zip");
const zip = new AdmZip();
zip.addFile("same.txt", Buffer.from("new"));
zip.writeZip(zipPath);
const existingPath = path.join(targetDir, "same.txt");
fs.writeFileSync(existingPath, "old", "utf8");
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "ask",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.readFileSync(existingPath, "utf8")).toBe("old");
});
});

tests/extractor.test.ts (new file, 1089 lines)

File diff suppressed because it is too large

tests/integrity.test.ts (new file, 84 lines)

@@ -0,0 +1,84 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { parseHashLine, readHashManifest, validateFileAgainstManifest } from "../src/main/integrity";
const tempDirs: string[] = [];
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("integrity", () => {
it("parses md5 and sfv lines", () => {
const md = parseHashLine("d41d8cd98f00b204e9800998ecf8427e sample.bin");
expect(md?.algorithm).toBe("md5");
const sfv = parseHashLine("sample.bin 1A2B3C4D");
expect(sfv?.algorithm).toBe("crc32");
});
it("validates file against md5 manifest", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-int-"));
tempDirs.push(dir);
const filePath = path.join(dir, "movie.bin");
fs.writeFileSync(filePath, Buffer.from("hello"));
fs.writeFileSync(path.join(dir, "hash.md5"), "5d41402abc4b2a76b9719d911017c592 movie.bin\n");
const result = await validateFileAgainstManifest(filePath, dir);
expect(result.ok).toBe(true);
});
it("skips manifest files larger than 5MB", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-int-"));
tempDirs.push(dir);
// Create a .md5 manifest that exceeds the 5MB limit
const largeContent = "d41d8cd98f00b204e9800998ecf8427e sample.bin\n".repeat(200000);
const manifestPath = path.join(dir, "hashes.md5");
fs.writeFileSync(manifestPath, largeContent, "utf8");
// Verify the file is actually > 5MB
const stat = fs.statSync(manifestPath);
expect(stat.size).toBeGreaterThan(5 * 1024 * 1024);
// readHashManifest should skip the oversized file
const manifest = readHashManifest(dir);
expect(manifest.size).toBe(0);
});
it("does not parse SHA256 (64-char hex) as valid hash", () => {
// SHA256 is 64 chars - parseHashLine only supports 32 (MD5) and 40 (SHA1)
const sha256Line = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 emptyfile.bin";
const result = parseHashLine(sha256Line);
// 64-char hex should not match the MD5 (32) or SHA1 (40) pattern
expect(result).toBeNull();
});
it("parses SHA1 hash lines correctly", () => {
const sha1Line = "da39a3ee5e6b4b0d3255bfef95601890afd80709 emptyfile.bin";
const result = parseHashLine(sha1Line);
expect(result).not.toBeNull();
expect(result?.algorithm).toBe("sha1");
expect(result?.digest).toBe("da39a3ee5e6b4b0d3255bfef95601890afd80709");
expect(result?.fileName).toBe("emptyfile.bin");
});
it("ignores comment lines in hash manifests", () => {
expect(parseHashLine("; This is a comment")).toBeNull();
expect(parseHashLine("")).toBeNull();
expect(parseHashLine(" ")).toBeNull();
});
it("keeps first hash entry when duplicate filename appears across manifests", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-int-"));
tempDirs.push(dir);
fs.writeFileSync(path.join(dir, "disc1.md5"), "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa movie.mkv\n", "utf8");
fs.writeFileSync(path.join(dir, "disc2.md5"), "bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb movie.mkv\n", "utf8");
const manifest = readHashManifest(dir);
expect(manifest.get("movie.mkv")?.digest).toBe("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa");
});
});
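Taken together, these tests describe a line parser that accepts md5sum/sha1sum-style lines (hex digest first) and SFV-style lines (filename first, 8-hex CRC32), ignores comments, and rejects unsupported digest lengths. A minimal sketch under those assumptions (`parseHashLineSketch` and the `HashEntry` shape are illustrative; the real `parseHashLine` may differ):

```typescript
// Hypothetical sketch of a hash-manifest line parser matching the tests above.
type HashEntry = { algorithm: "md5" | "sha1" | "crc32"; digest: string; fileName: string };

function parseHashLineSketch(line: string): HashEntry | null {
  const trimmed = line.trim();
  if (!trimmed || trimmed.startsWith(";")) return null; // blank lines and SFV comments
  // md5sum/sha1sum style: "<32-or-40-hex> <filename>"
  const sum = /^([a-f0-9]{32}|[a-f0-9]{40})\s+\*?(.+)$/i.exec(trimmed);
  if (sum) {
    const digest = sum[1].toLowerCase();
    return {
      algorithm: digest.length === 32 ? "md5" : "sha1",
      digest,
      fileName: sum[2].trim()
    };
  }
  // SFV style: "<filename> <8-hex crc32>"
  const sfv = /^(.+?)\s+([a-f0-9]{8})$/i.exec(trimmed);
  if (sfv) {
    return { algorithm: "crc32", digest: sfv[2].toLowerCase(), fileName: sfv[1].trim() };
  }
  return null; // anything else (e.g. a 64-char SHA256 digest) is unsupported
}
```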

tests/link-parser.test.ts (new file, 74 lines)

@@ -0,0 +1,74 @@
import { describe, expect, it } from "vitest";
import { mergePackageInputs, parseCollectorInput } from "../src/main/link-parser";
describe("link-parser", () => {
describe("mergePackageInputs", () => {
it("merges packages with the same name and preserves order", () => {
const input = [
{ name: "Package A", links: ["http://link1", "http://link2"] },
{ name: "Package B", links: ["http://link3"] },
{ name: "Package A", links: ["http://link4", "http://link1"] },
{ name: "", links: ["http://link5"] } // empty name will be inferred
];
const result = mergePackageInputs(input);
expect(result).toHaveLength(3); // Package A, Package B, and inferred 'Paket'
const pkgA = result.find(p => p.name === "Package A");
expect(pkgA?.links).toEqual(["http://link1", "http://link2", "http://link4"]); // link1 deduplicated
const pkgB = result.find(p => p.name === "Package B");
expect(pkgB?.links).toEqual(["http://link3"]);
});
it("sanitizes names during merge", () => {
const input = [
{ name: "Valid_Name", links: ["http://link1"] },
{ name: "Valid?Name*", links: ["http://link2"] }
];
const result = mergePackageInputs(input);
// "Valid?Name*" becomes "Valid Name " -> trimmed to "Valid Name"
expect(result.map(p => p.name).sort()).toEqual(["Valid Name", "Valid_Name"]);
});
});
describe("parseCollectorInput", () => {
it("returns empty array for empty or invalid input", () => {
expect(parseCollectorInput("")).toEqual([]);
expect(parseCollectorInput("just some text without links")).toEqual([]);
expect(parseCollectorInput("ftp://notsupported")).toEqual([]);
});
it("parses and merges links from raw text", () => {
const rawText = `
Here are some links:
http://example.com/part1.rar
http://example.com/part2.rar
# package: Custom_Name
http://other.com/file1
http://other.com/file2
`;
const result = parseCollectorInput(rawText, "DefaultFallback");
// Should have 2 packages: "DefaultFallback" and "Custom_Name"
expect(result).toHaveLength(2);
const defaultPkg = result.find(p => p.name === "DefaultFallback");
expect(defaultPkg?.links).toEqual([
"http://example.com/part1.rar",
"http://example.com/part2.rar"
]);
const customPkg = result.find(p => p.name === "Custom_Name"); // underscore survives sanitization unchanged
expect(customPkg?.links).toEqual([
"http://other.com/file1",
"http://other.com/file2"
]);
});
});
});
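The merge behavior these tests lock in can be sketched as follows, assuming the tests fully characterize it: names are sanitized of filesystem-unsafe characters, empty names fall back to a default, same-named packages merge, and links deduplicate while preserving first-seen order (`mergePackagesSketch` is a hypothetical stand-in for `mergePackageInputs`):

```typescript
// Hypothetical sketch of package merging matching the tests above.
type PackageInput = { name: string; links: string[] };

function mergePackagesSketch(inputs: PackageInput[], fallback = "Paket"): PackageInput[] {
  const byName = new Map<string, { name: string; links: string[]; seen: Set<string> }>();
  for (const input of inputs) {
    // Replace characters that are illegal in folder names, collapse whitespace,
    // trim; empty names fall back to a generic default.
    const name = input.name.replace(/[\\/:*?"<>|]/g, " ").replace(/\s+/g, " ").trim() || fallback;
    const key = name.toLowerCase();
    let bucket = byName.get(key);
    if (!bucket) {
      bucket = { name, links: [], seen: new Set() };
      byName.set(key, bucket);
    }
    for (const link of input.links) {
      if (!bucket.seen.has(link)) {
        bucket.seen.add(link); // dedupe while preserving first-seen order
        bucket.links.push(link);
      }
    }
  }
  return [...byName.values()].map(({ name, links }) => ({ name, links }));
}
```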

@@ -0,0 +1,178 @@
import { afterEach, describe, expect, it, vi } from "vitest";
import { MegaWebFallback } from "../src/main/mega-web-fallback";
const originalFetch = globalThis.fetch;
describe("mega-web-fallback", () => {
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
});
describe("MegaWebFallback class", () => {
it("returns null when credentials are empty", async () => {
const fallback = new MegaWebFallback(() => ({ login: "", password: "" }));
const result = await fallback.unrestrict("https://mega.debrid/test");
expect(result).toBeNull();
});
it("logs in, fetches HTML, parses code, and polls AJAX for direct url", async () => {
let fetchCallCount = 0;
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
fetchCallCount += 1;
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return new Response("", { headers, status: 200 });
}
if (urlStr.includes("page=debrideur")) {
return new Response('<form id="debridForm"></form>', { status: 200 });
}
if (urlStr.includes("form=debrid")) {
// The POST to generate the code
return new Response(`
<div class="acp-box">
<h3>Link: https://mega.debrid/link1</h3>
<a href="javascript:processDebrid(1,'secretcode123',0)">Download</a>
</div>
`, { status: 200 });
}
if (urlStr.includes("ajax=debrid")) {
// Polling endpoint
return new Response(JSON.stringify({ link: "https://mega.direct/123" }), { status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "user", password: "pwd" }));
const result = await fallback.unrestrict("https://mega.debrid/link1");
expect(result).not.toBeNull();
expect(result?.directUrl).toBe("https://mega.direct/123");
expect(result?.fileName).toBe("link1");
// Calls: 1. Login POST, 2. Verify GET, 3. Generate POST, 4. Polling POST
expect(fetchCallCount).toBe(4);
});
it("throws if login fails to set cookie", async () => {
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("form=login")) {
const headers = new Headers(); // No cookie
return new Response("", { headers, status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "bad", password: "bad" }));
await expect(fallback.unrestrict("http://mega.debrid/file"))
.rejects.toThrow("Mega-Web Login liefert kein Session-Cookie");
});
it("throws if login verify check fails (no form found)", async () => {
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return new Response("", { headers, status: 200 });
}
if (urlStr.includes("page=debrideur")) {
// Missing form!
return new Response('<html><body>Nothing here</body></html>', { status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "a", password: "b" }));
await expect(fallback.unrestrict("http://mega.debrid/file"))
.rejects.toThrow("Mega-Web Login ungültig oder Session blockiert");
});
it("returns null if generation fails to find a code", async () => {
let callCount = 0;
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
callCount++;
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return new Response("", { headers, status: 200 });
}
if (urlStr.includes("page=debrideur")) {
return new Response('<form id="debridForm"></form>', { status: 200 });
}
if (urlStr.includes("form=debrid")) {
// The generate POST returns HTML without any codes
return new Response(`<div>No links here</div>`, { status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "a", password: "b" }));
const result = await fallback.unrestrict("http://mega.debrid/file");
// Generation fails -> resets cookie -> tries again -> fails again -> returns null
expect(result).toBeNull();
});
it("aborts pending Mega-Web polling when signal is cancelled", async () => {
globalThis.fetch = vi.fn((url: string | URL | Request, init?: RequestInit): Promise<Response> => {
const urlStr = String(url);
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return Promise.resolve(new Response("", { headers, status: 200 }));
}
if (urlStr.includes("page=debrideur")) {
return Promise.resolve(new Response('<form id="debridForm"></form>', { status: 200 }));
}
if (urlStr.includes("form=debrid")) {
return Promise.resolve(new Response(`
<div class="acp-box">
<h3>Link: https://mega.debrid/link2</h3>
<a href="javascript:processDebrid(1,'secretcode456',0)">Download</a>
</div>
`, { status: 200 }));
}
if (urlStr.includes("ajax=debrid")) {
return new Promise<Response>((_resolve, reject) => {
const signal = init?.signal;
const onAbort = (): void => reject(new Error("aborted:ajax"));
if (signal?.aborted) {
onAbort();
return;
}
signal?.addEventListener("abort", onAbort, { once: true });
});
}
return Promise.resolve(new Response("Not found", { status: 404 }));
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "user", password: "pwd" }));
const controller = new AbortController();
const timer = setTimeout(() => {
controller.abort("test");
}, 200);
try {
await expect(fallback.unrestrict("https://mega.debrid/link2", controller.signal)).rejects.toThrow(/aborted/i);
} finally {
clearTimeout(timer);
}
});
});
});

tests/realdebrid.test.ts (new file, 42 lines)

@@ -0,0 +1,42 @@
import { afterEach, describe, expect, it } from "vitest";
import { RealDebridClient } from "../src/main/realdebrid";
const originalFetch = globalThis.fetch;
afterEach(() => {
globalThis.fetch = originalFetch;
});
describe("realdebrid client", () => {
it("returns a clear error when HTML is returned instead of JSON", async () => {
globalThis.fetch = (async (): Promise<Response> => {
return new Response("<html><title>Cloudflare</title></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}) as typeof fetch;
const client = new RealDebridClient("rd-token");
await expect(client.unrestrictLink("https://hoster.example/file/html")).rejects.toThrow(/html/i);
});
it("does not leak raw response body on JSON parse errors", async () => {
globalThis.fetch = (async (): Promise<Response> => {
return new Response("<html>token=secret-should-not-leak</html>", {
status: 200,
headers: { "Content-Type": "application/json" }
});
}) as typeof fetch;
const client = new RealDebridClient("rd-token");
try {
await client.unrestrictLink("https://hoster.example/file/invalid-json");
throw new Error("expected unrestrict to fail");
} catch (error) {
const text = String(error || "");
expect(text.toLowerCase()).toContain("json");
expect(text.toLowerCase()).not.toContain("secret-should-not-leak");
expect(text.toLowerCase()).not.toContain("<html>");
}
});
});
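Both tests above constrain how API response bodies are turned into JSON: HTML must be detected and named as such, and parse failures must not echo the raw body (which can contain tokens). A sketch of a body parser satisfying those constraints (`parseJsonBody` is a hypothetical helper, not necessarily how RealDebridClient is implemented):

```typescript
// Hypothetical sketch: parse an API response body without leaking it in errors.
function parseJsonBody(text: string): unknown {
  // Detect HTML (e.g. a Cloudflare challenge page) before attempting JSON.parse.
  if (/^\s*</.test(text)) {
    throw new Error("Expected JSON but received HTML from the API");
  }
  try {
    return JSON.parse(text);
  } catch {
    // Deliberately omit the raw body: it may contain tokens or cookies.
    throw new Error("Invalid JSON in API response");
  }
}
```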

@@ -0,0 +1,188 @@
import { describe, expect, it } from "vitest";
import { resolveArchiveItemsFromList } from "../src/main/download-manager";
type MinimalItem = {
targetPath?: string;
fileName?: string;
[key: string]: unknown;
};
function makeItems(names: string[]): MinimalItem[] {
return names.map((name) => ({
targetPath: `C:\\Downloads\\Package\\${name}`,
fileName: name,
id: name,
status: "completed",
}));
}
describe("resolveArchiveItemsFromList", () => {
// ── Multipart RAR (.partN.rar) ──
it("matches multipart .part1.rar archives", () => {
const items = makeItems([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
"Other.rar",
]);
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(3);
expect(result.map((i: any) => i.fileName)).toEqual([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
]);
});
it("matches multipart .part01.rar archives (zero-padded)", () => {
const items = makeItems([
"Film.part01.rar",
"Film.part02.rar",
"Film.part10.rar",
"Unrelated.zip",
]);
const result = resolveArchiveItemsFromList("Film.part01.rar", items as any);
expect(result).toHaveLength(3);
});
// ── Old-style RAR (.rar + .r00, .r01, etc.) ──
it("matches old-style .rar + .rNN volumes", () => {
const items = makeItems([
"Archive.rar",
"Archive.r00",
"Archive.r01",
"Archive.r02",
"Other.zip",
]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(4);
});
// ── Single RAR ──
it("matches a single .rar file", () => {
const items = makeItems(["SingleFile.rar", "Other.mkv"]);
const result = resolveArchiveItemsFromList("SingleFile.rar", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("SingleFile.rar");
});
// ── Split ZIP ──
it("matches split .zip.NNN files", () => {
const items = makeItems([
"Data.zip",
"Data.zip.001",
"Data.zip.002",
"Data.zip.003",
]);
const result = resolveArchiveItemsFromList("Data.zip.001", items as any);
expect(result).toHaveLength(4);
});
// ── Split 7z ──
it("matches split .7z.NNN files", () => {
const items = makeItems([
"Backup.7z.001",
"Backup.7z.002",
]);
const result = resolveArchiveItemsFromList("Backup.7z.001", items as any);
expect(result).toHaveLength(2);
});
// ── Generic .NNN splits ──
it("matches generic .NNN split files", () => {
const items = makeItems([
"video.001",
"video.002",
"video.003",
]);
const result = resolveArchiveItemsFromList("video.001", items as any);
expect(result).toHaveLength(3);
});
// ── Exact filename match ──
it("matches a single .zip by exact name", () => {
const items = makeItems(["myarchive.zip", "other.rar"]);
const result = resolveArchiveItemsFromList("myarchive.zip", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("myarchive.zip");
});
// ── Case insensitivity ──
it("matches case-insensitively", () => {
const items = makeItems([
"MOVIE.PART1.RAR",
"MOVIE.PART2.RAR",
]);
const result = resolveArchiveItemsFromList("movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Stem-based fallback ──
it("uses stem-based fallback when exact patterns fail", () => {
// Simulate a debrid service whose file list shows "Movie.rar" while the
// archive on disk is named "Movie.part1.rar"
const items = makeItems([
"Movie.rar",
]);
// The archive on disk is "Movie.part1.rar" but there's no item matching the
// .partN pattern. The stem "movie" should match "Movie.rar" via fallback.
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
// stem fallback: "movie" starts with "movie" and ends with .rar
expect(result).toHaveLength(1);
});
// ── Single item fallback ──
it("returns single archive item when no pattern matches", () => {
const items = makeItems(["totally-different-name.rar"]);
const result = resolveArchiveItemsFromList("Original.rar", items as any);
// Single item in list with archive extension → return it
expect(result).toHaveLength(1);
});
// ── Empty when no match ──
it("returns empty when items have no archive extensions", () => {
const items = makeItems(["video.mkv", "subtitle.srt"]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(0);
});
// ── Items without targetPath ──
it("falls back to fileName when targetPath is missing", () => {
const items = [
{ fileName: "Movie.part1.rar", id: "1", status: "completed" },
{ fileName: "Movie.part2.rar", id: "2", status: "completed" },
];
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Multiple archives, should not cross-match ──
it("does not cross-match different archive groups", () => {
const items = makeItems([
"Episode.S01E01.part1.rar",
"Episode.S01E01.part2.rar",
"Episode.S01E02.part1.rar",
"Episode.S01E02.part2.rar",
]);
const result1 = resolveArchiveItemsFromList("Episode.S01E01.part1.rar", items as any);
expect(result1).toHaveLength(2);
expect(result1.every((i: any) => i.fileName.includes("S01E01"))).toBe(true);
const result2 = resolveArchiveItemsFromList("Episode.S01E02.part1.rar", items as any);
expect(result2).toHaveLength(2);
expect(result2.every((i: any) => i.fileName.includes("S01E02"))).toBe(true);
});
});
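The grouping tiers these tests exercise (multipart pattern → exact name → stem fallback) can be sketched as a few candidate regexes. This is an illustrative sketch only; the helper names and exact patterns are assumptions, not the project's `resolveArchiveItemsFromList` implementation, and the stem-fuzzy and single-item fallback tiers are omitted for brevity:

```typescript
// Sketch of the pattern tiers the tests above exercise; names and regexes
// are assumptions, not the real resolveArchiveItemsFromList.
function escapeRe(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}
function multipartPattern(archiveName: string): RegExp | null {
  const lower = archiveName.toLowerCase();
  let m = lower.match(/^(.*)\.part\d+\.rar$/); // Movie.part1.rar, Film.part01.rar
  if (m) return new RegExp(`^${escapeRe(m[1])}\\.part\\d+\\.rar$`);
  m = lower.match(/^(.*)\.(zip|7z)\.\d+$/); // Data.zip.001, Backup.7z.001
  if (m) return new RegExp(`^${escapeRe(m[1])}\\.${m[2]}(\\.\\d+)?$`);
  m = lower.match(/^(.*)\.\d{3}$/); // video.001 generic splits
  if (m) return new RegExp(`^${escapeRe(m[1])}\\.\\d{3}$`);
  m = lower.match(/^(.*)\.rar$/); // Archive.rar + Archive.r00, .r01, ...
  if (m) return new RegExp(`^${escapeRe(m[1])}\\.(rar|r\\d{2})$`);
  return null;
}
function groupVolumes(archiveName: string, fileNames: string[]): string[] {
  const pattern = multipartPattern(archiveName);
  if (!pattern) {
    // Next tier: exact case-insensitive filename match.
    return fileNames.filter((n) => n.toLowerCase() === archiveName.toLowerCase());
  }
  return fileNames.filter((n) => pattern.test(n.toLowerCase()));
}
```

Because the stem (e.g. `episode.s01e01`) is escaped into the volume regex, different archive groups in the same list never cross-match, which is what the last test above checks.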

tests/self-check.ts (new file, 208 lines)

@@ -0,0 +1,208 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import http from "node:http";
import { once } from "node:events";
import { DownloadManager } from "../src/main/download-manager";
import { defaultSettings } from "../src/main/constants";
import { createStoragePaths, emptySession } from "../src/main/storage";
function assert(condition: unknown, message: string): void {
if (!condition) {
throw new Error(`Self-check fehlgeschlagen: ${message}`);
}
}
async function waitFor(predicate: () => boolean, timeoutMs = 20000): Promise<void> {
const start = Date.now();
while (!predicate()) {
if (Date.now() - start > timeoutMs) {
throw new Error("Timeout während Self-check");
}
await new Promise((resolve) => setTimeout(resolve, 100));
}
}
async function runDownloadCase(baseDir: string, baseUrl: string, url: string, options?: Partial<ReturnType<typeof defaultSettings>>): Promise<DownloadManager> {
const settings = {
...defaultSettings(),
token: "demo-token",
outputDir: path.join(baseDir, "downloads"),
extractDir: path.join(baseDir, "extract"),
autoExtract: false,
autoReconnect: true,
reconnectWaitSeconds: 1,
...options
};
const manager = new DownloadManager(settings, emptySession(), createStoragePaths(path.join(baseDir, "state")));
manager.addPackages([
{
name: "test-package",
links: [url]
}
]);
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
return manager;
}
async function main(): Promise<void> {
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "rd-node-self-"));
const binary = Buffer.alloc(512 * 1024, 7);
let flakyFailures = 1;
const server = http.createServer((req, res) => {
const url = req.url || "/";
if (url.startsWith("/file.bin") || url.startsWith("/slow.bin") || url.startsWith("/rarcancel.bin") || url.startsWith("/flaky.bin")) {
if (url.startsWith("/flaky.bin") && flakyFailures > 0) {
flakyFailures -= 1;
res.statusCode = 503;
res.end("retry");
return;
}
const range = req.headers.range;
let start = 0;
if (range) {
const match = String(range).match(/bytes=(\d+)-/i);
if (match) {
start = Number(match[1]);
}
}
const chunk = binary.subarray(start);
if (start > 0) {
res.statusCode = 206;
res.setHeader("Content-Range", `bytes ${start}-${binary.length - 1}/${binary.length}`);
}
res.setHeader("Accept-Ranges", "bytes");
res.setHeader("Content-Length", chunk.length);
res.statusCode = res.statusCode || 200;
if (url.startsWith("/slow.bin") || url.startsWith("/rarcancel.bin")) {
const mid = Math.floor(chunk.length / 2);
res.write(chunk.subarray(0, mid));
setTimeout(() => {
res.end(chunk.subarray(mid));
}, 400);
return;
}
res.end(chunk);
return;
}
res.statusCode = 404;
res.end("not-found");
});
server.listen(0, "127.0.0.1");
await once(server, "listening");
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Server konnte nicht gestartet werden");
}
const baseUrl = `http://127.0.0.1:${address.port}`;
const originalFetch = globalThis.fetch;
globalThis.fetch = async (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("/unrestrict/link")) {
const body = init?.body;
const params = body instanceof URLSearchParams ? body : new URLSearchParams(String(body || ""));
const link = params.get("link") || "";
const filename = link.includes("rarcancel") ? "release.part1.rar" : "file.bin";
const direct = link.includes("slow")
? `${baseUrl}/slow.bin`
: link.includes("rarcancel")
? `${baseUrl}/rarcancel.bin`
: link.includes("flaky")
? `${baseUrl}/flaky.bin`
: `${baseUrl}/file.bin`;
return new Response(
JSON.stringify({
download: direct,
filename,
filesize: binary.length
}),
{
status: 200,
headers: { "Content-Type": "application/json" }
}
);
}
return originalFetch(input, init);
};
try {
const manager1 = await runDownloadCase(tempRoot, baseUrl, "https://dummy/file");
const snapshot1 = manager1.getSnapshot();
const item1 = Object.values(snapshot1.session.items)[0];
assert(item1?.status === "completed", "normaler Download wurde nicht abgeschlossen");
assert(fs.existsSync(item1.targetPath), "Datei fehlt nach Download");
const manager2 = new DownloadManager(
{
...defaultSettings(),
token: "demo-token",
outputDir: path.join(tempRoot, "downloads-pause"),
extractDir: path.join(tempRoot, "extract-pause"),
autoExtract: false,
autoReconnect: false
},
emptySession(),
createStoragePaths(path.join(tempRoot, "state-pause"))
);
manager2.addPackages([{ name: "pause", links: ["https://dummy/slow"] }]);
await manager2.start();
await new Promise((resolve) => setTimeout(resolve, 120));
const paused = manager2.togglePause();
assert(paused, "Pause konnte nicht aktiviert werden");
await new Promise((resolve) => setTimeout(resolve, 150));
manager2.togglePause();
await waitFor(() => !manager2.getSnapshot().session.running, 30000);
const item2 = Object.values(manager2.getSnapshot().session.items)[0];
assert(item2?.status === "completed", "Pause/Resume Download nicht abgeschlossen");
const manager3 = await runDownloadCase(tempRoot, baseUrl, "https://dummy/flaky", { autoReconnect: true, reconnectWaitSeconds: 1 });
const item3 = Object.values(manager3.getSnapshot().session.items)[0];
assert(item3?.status === "completed", "Reconnect-Fall nicht abgeschlossen");
const manager4 = new DownloadManager(
{
...defaultSettings(),
token: "demo-token",
outputDir: path.join(tempRoot, "downloads-cancel"),
extractDir: path.join(tempRoot, "extract-cancel"),
autoExtract: false
},
emptySession(),
createStoragePaths(path.join(tempRoot, "state-cancel"))
);
manager4.addPackages([{ name: "cancel", links: ["https://dummy/rarcancel"] }]);
manager4.start();
await new Promise((resolve) => setTimeout(resolve, 150));
const pkgId = manager4.getSnapshot().session.packageOrder[0];
manager4.cancelPackage(pkgId);
await waitFor(() => !manager4.getSnapshot().session.running || Object.values(manager4.getSnapshot().session.items).every((item) => item.status !== "downloading"), 15000);
const cancelSnapshot = manager4.getSnapshot();
const remainingItems = Object.values(cancelSnapshot.session.items);
if (remainingItems.length === 0) {
assert(cancelSnapshot.session.packageOrder.length === 0, "Abgebrochenes Paket wurde nicht entfernt");
} else {
const cancelItem = remainingItems[0];
assert(cancelItem?.status === "cancelled" || cancelItem?.status === "queued", "Paketabbruch nicht wirksam");
}
const packageDir = path.join(path.join(tempRoot, "downloads-cancel"), "cancel");
const cancelArtifact = path.join(packageDir, "release.part1.rar");
await waitFor(() => !fs.existsSync(cancelArtifact), 10000);
assert(!fs.existsSync(cancelArtifact), "RAR-Artefakt wurde nicht gelöscht");
console.log("Node self-check erfolgreich");
} finally {
globalThis.fetch = originalFetch;
server.close();
fs.rmSync(tempRoot, { recursive: true, force: true });
}
}
void main();
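The mock server above answers resumed downloads with 206 Partial Content and a `Content-Range` header. The arithmetic it relies on can be isolated as two small helpers; `parseRangeStart` and `contentRange` are illustrative names, not exports of the project:

```typescript
// Sketch of the Range-request arithmetic the mock server implements:
// "bytes=N-" means the client resumes at byte N, and the server answers
// 206 with "Content-Range: bytes N-(total-1)/total".
function parseRangeStart(rangeHeader: string | undefined): number {
  if (!rangeHeader) return 0;
  const match = rangeHeader.match(/bytes=(\d+)-/i);
  return match ? Number(match[1]) : 0;
}
function contentRange(start: number, total: number): string {
  return `bytes ${start}-${total - 1}/${total}`;
}
```

For the 512 KiB test payload, a resume at byte 262144 would produce `Content-Range: bytes 262144-524287/524288`, matching what the server code above emits.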

tests/session-log.test.ts (new file, 163 lines)

@@ -0,0 +1,163 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "../src/main/session-log";
import { setLogListener } from "../src/main/logger";
const tempDirs: string[] = [];
afterEach(() => {
// Ensure session log is shut down between tests
shutdownSessionLog();
// Ensure listener is cleared between tests
setLogListener(null);
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("session-log", () => {
it("initSessionLog creates directory and file", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath();
expect(logPath).not.toBeNull();
expect(fs.existsSync(logPath!)).toBe(true);
expect(fs.existsSync(path.join(baseDir, "session-logs"))).toBe(true);
expect(path.basename(logPath!)).toMatch(/^session_\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}\.txt$/);
const content = fs.readFileSync(logPath!, "utf8");
expect(content).toContain("=== Session gestartet:");
shutdownSessionLog();
});
it("logger listener writes to session log", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
// Simulate a log line via the listener
const { logger } = await import("../src/main/logger");
logger.info("Test-Nachricht für Session-Log");
// Wait for flush (200ms interval + margin)
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("Test-Nachricht für Session-Log");
shutdownSessionLog();
});
it("shutdownSessionLog writes closing line", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("=== Session beendet:");
});
it("shutdownSessionLog removes listener", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
// Log after shutdown - should NOT appear in session log
const { logger } = await import("../src/main/logger");
logger.info("Nach-Shutdown-Nachricht");
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).not.toContain("Nach-Shutdown-Nachricht");
});
it("cleanupOldSessionLogs deletes old files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a fake old session log
const oldFile = path.join(logsDir, "session_2020-01-01_00-00-00.txt");
fs.writeFileSync(oldFile, "old session");
// Set mtime to 30 days ago
const oldTime = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
fs.utimesSync(oldFile, oldTime, oldTime);
// Create a recent file
const newFile = path.join(logsDir, "session_2099-01-01_00-00-00.txt");
fs.writeFileSync(newFile, "new session");
// initSessionLog triggers cleanup
initSessionLog(baseDir);
// Wait for async cleanup
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(oldFile)).toBe(false);
expect(fs.existsSync(newFile)).toBe(true);
shutdownSessionLog();
});
it("cleanupOldSessionLogs keeps recent files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a file from 2 days ago (should be kept)
const recentFile = path.join(logsDir, "session_2025-12-01_00-00-00.txt");
fs.writeFileSync(recentFile, "recent session");
const recentTime = new Date(Date.now() - 2 * 24 * 60 * 60 * 1000);
fs.utimesSync(recentFile, recentTime, recentTime);
initSessionLog(baseDir);
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(recentFile)).toBe(true);
shutdownSessionLog();
});
it("multiple sessions create different files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const path1 = getSessionLogPath();
shutdownSessionLog();
// Small delay to ensure different timestamp
await new Promise((resolve) => setTimeout(resolve, 1100));
initSessionLog(baseDir);
const path2 = getSessionLogPath();
shutdownSessionLog();
expect(path1).not.toBeNull();
expect(path2).not.toBeNull();
expect(path1).not.toBe(path2);
expect(fs.existsSync(path1!)).toBe(true);
expect(fs.existsSync(path2!)).toBe(true);
});
});
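The filename format these tests assert (`session_YYYY-MM-DD_HH-MM-SS.txt`) can be sketched as follows; `buildSessionLogName` is an illustrative helper, not the project's actual function:

```typescript
// Sketch of a timestamped log name matching the regex the tests assert:
// session_YYYY-MM-DD_HH-MM-SS.txt
function buildSessionLogName(now: Date): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const date = `${now.getFullYear()}-${pad(now.getMonth() + 1)}-${pad(now.getDate())}`;
  const time = `${pad(now.getHours())}-${pad(now.getMinutes())}-${pad(now.getSeconds())}`;
  return `session_${date}_${time}.txt`;
}
```

Second-level resolution is why the "multiple sessions" test sleeps 1100 ms before re-initializing: two sessions started within the same second would otherwise collide on the same filename.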

tests/storage.test.ts (new file, 513 lines)

@@ -0,0 +1,513 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { AppSettings } from "../src/shared/types";
import { defaultSettings } from "../src/main/constants";
import { createStoragePaths, emptySession, loadSession, loadSettings, normalizeSettings, saveSession, saveSessionAsync, saveSettings } from "../src/main/storage";
const tempDirs: string[] = [];
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("settings storage", () => {
it("does not persist provider credentials when rememberToken is disabled", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
saveSettings(paths, {
...defaultSettings(),
rememberToken: false,
token: "rd-token",
megaLogin: "mega-user",
megaPassword: "mega-pass",
bestToken: "best-token",
allDebridToken: "all-token"
});
const raw = JSON.parse(fs.readFileSync(paths.configFile, "utf8")) as Record<string, unknown>;
expect(raw.token).toBe("");
expect(raw.megaLogin).toBe("");
expect(raw.megaPassword).toBe("");
expect(raw.bestToken).toBe("");
expect(raw.allDebridToken).toBe("");
const loaded = loadSettings(paths);
expect(loaded.rememberToken).toBe(false);
expect(loaded.token).toBe("");
expect(loaded.megaLogin).toBe("");
expect(loaded.megaPassword).toBe("");
expect(loaded.bestToken).toBe("");
expect(loaded.allDebridToken).toBe("");
});
it("persists provider credentials when rememberToken is enabled", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
saveSettings(paths, {
...defaultSettings(),
rememberToken: true,
token: "rd-token",
megaLogin: "mega-user",
megaPassword: "mega-pass",
bestToken: "best-token",
allDebridToken: "all-token"
});
const loaded = loadSettings(paths);
expect(loaded.token).toBe("rd-token");
expect(loaded.megaLogin).toBe("mega-user");
expect(loaded.megaPassword).toBe("mega-pass");
expect(loaded.bestToken).toBe("best-token");
expect(loaded.allDebridToken).toBe("all-token");
});
it("normalizes invalid enum and numeric values", () => {
const normalized = normalizeSettings({
...defaultSettings(),
providerPrimary: "invalid-provider" as unknown as AppSettings["providerPrimary"],
providerSecondary: "invalid-provider" as unknown as AppSettings["providerSecondary"],
providerTertiary: "invalid-provider" as unknown as AppSettings["providerTertiary"],
cleanupMode: "broken" as unknown as AppSettings["cleanupMode"],
extractConflictMode: "broken" as unknown as AppSettings["extractConflictMode"],
completedCleanupPolicy: "broken" as unknown as AppSettings["completedCleanupPolicy"],
speedLimitMode: "broken" as unknown as AppSettings["speedLimitMode"],
maxParallel: 0,
retryLimit: 999,
reconnectWaitSeconds: 9999,
speedLimitKbps: -1,
outputDir: " ",
extractDir: " ",
mkvLibraryDir: " ",
updateRepo: " "
});
expect(normalized.providerPrimary).toBe("realdebrid");
expect(normalized.providerSecondary).toBe("none");
expect(normalized.providerTertiary).toBe("none");
expect(normalized.cleanupMode).toBe("none");
expect(normalized.extractConflictMode).toBe("overwrite");
expect(normalized.completedCleanupPolicy).toBe("never");
expect(normalized.speedLimitMode).toBe("global");
expect(normalized.maxParallel).toBe(1);
expect(normalized.retryLimit).toBe(99);
expect(normalized.reconnectWaitSeconds).toBe(600);
expect(normalized.speedLimitKbps).toBe(0);
expect(normalized.outputDir).toBe(defaultSettings().outputDir);
expect(normalized.extractDir).toBe(defaultSettings().extractDir);
expect(normalized.mkvLibraryDir).toBe(defaultSettings().mkvLibraryDir);
expect(normalized.updateRepo).toBe(defaultSettings().updateRepo);
});
it("normalizes malformed persisted config on load", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
fs.writeFileSync(
paths.configFile,
JSON.stringify({
providerPrimary: "not-valid",
completedCleanupPolicy: "not-valid",
maxParallel: "999",
retryLimit: "-3",
reconnectWaitSeconds: "1",
speedLimitMode: "not-valid",
updateRepo: ""
}),
"utf8"
);
const loaded = loadSettings(paths);
expect(loaded.providerPrimary).toBe("realdebrid");
expect(loaded.completedCleanupPolicy).toBe("never");
expect(loaded.maxParallel).toBe(50);
expect(loaded.retryLimit).toBe(0);
expect(loaded.reconnectWaitSeconds).toBe(10);
expect(loaded.speedLimitMode).toBe("global");
expect(loaded.updateRepo).toBe(defaultSettings().updateRepo);
});
it("keeps explicit none as fallback provider choice", () => {
const normalized = normalizeSettings({
...defaultSettings(),
providerSecondary: "none",
providerTertiary: "none"
});
expect(normalized.providerSecondary).toBe("none");
expect(normalized.providerTertiary).toBe("none");
});
it("normalizes archive password list line endings", () => {
const normalized = normalizeSettings({
...defaultSettings(),
archivePasswordList: "one\r\ntwo\r\nthree"
});
expect(normalized.archivePasswordList).toBe("one\ntwo\nthree");
});
it("assigns and preserves bandwidth schedule ids", () => {
const normalized = normalizeSettings({
...defaultSettings(),
bandwidthSchedules: [{ id: "", startHour: 1, endHour: 6, speedLimitKbps: 1024, enabled: true }]
});
const generatedId = normalized.bandwidthSchedules[0]?.id;
expect(typeof generatedId).toBe("string");
expect(generatedId?.length).toBeGreaterThan(0);
const normalizedAgain = normalizeSettings({
...defaultSettings(),
bandwidthSchedules: normalized.bandwidthSchedules
});
expect(normalizedAgain.bandwidthSchedules[0]?.id).toBe(generatedId);
});
it("resets stale active statuses to queued on session load", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const session = emptySession();
session.packages["pkg1"] = {
id: "pkg1",
name: "Test Package",
outputDir: "/tmp/out",
extractDir: "/tmp/extract",
status: "downloading",
itemIds: ["item1", "item2", "item3", "item4"],
cancelled: false,
enabled: true,
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item1"] = {
id: "item1",
packageId: "pkg1",
url: "https://example.com/file1.rar",
provider: null,
status: "downloading",
retries: 0,
speedBps: 1024,
downloadedBytes: 5000,
totalBytes: 10000,
progressPercent: 50,
fileName: "file1.rar",
targetPath: "/tmp/out/file1.rar",
resumable: true,
attempts: 1,
lastError: "some error",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item2"] = {
id: "item2",
packageId: "pkg1",
url: "https://example.com/file2.rar",
provider: null,
status: "paused",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "file2.rar",
targetPath: "/tmp/out/file2.rar",
resumable: false,
attempts: 0,
lastError: "",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item3"] = {
id: "item3",
packageId: "pkg1",
url: "https://example.com/file3.rar",
provider: null,
status: "completed",
retries: 0,
speedBps: 0,
downloadedBytes: 10000,
totalBytes: 10000,
progressPercent: 100,
fileName: "file3.rar",
targetPath: "/tmp/out/file3.rar",
resumable: false,
attempts: 1,
lastError: "",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item4"] = {
id: "item4",
packageId: "pkg1",
url: "https://example.com/file4.rar",
provider: null,
status: "queued",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "file4.rar",
targetPath: "/tmp/out/file4.rar",
resumable: false,
attempts: 0,
lastError: "",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
saveSession(paths, session);
const loaded = loadSession(paths);
// Active statuses (downloading, paused) should be reset to "queued"
expect(loaded.items["item1"].status).toBe("queued");
expect(loaded.items["item2"].status).toBe("queued");
// Speed should be cleared
expect(loaded.items["item1"].speedBps).toBe(0);
// lastError should be cleared for reset items
expect(loaded.items["item1"].lastError).toBe("");
// Completed and queued statuses should be preserved
expect(loaded.items["item3"].status).toBe("completed");
expect(loaded.items["item4"].status).toBe("queued");
// Downloaded bytes should be preserved
expect(loaded.items["item1"].downloadedBytes).toBe(5000);
// Package data should be preserved
expect(loaded.packages["pkg1"].name).toBe("Test Package");
});
it("returns empty session when session file contains invalid JSON", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
fs.writeFileSync(paths.sessionFile, "{{{corrupted json!!!", "utf8");
const loaded = loadSession(paths);
const empty = emptySession();
expect(loaded.packages).toEqual(empty.packages);
expect(loaded.items).toEqual(empty.items);
expect(loaded.packageOrder).toEqual(empty.packageOrder);
});
it("loads backup session when primary session is corrupted", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const backupSession = emptySession();
backupSession.packageOrder = ["pkg-backup"];
backupSession.packages["pkg-backup"] = {
id: "pkg-backup",
name: "Backup Package",
outputDir: path.join(dir, "out"),
extractDir: path.join(dir, "extract"),
status: "queued",
itemIds: ["item-backup"],
cancelled: false,
enabled: true,
createdAt: Date.now(),
updatedAt: Date.now()
};
backupSession.items["item-backup"] = {
id: "item-backup",
packageId: "pkg-backup",
url: "https://example.com/backup-file",
provider: null,
status: "queued",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "backup-file.rar",
targetPath: path.join(dir, "out", "backup-file.rar"),
resumable: true,
attempts: 0,
lastError: "",
fullStatus: "Wartet",
createdAt: Date.now(),
updatedAt: Date.now()
};
fs.writeFileSync(`${paths.sessionFile}.bak`, JSON.stringify(backupSession), "utf8");
fs.writeFileSync(paths.sessionFile, "{broken-session-json", "utf8");
const loaded = loadSession(paths);
expect(loaded.packageOrder).toEqual(["pkg-backup"]);
expect(loaded.packages["pkg-backup"]?.name).toBe("Backup Package");
expect(loaded.items["item-backup"]?.fileName).toBe("backup-file.rar");
const restoredPrimary = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { packages?: Record<string, unknown> };
expect(restoredPrimary.packages && "pkg-backup" in restoredPrimary.packages).toBe(true);
});
it("returns defaults when config file contains invalid JSON", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
// Write invalid JSON to the config file
fs.writeFileSync(paths.configFile, "{{{{not valid json!!!}", "utf8");
const loaded = loadSettings(paths);
const defaults = defaultSettings();
expect(loaded.providerPrimary).toBe(defaults.providerPrimary);
expect(loaded.maxParallel).toBe(defaults.maxParallel);
expect(loaded.retryLimit).toBe(defaults.retryLimit);
expect(loaded.outputDir).toBe(defaults.outputDir);
expect(loaded.cleanupMode).toBe(defaults.cleanupMode);
});
it("loads backup config when primary config is corrupted", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const backupSettings = {
...defaultSettings(),
outputDir: path.join(dir, "backup-output"),
packageName: "from-backup"
};
fs.writeFileSync(`${paths.configFile}.bak`, JSON.stringify(backupSettings, null, 2), "utf8");
fs.writeFileSync(paths.configFile, "{broken-json", "utf8");
const loaded = loadSettings(paths);
expect(loaded.outputDir).toBe(backupSettings.outputDir);
expect(loaded.packageName).toBe("from-backup");
});
it("sanitizes malformed persisted session structures", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
fs.writeFileSync(paths.sessionFile, JSON.stringify({
version: "invalid",
packageOrder: [123, "pkg-valid"],
packages: {
"1": "bad-entry",
"pkg-valid": {
id: "pkg-valid",
name: "Valid Package",
outputDir: "C:/tmp/out",
extractDir: "C:/tmp/extract",
status: "downloading",
itemIds: ["item-valid", 123],
cancelled: false,
enabled: true
}
},
items: {
"item-valid": {
id: "item-valid",
packageId: "pkg-valid",
url: "https://example.com/file",
status: "queued",
fileName: "file.bin",
targetPath: "C:/tmp/out/file.bin"
},
"item-bad": "broken"
}
}), "utf8");
const loaded = loadSession(paths);
expect(Object.keys(loaded.packages)).toEqual(["pkg-valid"]);
expect(Object.keys(loaded.items)).toEqual(["item-valid"]);
expect(loaded.packageOrder).toEqual(["pkg-valid"]);
});
it("captures async session save payload before later mutations", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const session = emptySession();
session.summaryText = "before-mutation";
const pending = saveSessionAsync(paths, session);
session.summaryText = "after-mutation";
await pending;
const persisted = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { summaryText: string };
expect(persisted.summaryText).toBe("before-mutation");
});
it("creates session backup before sync and async session overwrites", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const first = emptySession();
first.summaryText = "first";
saveSession(paths, first);
const second = emptySession();
second.summaryText = "second";
saveSession(paths, second);
const backupAfterSync = JSON.parse(fs.readFileSync(`${paths.sessionFile}.bak`, "utf8")) as { summaryText?: string };
expect(backupAfterSync.summaryText).toBe("first");
const third = emptySession();
third.summaryText = "third";
await saveSessionAsync(paths, third);
const backupAfterAsync = JSON.parse(fs.readFileSync(`${paths.sessionFile}.bak`, "utf8")) as { summaryText?: string };
const primaryAfterAsync = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { summaryText?: string };
expect(backupAfterAsync.summaryText).toBe("second");
expect(primaryAfterAsync.summaryText).toBe("third");
});
it("applies defaults for missing fields when loading old config", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
// Write a minimal config that simulates an old version missing newer fields
fs.writeFileSync(
paths.configFile,
JSON.stringify({
token: "my-token",
rememberToken: true,
outputDir: "/custom/output"
}),
"utf8"
);
const loaded = loadSettings(paths);
const defaults = defaultSettings();
// Old fields should be preserved
expect(loaded.token).toBe("my-token");
expect(loaded.outputDir).toBe(path.resolve("/custom/output"));
// Missing new fields should get default values
expect(loaded.autoProviderFallback).toBe(defaults.autoProviderFallback);
expect(loaded.hybridExtract).toBe(defaults.hybridExtract);
expect(loaded.completedCleanupPolicy).toBe(defaults.completedCleanupPolicy);
expect(loaded.speedLimitMode).toBe(defaults.speedLimitMode);
expect(loaded.clipboardWatch).toBe(defaults.clipboardWatch);
expect(loaded.minimizeToTray).toBe(defaults.minimizeToTray);
expect(loaded.retryLimit).toBe(defaults.retryLimit);
expect(loaded.collectMkvToLibrary).toBe(defaults.collectMkvToLibrary);
expect(loaded.mkvLibraryDir).toBe(defaults.mkvLibraryDir);
expect(loaded.theme).toBe(defaults.theme);
expect(loaded.bandwidthSchedules).toEqual(defaults.bandwidthSchedules);
expect(loaded.updateRepo).toBe(defaults.updateRepo);
});
});
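The numeric normalization exercised above amounts to clamping into fixed ranges. A minimal sketch, with bounds read off the assertions in these tests (maxParallel 1–50, retryLimit 0–99, reconnectWaitSeconds 10–600) rather than taken from the implementation:

```typescript
// Sketch of the clamping the normalization tests imply; the bounds are
// inferred from the assertions above, not from the real normalizeSettings.
function clampInt(value: unknown, min: number, max: number, fallback: number): number {
  const n = typeof value === "number" ? value : Number(value);
  if (!Number.isFinite(n)) return fallback; // non-numeric input -> default
  return Math.min(max, Math.max(min, Math.trunc(n)));
}
```

This reproduces the cases above: `clampInt(0, 1, 50, 3)` → 1, `clampInt("999", 1, 50, 3)` → 50, `clampInt("-3", 0, 99, 3)` → 0, `clampInt(9999, 10, 600, 60)` → 600, and `clampInt("1", 10, 600, 60)` → 10, matching the malformed-config test.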

tests/update.test.ts (new file, 567 lines)

@@ -0,0 +1,567 @@
import fs from "node:fs";
import crypto from "node:crypto";
import { afterEach, describe, expect, it, vi } from "vitest";
import { checkGitHubUpdate, installLatestUpdate, isRemoteNewer, normalizeUpdateRepo, parseVersionParts } from "../src/main/update";
import { APP_VERSION } from "../src/main/constants";
import { UpdateCheckResult, UpdateInstallProgress } from "../src/shared/types";
const originalFetch = globalThis.fetch;
function sha256Hex(buffer: Buffer): string {
return crypto.createHash("sha256").update(buffer).digest("hex");
}
function sha512Hex(buffer: Buffer): string {
return crypto.createHash("sha512").update(buffer).digest("hex");
}
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
});
describe("update", () => {
it("normalizes update repo input", () => {
expect(normalizeUpdateRepo("")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo/releases/tag/v1.2.3")).toBe("owner/repo");
expect(normalizeUpdateRepo("codeberg.org/owner/repo.git")).toBe("owner/repo");
expect(normalizeUpdateRepo("git@codeberg.org:owner/repo.git")).toBe("owner/repo");
});
it("uses normalized repo slug for API requests", async () => {
let requestedUrl = "";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
requestedUrl = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
return new Response(
JSON.stringify({
tag_name: `v${APP_VERSION}`,
html_url: "https://git.24-music.de/owner/repo/releases/tag/v1.0.0",
assets: []
}),
{
status: 200,
headers: { "Content-Type": "application/json" }
}
);
}) as typeof fetch;
const result = await checkGitHubUpdate("https://git.24-music.de/owner/repo/releases");
expect(requestedUrl).toBe("https://git.24-music.de/api/v1/repos/owner/repo/releases/latest");
expect(result.currentVersion).toBe(APP_VERSION);
expect(result.latestVersion).toBe(APP_VERSION);
expect(result.updateAvailable).toBe(false);
});
it("picks setup executable asset from release list", async () => {
globalThis.fetch = (async (): Promise<Response> => new Response(
JSON.stringify({
tag_name: "v9.9.9",
html_url: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
assets: [
{
name: "Real-Debrid-Downloader 9.9.9.exe",
browser_download_url: "https://example.invalid/portable.exe"
},
{
name: "Real-Debrid-Downloader Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/setup.exe",
digest: "sha256:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
}
]
}),
{
status: 200,
headers: { "Content-Type": "application/json" }
}
)) as typeof fetch;
const result = await checkGitHubUpdate("owner/repo");
expect(result.updateAvailable).toBe(true);
expect(result.setupAssetUrl).toBe("https://example.invalid/setup.exe");
expect(result.setupAssetName).toBe("Real-Debrid-Downloader Setup 9.9.9.exe");
});
it("falls back to alternate download URL when setup asset URL returns 404", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const executableDigest = sha256Hex(executablePayload);
const requestedUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
requestedUrls.push(url);
if (url.includes("stale-setup.exe")) {
return new Response("missing", { status: 404 });
}
if (url.includes("/releases/download/v9.9.9/")) {
return new Response(executablePayload, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/stale-setup.exe",
setupAssetName: "Real-Debrid-Downloader Setup 9.9.9.exe",
setupAssetDigest: `sha256:${executableDigest}`
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(true);
expect(requestedUrls.some((url) => url.includes("/releases/download/v9.9.9/"))).toBe(true);
expect(requestedUrls.filter((url) => url.includes("stale-setup.exe"))).toHaveLength(1);
});
it("skips draft tag payload and resolves setup asset from stable latest release", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const requestedUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
requestedUrls.push(url);
if (url.endsWith("/releases/tags/v9.9.9")) {
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: true,
prerelease: false,
assets: [
{
name: "Draft Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/draft-setup.exe"
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.endsWith("/releases/latest")) {
const stableDigest = sha256Hex(executablePayload);
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: false,
prerelease: false,
assets: [
{
name: "Stable Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/stable-setup.exe",
digest: `sha256:${stableDigest}`
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("stable-setup.exe")) {
return new Response(executablePayload, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "",
setupAssetName: ""
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(true);
expect(requestedUrls.some((url) => url.endsWith("/releases/tags/v9.9.9"))).toBe(true);
expect(requestedUrls.some((url) => url.endsWith("/releases/latest"))).toBe(true);
expect(requestedUrls.some((url) => url.includes("stable-setup.exe"))).toBe(true);
expect(requestedUrls.some((url) => url.includes("draft-setup.exe"))).toBe(false);
});
it("times out hanging release JSON body reads", async () => {
vi.useFakeTimers();
try {
const cancelSpy = vi.fn(async () => undefined);
globalThis.fetch = (async (): Promise<Response> => ({
ok: true,
status: 200,
headers: new Headers({ "Content-Type": "application/json" }),
json: () => new Promise(() => undefined),
body: {
cancel: cancelSpy
}
} as unknown as Response)) as typeof fetch;
const pending = checkGitHubUpdate("owner/repo");
await vi.advanceTimersByTimeAsync(13000);
const result = await pending;
expect(result.updateAvailable).toBe(false);
expect(String(result.error || "")).toMatch(/timeout/i);
expect(cancelSpy).toHaveBeenCalledTimes(1);
} finally {
vi.useRealTimers();
}
});
it("aborts hanging update body downloads on idle timeout", async () => {
const previousTimeout = process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS;
process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS = "1000";
try {
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("hang-setup.exe")) {
const body = new ReadableStream<Uint8Array>({
start(controller) {
controller.enqueue(new Uint8Array([1, 2, 3]));
}
});
return new Response(body, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/hang-setup.exe",
setupAssetName: "",
setupAssetDigest: "sha256:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(false);
expect(result.message).toMatch(/timeout/i);
} finally {
if (previousTimeout === undefined) {
delete process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS;
} else {
process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS = previousTimeout;
}
}
}, 20000);
it("blocks installer start when SHA256 digest mismatches", async () => {
const executablePayload = fs.readFileSync(process.execPath);
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("mismatch-setup.exe")) {
return new Response(executablePayload, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/mismatch-setup.exe",
setupAssetName: "setup.exe",
setupAssetDigest: "sha256:1111111111111111111111111111111111111111111111111111111111111111"
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(false);
expect(result.message).toMatch(/integrit|sha256|mismatch/i);
});
it("uses latest.yml SHA512 digest when API asset digest is missing", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const digestSha512Hex = sha512Hex(executablePayload);
const digestSha512Base64 = Buffer.from(digestSha512Hex, "hex").toString("base64");
const requestedUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
requestedUrls.push(url);
if (url.endsWith("/releases/tags/v9.9.9")) {
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: false,
prerelease: false,
assets: [
{
name: "Real-Debrid-Downloader Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/setup-no-digest.exe"
},
{
name: "latest.yml",
browser_download_url: "https://example.invalid/latest.yml"
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("latest.yml")) {
return new Response(
`version: 9.9.9\npath: Real-Debrid-Downloader-Setup-9.9.9.exe\nsha512: ${digestSha512Base64}\n`,
{
status: 200,
headers: { "Content-Type": "text/yaml" }
}
);
}
if (url.includes("setup-no-digest.exe")) {
return new Response(executablePayload, {
status: 200,
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": String(executablePayload.length)
}
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/setup-no-digest.exe",
setupAssetName: "Real-Debrid-Downloader Setup 9.9.9.exe",
setupAssetDigest: ""
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(true);
expect(requestedUrls.some((url) => url.endsWith("/releases/tags/v9.9.9"))).toBe(true);
expect(requestedUrls.some((url) => url.includes("latest.yml"))).toBe(true);
});
it("rejects installer when latest.yml SHA512 digest does not match", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const wrongDigestBase64 = Buffer.alloc(64, 0x13).toString("base64");
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.endsWith("/releases/tags/v9.9.9")) {
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: false,
prerelease: false,
assets: [
{
name: "Real-Debrid-Downloader Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/setup-no-digest.exe"
},
{
name: "latest.yml",
browser_download_url: "https://example.invalid/latest.yml"
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("latest.yml")) {
return new Response(
`version: 9.9.9\npath: Real-Debrid-Downloader Setup 9.9.9.exe\nsha512: ${wrongDigestBase64}\n`,
{
status: 200,
headers: { "Content-Type": "text/yaml" }
}
);
}
if (url.includes("setup-no-digest.exe")) {
return new Response(executablePayload, {
status: 200,
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": String(executablePayload.length)
}
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/setup-no-digest.exe",
setupAssetName: "Real-Debrid-Downloader Setup 9.9.9.exe",
setupAssetDigest: ""
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(false);
expect(result.message).toMatch(/sha512|integrit|mismatch/i);
});
it("emits install progress events while downloading and launching update", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const digest = sha256Hex(executablePayload);
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("progress-setup.exe")) {
return new Response(executablePayload, {
status: 200,
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": String(executablePayload.length)
}
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/progress-setup.exe",
setupAssetName: "setup.exe",
setupAssetDigest: `sha256:${digest}`
};
const progressEvents: UpdateInstallProgress[] = [];
const result = await installLatestUpdate("owner/repo", prechecked, (progress) => {
progressEvents.push(progress);
});
expect(result.started).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "starting")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "downloading")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "verifying")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "launching")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "done")).toBe(true);
});
});
describe("normalizeUpdateRepo extended", () => {
it("handles trailing slashes and extra path segments", () => {
expect(normalizeUpdateRepo("owner/repo/")).toBe("owner/repo");
expect(normalizeUpdateRepo("/owner/repo/")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo/tree/main/src")).toBe("owner/repo");
});
it("handles ssh-style git URLs", () => {
expect(normalizeUpdateRepo("git@codeberg.org:user/project.git")).toBe("user/project");
});
it("returns default for malformed inputs", () => {
expect(normalizeUpdateRepo("just-one-part")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo(" ")).toBe("Administrator/real-debrid-downloader");
});
it("rejects traversal-like owner or repo segments", () => {
expect(normalizeUpdateRepo("../owner/repo")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/../repo")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Administrator/real-debrid-downloader");
});
it("handles www prefix", () => {
expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("www.codeberg.org/owner/repo")).toBe("owner/repo");
});
});
describe("isRemoteNewer", () => {
it("detects newer major version", () => {
expect(isRemoteNewer("1.0.0", "2.0.0")).toBe(true);
});
it("detects newer minor version", () => {
expect(isRemoteNewer("1.2.0", "1.3.0")).toBe(true);
});
it("detects newer patch version", () => {
expect(isRemoteNewer("1.2.3", "1.2.4")).toBe(true);
});
it("returns false for same version", () => {
expect(isRemoteNewer("1.2.3", "1.2.3")).toBe(false);
});
it("returns false for older version", () => {
expect(isRemoteNewer("2.0.0", "1.0.0")).toBe(false);
expect(isRemoteNewer("1.3.0", "1.2.0")).toBe(false);
expect(isRemoteNewer("1.2.4", "1.2.3")).toBe(false);
});
it("handles versions with different segment counts", () => {
expect(isRemoteNewer("1.2", "1.2.1")).toBe(true);
expect(isRemoteNewer("1.2.1", "1.2")).toBe(false);
expect(isRemoteNewer("1", "1.0.1")).toBe(true);
});
it("handles v-prefix in version strings", () => {
expect(isRemoteNewer("v1.0.0", "v2.0.0")).toBe(true);
expect(isRemoteNewer("v1.0.0", "v1.0.0")).toBe(false);
});
});
describe("parseVersionParts", () => {
it("parses standard version strings", () => {
expect(parseVersionParts("1.2.3")).toEqual([1, 2, 3]);
expect(parseVersionParts("10.20.30")).toEqual([10, 20, 30]);
});
it("strips v prefix", () => {
expect(parseVersionParts("v1.2.3")).toEqual([1, 2, 3]);
expect(parseVersionParts("V1.2.3")).toEqual([1, 2, 3]);
});
it("handles single segment", () => {
expect(parseVersionParts("5")).toEqual([5]);
});
it("handles version with pre-release suffix", () => {
// Non-numeric suffixes are stripped per part
expect(parseVersionParts("1.2.3-beta")).toEqual([1, 2, 3]);
expect(parseVersionParts("1.2.3rc1")).toEqual([1, 2, 3]);
});
it("handles empty and whitespace", () => {
expect(parseVersionParts("")).toEqual([0]);
expect(parseVersionParts(" ")).toEqual([0]);
});
it("handles versions with extra dots", () => {
expect(parseVersionParts("1.2.3.4")).toEqual([1, 2, 3, 4]);
});
});
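The version-comparison behavior these tests pin down (a stripped `v`/`V` prefix, non-numeric suffixes dropped per segment, missing segments treated as zero) can be summarized in a minimal sketch. This is a hypothetical reconstruction inferred from the assertions above, not the actual `src/main/update.ts` implementation, which may differ in detail:

```typescript
// Hypothetical sketch of the version helpers exercised by the tests above.
// The real implementations live in src/main/update.ts and may differ.
function parseVersionParts(version: string): number[] {
  const trimmed = version.trim().replace(/^[vV]/, "");
  if (!trimmed) return [0];
  // parseInt("3-beta") === 3, so pre-release suffixes fall away per part
  return trimmed
    .split(".")
    .map((part) => parseInt(part, 10))
    .map((n) => (Number.isFinite(n) ? n : 0));
}

function isRemoteNewer(local: string, remote: string): boolean {
  const a = parseVersionParts(local);
  const b = parseVersionParts(remote);
  const len = Math.max(a.length, b.length);
  for (let i = 0; i < len; i++) {
    const x = a[i] ?? 0; // missing segments count as 0, so "1.2" < "1.2.1"
    const y = b[i] ?? 0;
    if (y > x) return true;
    if (y < x) return false;
  }
  return false;
}
```

Segment-by-segment comparison with zero-padding is what makes the `"1.2"` vs `"1.2.1"` cases in the tests come out asymmetric.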

tests/utils.test.ts Normal file

@@ -0,0 +1,118 @@
import { describe, expect, it } from "vitest";
import { extractHttpLinksFromText, parsePackagesFromLinksText, isHttpLink, sanitizeFilename, formatEta, filenameFromUrl, looksLikeOpaqueFilename } from "../src/main/utils";
describe("utils", () => {
it("validates http links", () => {
expect(isHttpLink("https://example.com/file")).toBe(true);
expect(isHttpLink("http://example.com/file")).toBe(true);
expect(isHttpLink("ftp://example.com")).toBe(false);
expect(isHttpLink("foo bar")).toBe(false);
});
it("extracts links from text and trims trailing punctuation", () => {
const links = extractHttpLinksFromText("See (https://example.com/test) and https://rapidgator.net/file/abc123, plus https://example.com/a.b.");
expect(links).toEqual([
"https://example.com/test",
"https://rapidgator.net/file/abc123",
"https://example.com/a.b"
]);
});
it("sanitizes filenames", () => {
expect(sanitizeFilename("foo/bar:baz*")).toBe("foo bar baz");
expect(sanitizeFilename(" ")).toBe("Paket");
expect(sanitizeFilename("test\0file.txt")).toBe("testfile.txt");
expect(sanitizeFilename("\0\0\0")).toBe("Paket");
expect(sanitizeFilename("..")).toBe("Paket");
expect(sanitizeFilename(".")).toBe("Paket");
expect(sanitizeFilename("release... ")).toBe("release");
expect(sanitizeFilename(" con ")).toBe("con_");
});
it("parses package markers", () => {
const parsed = parsePackagesFromLinksText(
"# package: A\nhttps://a.com/1\nhttps://a.com/2\n# package: B\nhttps://b.com/1\n",
"Default"
);
expect(parsed).toHaveLength(2);
expect(parsed[0].name).toBe("A");
expect(parsed[0].links).toHaveLength(2);
expect(parsed[1].name).toBe("B");
});
it("formats eta", () => {
expect(formatEta(-1)).toBe("--");
expect(formatEta(65)).toBe("01:05");
expect(formatEta(3661)).toBe("01:01:01");
});
it("normalizes filenames from links", () => {
expect(filenameFromUrl("https://rapidgator.net/file/id/show.part1.rar.html")).toBe("show.part1.rar");
expect(filenameFromUrl("https://debrid.example/dl/abc?filename=Movie.S01E01.mkv")).toBe("Movie.S01E01.mkv");
expect(filenameFromUrl("https://debrid.example/dl/%E0%A4%A")).toBe("%E0%A4%A");
expect(filenameFromUrl("https://debrid.example/dl/e51f6809bb6ca615601f5ac5db433737")).toBe("e51f6809bb6ca615601f5ac5db433737");
expect(filenameFromUrl("data:text/plain;base64,SGVsbG8=")).toBe("download.bin");
expect(filenameFromUrl("blob:https://example.com/12345678-1234-1234-1234-1234567890ab")).toBe("download.bin");
expect(looksLikeOpaqueFilename("download.bin")).toBe(true);
expect(looksLikeOpaqueFilename("e51f6809bb6ca615601f5ac5db433737")).toBe(true);
expect(looksLikeOpaqueFilename("movie.part1.rar")).toBe(false);
});
it("preserves unicode filenames", () => {
expect(sanitizeFilename("日本語ファイル.txt")).toBe("日本語ファイル.txt");
expect(sanitizeFilename("Ünïcödé Tëst.mkv")).toBe("Ünïcödé Tëst.mkv");
expect(sanitizeFilename("파일이름.rar")).toBe("파일이름.rar");
expect(sanitizeFilename("файл.zip")).toBe("файл.zip");
});
it("handles very long filenames", () => {
const longName = "a".repeat(300);
const result = sanitizeFilename(longName);
expect(typeof result).toBe("string");
expect(result.length).toBeGreaterThan(0);
// Long names are passed through unchanged: no truncation, no crash
expect(result).toBe(longName);
});
it("formats eta with very large values without crashing", () => {
const result = formatEta(999999);
expect(typeof result).toBe("string");
expect(result.length).toBeGreaterThan(0);
// 999999 seconds = 277h 46m 39s
expect(result).toBe("277:46:39");
});
it("formats eta with edge cases", () => {
expect(formatEta(0)).toBe("00:00");
expect(formatEta(NaN)).toBe("--");
expect(formatEta(Infinity)).toBe("--");
expect(formatEta(Number.MAX_SAFE_INTEGER)).toMatch(/^\d+:\d{2}:\d{2}$/);
});
it("extracts filenames from URLs with encoded characters", () => {
expect(filenameFromUrl("https://example.com/file%20with%20spaces.rar")).toBe("file with spaces.rar");
// %C3%A9 decodes to e-acute (UTF-8), which is preserved
expect(filenameFromUrl("https://example.com/t%C3%A9st%20file.zip")).toBe("t\u00e9st file.zip");
expect(filenameFromUrl("https://example.com/dl?filename=Movie%20Name%20S01E01.mkv")).toBe("Movie Name S01E01.mkv");
// Malformed percent-encoding should not crash
const result = filenameFromUrl("https://example.com/%ZZ%invalid");
expect(typeof result).toBe("string");
expect(result.length).toBeGreaterThan(0);
});
it("handles looksLikeOpaqueFilename edge cases", () => {
// Empty string -> sanitizeFilename returns "Paket" which is not opaque
expect(looksLikeOpaqueFilename("")).toBe(false);
expect(looksLikeOpaqueFilename("a")).toBe(false);
expect(looksLikeOpaqueFilename("ab")).toBe(false);
expect(looksLikeOpaqueFilename("abc")).toBe(false);
expect(looksLikeOpaqueFilename("download.bin")).toBe(true);
// 24-char hex string is opaque (matches /^[a-f0-9]{24,}$/)
expect(looksLikeOpaqueFilename("abcdef123456789012345678")).toBe(true);
expect(looksLikeOpaqueFilename("abcdef1234567890abcdef12")).toBe(true);
// Short hex strings (< 24 chars) are NOT considered opaque
expect(looksLikeOpaqueFilename("abcdef12345")).toBe(false);
// Real filename with extension
expect(looksLikeOpaqueFilename("Show.S01E01.720p.mkv")).toBe(false);
});
});
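The `formatEta` expectations above fully determine a small formatter: `--` for negative or non-finite input, `MM:SS` under an hour, `HH:MM:SS` otherwise, with hours allowed to exceed two digits. A hypothetical sketch consistent with those assertions (the real `src/main/utils.ts` may differ):

```typescript
// Hypothetical sketch of formatEta matching the test expectations above;
// the actual implementation lives in src/main/utils.ts.
function formatEta(seconds: number): string {
  // Negative, NaN, and Infinity all render as the placeholder
  if (!Number.isFinite(seconds) || seconds < 0) return "--";
  const total = Math.floor(seconds);
  const h = Math.floor(total / 3600);
  const m = Math.floor((total % 3600) / 60);
  const s = total % 60;
  const mm = String(m).padStart(2, "0");
  const ss = String(s).padStart(2, "0");
  // padStart(2) leaves 3+ digit hour counts intact, e.g. "277:46:39"
  return h > 0 ? `${String(h).padStart(2, "0")}:${mm}:${ss}` : `${mm}:${ss}`;
}
```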

tsconfig.json Normal file

@@ -0,0 +1,16 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "ESNext",
"moduleResolution": "Bundler",
"jsx": "react-jsx",
"lib": ["ES2022", "DOM", "DOM.Iterable"],
"strict": true,
"skipLibCheck": true,
"esModuleInterop": true,
"resolveJsonModule": true,
"isolatedModules": true,
"types": ["node", "vite/client"]
},
"include": ["src", "tests", "vite.config.mts"]
}

vite.config.mts Normal file

@@ -0,0 +1,14 @@
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";
import path from "node:path";
export default defineConfig({
plugins: [react()],
base: "./",
root: path.resolve(__dirname, "src/renderer"),
publicDir: path.resolve(__dirname, "assets"),
build: {
outDir: path.resolve(__dirname, "build/renderer"),
emptyOutDir: true
}
});

vitest.config.ts Normal file

@@ -0,0 +1,9 @@
import { defineConfig } from "vitest/config";
export default defineConfig({
test: {
environment: "node",
include: ["tests/**/*.test.ts"],
globals: true
}
});