:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:
:  Multi-host Auto Downloader [aka MAD] (by kittykat)   :
:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:

Setup:
------
1. Extract the archive to a folder of your choice.
2. Ensure the script is executable:
   # Open a terminal in the script folder (or open a terminal and cd into the script folder).
   # Run "chmod +x mad.sh" to give the script execute permission.
3. Configure the settings you want in the mad.sh script.
   * Optional config:
     - Copy mad.config from the Documentation folder into the script directory and configure
       the settings there instead. This file is loaded on startup and overrides any settings
       configured in the script -- this lets you upgrade mad.sh without having to reconfigure
       the settings across versions.
   * Optional curl_impersonate:
     - See the Optional Dependencies section below.
4. Add urls to urls.txt or any other file you wish to use.
5. Run "./mad.sh urls.txt" (see the usage section below for additional commands, or run
   "./mad.sh ?" for help).

Optional Dependencies:
----------------------
Some hosts use CloudFlare to detect and block scripts (such as hexload). To get around this,
the script needs to impersonate a browser, so you'll need to download "curl-impersonate".
It can be obtained on GitHub; search online for "curl-impersonate".

To access the releases on GitHub without javascript:
1. Visit the GitHub page of curl-impersonate and add "/releases/latest/" to the end of the URL.
2. You'll be redirected to the latest version, e.g. "/releases/tag/vX.Y.Z".
3. In the URL, replace "tag" with "expanded_assets", e.g. "/releases/expanded_assets/v0.5.4".

- Download the archive "curl-impersonate-vX.Y.Z.x86_64-linux-gnu.tar.gz".
- Extract the files "curl-impersonate-ff" and "curl_ff109" next to this script or into your PATH.
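
If you prefer to fetch it from a terminal, the same steps can be scripted. This is only a
sketch: the repository path below is an assumption (use whichever curl-impersonate page you
found), and it uses plain curl since curl-impersonate itself is not installed yet.

  # Assumed repository path -- adjust to the project page you located.
  repo="lwthiker/curl-impersonate"

  # Steps 1-2 above: follow the "/releases/latest" redirect and read the tag from the Location header.
  tag=$(curl -sI "https://github.com/$repo/releases/latest" | grep -i '^location:' | sed 's|.*/tag/||' | tr -d '\r')

  # Download the linux-gnu archive named above and extract it.
  curl -LO "https://github.com/$repo/releases/download/$tag/curl-impersonate-$tag.x86_64-linux-gnu.tar.gz"
  tar -xzf "curl-impersonate-$tag.x86_64-linux-gnu.tar.gz"
  # Make sure "curl-impersonate-ff" and "curl_ff109" end up beside mad.sh (or in your PATH).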

Usage (urls.txt):
-----------------
- Enter 1f, hexload, pixeldrain, kraken, dosya, filehaus, oshi, upload.ee, uploadhive, or
  uploadflix urls in urls.txt (one url per line).
- ! No trailing spaces, BUT the file requires at least one empty newline at the end !
- Comment lines and garbage lines (lines beginning with '#' or non-http lines) are accepted
  and ignored.

Keywords (urls.txt):
--------------------
folder=        - Changes the save-to folder (lines beginning with the keyword "folder=")
                 e.g. folder=This is my folder! (vids) [2024]
|filename.ext  - Overrides the filename to save the download as (add the suffix "|Filename.ext" to a url)
                 e.g. http://oshi.at/abcd|My new filename.rar
direct=        - Downloads directly from the url (special processing for Lainsafe, FileDoge,
                 NekoFile, DiscreetShare, and others)
                 e.g. direct=https://oshi.at/abcd/ABCD
#pw=           - Updates the $CurrentZipPassword variable in mad.sh (can be accessed in plugins)
STOP!          - Stops processing at this line
RELOAD!        - Forces a reload of the script and the urls.txt file with the same command line
                 it started with

Example:
--------
folder=New folder 01 #ref=http://urlToThreadOrPost #pw=**1234567890$$
https://1fichier.com/?123456789abcdefghijk
http://hexload.com/123456789abc

folder=New folder 02 #pw=4444555551-1
http://5ety7tpkim5me6eszuwcje7bmy25pbtrjtue7zkqqgziljwqy3rrikqd.onion/ZaZa/12az.rar
http://oshi.at/AAzz/11ZZ.rar|File - Set 001 (2001).7z
http://oshi.at/AAyy/11YY.rar|File - Set 002 (2001).7z
http://pixeldrain.com/u/ZZaa0011

folder=Direct link fun #pw=2022234092
direct=http://pomf2.lain.la/f/abcd123456789.7z
direct=http://pomf2.lain.la/f/ABCD998877000.rar|This is it [2022].rar

------ Informational Display -------------------------
[Status] of urls in urls.txt
  ./mad.sh status urls.txt
[Reset] failed / retry urls in urls.txt
  ./mad.sh reset urls.txt
[Host] modules and their internal description
  ./mad.sh hosts
[Plugins] and their internal description
  ./mad.sh plugins

------ Basic Usage (Uploads) -------------------------
[Upload] launch the MAD Uploader (processes files in the ./uploads/ folder to selected hosts)
  ./mad.sh upload

------ Basic Usage (Downloads) -----------------------
[Run]
  ./mad.sh urls.txt

## Multi Runs: (multi-terminals / all-hosts / specific-host) ##
---------------------------------------------------------------
[Normal Mode] Process urls.txt in order with multiple terminals downloading
  (OS agnostic, run the same command in X or more separate terminals)
  ./mad.sh urls.txt
  ./mad.sh urls.txt

  (OS dependent, X terminals for all hosts -- whonix tested)
  ./mad.sh multi [2-8] urls.txt

[Specific Host] Process only one host per terminal
  (OS agnostic, run in X separate terminals)
  ./mad.sh 1f urls.txt
  ./mad.sh hex urls.txt
  ./mad.sh pd urls.txt
  ./mad.sh kraken urls.txt
  ./mad.sh dosya urls.txt
  ./mad.sh fh urls.txt
  ./mad.sh oshi urls.txt
  ./mad.sh upee urls.txt
  ./mad.sh uphive urls.txt
  ./mad.sh upflix urls.txt

[**Multi Specific Host] Create X terminals for a specific host and process downloads in order
  (**OS dependent, X terminals for a specific host -- whonix tested)
  ./mad.sh multi 1f 2 urls.txt
  ./mad.sh multi hex 2 urls.txt
  ./mad.sh multi pd 2 urls.txt
  ./mad.sh multi kraken 2 urls.txt
  ./mad.sh multi dosya 2 urls.txt
  ./mad.sh multi fh 2 urls.txt
  ./mad.sh multi oshi 2 urls.txt
  ./mad.sh multi upee 2 urls.txt
  ./mad.sh multi uphive 2 urls.txt
  ./mad.sh multi upflix 2 urls.txt

[**Multi Auto] Create 1 terminal for each host and process downloads in order
  (**OS dependent, 1 terminal per host -- whonix tested)
  ./mad.sh multi auto urls.txt

[**Multi Auto] Create 4 terminals (1 terminal for each host) and process downloads in order
  (**OS dependent, 1 terminal per host -- whonix tested)
  ./mad.sh multi auto 4 urls.txt
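
Example workflow (resume after failed downloads):
-------------------------------------------------
A minimal sketch that simply chains the commands documented above -- check the status,
re-queue the failed / retry urls, then run again:

  ./mad.sh status urls.txt   # show the status of the urls in urls.txt
  ./mad.sh reset urls.txt    # reset the failed / retry urls
  ./mad.sh urls.txt          # process urls.txt again to pick up the reset entries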