fixes and improvements
feat: oneTimeDownload for E621 and E926 #1
feat: selection for desired site
feat: new pretty logo
feat: auto updater
fix: spelling mistakes
docs: ReadMe updated and cleaned
refactor: config manager
refactor: __init__.py
Official-Husko committed Aug 17, 2023
1 parent a4ec09d commit cb7fccb
Showing 13 changed files with 249 additions and 118 deletions.
20 changes: 12 additions & 8 deletions .gitignore
@@ -1,14 +1,18 @@

config.json

modules/__pycache__/

__pycache__/
*.spec

dist/

build/

media/

.nn-d/
.env/
testing_accounts.txt
config.json.bak
old_config.json
testing_accounts.txt.bak
db/
outdated
modules/updateManager.old
modules/auto_update.py
runtime.log
delete-exe.bat
6 changes: 5 additions & 1 deletion Build Release.bat
@@ -1 +1,5 @@
pyinstaller --onefile --icon "icon.ico" --console --name "NN-Downloader" --upx-dir "Z:\Projects\Python\### UPX ###" --add-data="Z:/Projects/Python/NN-Downloader/.nn-d/Lib/site-packages/grapheme/data/*;grapheme/data/" main.py
".\.env\Scripts\activate" && pyinstaller --onefile --icon "icon.ico" --console --name "NN-Downloader" --upx-dir "Z:\Projects\Python\### UPX ###" --add-data="Z:/Projects/Python/NN-Downloader/.env/Lib/site-packages/grapheme/data/*;grapheme/data/" main.py

rmdir /s /q .\build
rmdir /s /q .\__pycache__
del ".\NN-Downloader.spec"
4 changes: 0 additions & 4 deletions Clean Folder.bat

This file was deleted.

27 changes: 16 additions & 11 deletions README.md
@@ -1,10 +1,10 @@
# NN-Downloader

Welcome to the successor of the [multporn image downloader][1] [v1][2] & [v2][1] and most downloaders out there regarding "NSFW" material. The NN-Downloader or Naughty-Naughty-Downloader (yes very creative i know) supports multiple sites with their official api (if available), proxies and its also portable.
Welcome to the successor of the [multporn image downloader v1][2] & [v2][1] and most downloaders out there regarding "NSFW" material. The NN-Downloader or Naughty-Naughty-Downloader (yes, very creative, I know) supports multiple sites via their official APIs (if available), supports proxies, and is also portable.

This is not the complete version and it only works on a [few][13] sites currently. The other parts are WIP and will be complete in the near future. More Documentation and other gibberish coming soo.
This project is unfinished and currently only works on the [listed][13] sites. More documentation and other gibberish are planned.

[Download][14]
[Windows Download][14] | [Linux Download][21]

<br />

@@ -13,19 +13,18 @@ This is not the complete version and it only works on a [few][13] sites currentl
- [E621][4] (API)
- [E926][5] (API)
- [Furbooru][6] (API)
- [Multporn][7]
- [Multporn][7]
- [Yiffer][8]
- [Luscious][16]
- [Luscious][16] ***(Currently Broken!)***

#### Planned:
- [YiffGallery][9]
- ~~[FurryBooru][10]~~ Currently not possible due to cloudflare issues.
- ~~[FurryBooru][10]~~ Currently not possible because Cloudflare blocks access when contacting their API.
- [BooruPlus][11]
- [nHentai][15]
- [Pixiv][17]
- [HentaiRead][18]


[1]:https://github.com/Official-Husko/multporn-image-downloader-v2
[2]:https://github.com/Official-Husko/multporn-image-downloader
[3]:https://rule34.xxx
@@ -44,14 +43,20 @@ This is not the complete version and it only works on a [few][13] sites currentl
[16]:https://luscious.net/
[17]:https://www.pixiv.net/
[18]:https://hentairead.com/
[19]:https://rule34.art/
[20]:https://2.multporn.net/
[21]:https://github.com/HttpAnimation/NN-Downloader-Linux


Further sites can be added. Just open a [support ticket][11] with the url to the site.
Further sites can be added. Just open a [support ticket][11] with the URL to the site.

<br />
<br />
<br />

#### Disclaimer
***I'am not in any way affiliated or working with these Sites. This is a unofficial project.***
*I would suggest you to use a customized Windows Terminal.
***I am not in any way affiliated with or working for these sites. This is an unofficial project.***
*I would suggest you use a customized terminal.*


[//]: # (Ignore These Lines Below)
[//]: # (Including mirror [rule34.art][19] & [2.multporn.net][20])
1 change: 1 addition & 0 deletions enable_env.bat
@@ -0,0 +1 @@
".\.env\Scripts\activate"
182 changes: 97 additions & 85 deletions main.py
@@ -1,157 +1,158 @@
from modules import E621, RULE34, ProxyScraper, FURBOORU, E926, Multporn, Yiffer, Luscious
from modules import *
import json
import os
from termcolor import colored
from ctypes import windll
from time import sleep
from sys import exit
import inquirer


version = "1.3.1"
version = "1.4.0"
windll.kernel32.SetConsoleTitleW(f"NN-Downloader | v{version}")
proxy_list = []
header = {"User-Agent":f"nn-downloader/{version} (by Official Husko on GitHub)"}
needed_folders = ["db", "media"]
database_list = ["e621.db"]

if os.path.exists("outdated"):
version_for_logo = colored(f"v{version}", "cyan", attrs=["blink"])
else:
version_for_logo = colored(f"v{version}", "cyan")

logo = f"""{colored(f'''
d8b db d8b db d8888b. .d88b. db d8b db d8b db db .d88b. .d8b. d8888b. d88888b d8888b.
888o 88 888o 88 88 `8D .8P Y8. 88 I8I 88 888o 88 88 .8P Y8. d8' `8b 88 `8D 88' 88 `8D
88V8o 88 88V8o 88 88 88 88 88 88 I8I 88 88V8o 88 88 88 88 88ooo88 88 88 88ooooo 88oobY'
88 V8o88 88 V8o88 C8888D 88 88 88 88 Y8 I8I 88 88 V8o88 88 88 88 88~~~88 88 88 88~~~~~ 88`8b
88 V888 88 V888 88 .8D `8b d8' `8b d8'8b d8' 88 V888 88booo. `8b d8' 88 88 88 .8D 88. 88 `88.
VP V8P VP V8P Y8888D' `Y88P' `8b8' `8d8' VP V8P Y88888P `Y88P' YP YP Y8888D' Y88888P 88 YD
{version_for_logo} | by {colored("Official-Husko", "yellow")}''', "red")}
"""

class Main():
def main_startup():
os.system("cls")
print(colored("======================================================================================================================", "red"))
print(colored("| |", "red"))
print(colored("| " + colored("Product: ", "white") + colored("NN-Downloader", "green") + colored(" |", "red"), "red"))
print(colored("| " + colored("Version: ", "white") + colored(version, "green") + colored(" |", "red"), "red"))
print(colored("| " + colored("Description: ", "white") + colored("Download Naughty images fast from multiple sites.", "green") + colored(" |", "red"), "red"))
print(colored("| |", "red"))
print(colored("======================================================================================================================", "red"))
print(logo)
print("")

# Check if media folder exists else create it
if not os.path.exists("media"):
os.mkdir("media")
# Check if needed folders exists else create them
for folder in needed_folders:
if not os.path.exists(folder):
os.mkdir(folder)

# Check if config exists and read it
if os.path.exists("config.json"):
with open("config.json") as cf:
config = json.load(cf)
user_proxies = config["proxies"]
user_OTD = config["oneTimeDownload"]
user_blacklist = config["blacklisted_tags"]
user_blocked_formats = config["blacklisted_formats"]

# Create a new config with default values
if os.path.exists("config.json"):
config = Config_Manager.reader()
oneTimeDownload = config["oneTimeDownload"]
use_proxies = config["proxies"]
checkForUpdates = config["checkForUpdates"]
else:
default_config = {
"proxies": "true",
"oneTimeDownload": "true",
"user_credentials": {
"e621": {
"apiUser": "",
"apiKey": ""
},
"e926": {
"apiUser": "",
"apiKey": ""
},
"rule34": {
"user_id": "",
"pass_hash": "",
"comment": "currently not used"
},
"yiffer": {
"username": "",
"email": "",
"id": "",
"comment": "currently not used"
},
"yiffgallery": {
"pwg_id": "",
"comment": "currently not used"
},
"furbooru": {
"apiKey": ""
}
},
"blacklisted_tags": [
"example1",
"example2"
],
"blacklisted_formats": [
"example1",
"example2"
]
}
with open("config.json", "w") as cc:
json.dump(default_config, cc, indent=6)
cc.close()
config = Config_Manager.creator()
print(colored("New Config file generated. Please configure it for your use case and add API keys for needed services.", "green"))
sleep(7)
exit(0)

if user_proxies == True:
if checkForUpdates == True:
os.system("cls")
print(logo)
print("")
print(colored("Checking for Updates...", "yellow"), end='\r')
AutoUpdate.Checker()
os.system("cls")
print(logo)
print("")

if use_proxies == True:
print(colored("Fetching Fresh Proxies...", "yellow"), end='\r')
ProxyScraper.Scraper(proxy_list=proxy_list)
print(colored(f"Fetched {len(proxy_list)} Proxies. ", "green"))
print("")

if oneTimeDownload == True:
for database in database_list:
with open(f"db/{database}", "a") as db_creator:
db_creator.close()

print(colored("What site do you want to download from?", "green"))
site = input(">> ").lower()
if site == "":
print(colored("Please enter a site.", "red"))
sleep(3)
Main.main_startup()
questions = [
inquirer.List('selection',
choices=['E621', 'E926', 'Furbooru', 'Luscious', 'Multporn', 'Rule34', 'Yiffer']),
]
answers = inquirer.prompt(questions)
print("")

site = answers.get("selection").lower()

if site in ["multporn", "yiffer", "luscious"]:
pass
else:
print(colored("Please enter the tags you want to use", "green"))
user_tags = input(">> ").lower()
if user_tags == "":
while user_tags == "":
print(colored("Please enter the tags you want.", "red"))
sleep(3)
Main.main_startup()
user_tags = input(">> ").lower()
print("")

print(colored("How many pages would you like to get?", "green"), " (leave empty for max)")
print(colored("How many pages would you like to get?", "green"), colored(" (leave empty for max)", "yellow"))
max_sites = input(">> ").lower()
print("")

if site == "e621":
apiUser = config["user_credentials"]["e621"]["apiUser"]
apiKey = config["user_credentials"]["e621"]["apiKey"]
if oneTimeDownload == True:
with open("db/e621.db", "r") as db_reader:
database = db_reader.read().splitlines()
if apiKey == "" or apiUser == "":
print(colored("Please add your Api Key into the config.json", "red"))
sleep(3)
sleep(5)
else:
E621.Fetcher(user_tags=user_tags, user_blacklist=user_blacklist, proxy_list=proxy_list, max_sites=max_sites, user_proxies=user_proxies, apiUser=apiUser, apiKey=apiKey, header=header)
E621.Fetcher(user_tags=user_tags, user_blacklist=config["blacklisted_tags"], proxy_list=proxy_list, max_sites=max_sites, user_proxies=config["proxies"], apiUser=apiUser, apiKey=apiKey, header=header, db=database)
elif site == "e926":
apiUser = config["user_credentials"]["e926"]["apiUser"]
apiKey = config["user_credentials"]["e926"]["apiKey"]
if oneTimeDownload == True:
with open("db/e621.db", "r") as db_reader:
database = db_reader.read().splitlines()
if apiKey == "" or apiUser == "":
print(colored("Please add your Api Key into the config.json", "red"))
sleep(3)
sleep(5)
else:
E926.Fetcher(user_tags=user_tags, user_blacklist=user_blacklist, proxy_list=proxy_list, max_sites=max_sites, user_proxies=user_proxies, apiUser=apiUser, apiKey=apiKey, header=header)
E926.Fetcher(user_tags=user_tags, user_blacklist=config["blacklisted_tags"], proxy_list=proxy_list, max_sites=max_sites, user_proxies=config["proxies"], apiUser=apiUser, apiKey=apiKey, header=header, db=database)
elif site == "rule34":
RULE34.Fetcher(user_tags=user_tags, user_blacklist=user_blacklist, proxy_list=proxy_list, max_sites=max_sites, user_proxies=user_proxies, header=header)
RULE34.Fetcher(user_tags=user_tags, user_blacklist=config["blacklisted_tags"], proxy_list=proxy_list, max_sites=max_sites, user_proxies=config["proxies"], header=header)
elif site == "furbooru":
apiKey = config["user_credentials"]["furbooru"]["apiKey"]
if apiKey == "":
print(colored("Please add your Api Key into the config.json", "red"))
sleep(3)
sleep(5)
else:
FURBOORU.Fetcher(user_tags=user_tags, user_blacklist=user_blacklist, proxy_list=proxy_list, max_sites=max_sites, user_proxies=user_proxies, apiKey=apiKey, header=header)
FURBOORU.Fetcher(user_tags=user_tags, user_blacklist=config["blacklisted_tags"], proxy_list=proxy_list, max_sites=max_sites, user_proxies=config["proxies"], apiKey=apiKey, header=header)
elif site == "multporn":
print(colored("Please enter the link. (e.g. https://multporn.net/comics/double_trouble_18)", "green"))
URL = input(">> ")
Multporn.Fetcher(proxy_list=proxy_list, user_proxies=user_proxies, header=header, URL=URL)
while URL == "":
print(colored("Please enter a valid link.", "red"))
sleep(1.5)
URL = input(">> ")
Multporn.Fetcher(proxy_list=proxy_list, user_proxies=config["proxies"], header=header, URL=URL)
elif site == "yiffer":
print(colored("Please enter the link. (e.g. https://yiffer.xyz/Howl & Jasper)", "green"))
URL = input(">> ")
Yiffer.Fetcher(proxy_list=proxy_list, user_proxies=user_proxies, header=header, URL=URL)
while URL == "":
print(colored("Please enter a valid link.", "red"))
sleep(1.5)
URL = input(">> ")
Yiffer.Fetcher(proxy_list=proxy_list, user_proxies=config["proxies"], header=header, URL=URL)
elif site == "luscious":
print(colored("Please enter the link. (e.g. https://www.luscious.net/albums/bifurcation-ongoing_437722)", "green"))
URL = input(">> ")
Luscious.Fetcher(proxy_list=proxy_list, user_proxies=user_proxies, header=header, URL=URL)
while URL == "":
print(colored("Please enter a valid link.", "red"))
sleep(1.5)
URL = input(">> ")
Luscious.Fetcher(proxy_list=proxy_list, user_proxies=config["proxies"], header=header, URL=URL)


else:
@@ -161,4 +162,15 @@ def main_startup():
Main.main_startup()

if __name__ == '__main__':
Main.main_startup()
try:
Main.main_startup()
except KeyboardInterrupt:
print("User Cancelled")
sleep(3)
exit(0)


"""
TODO: fix luscious being broken
"""
11 changes: 9 additions & 2 deletions modules/__init__.py
@@ -1,8 +1,15 @@
from .proxyScraper import ProxyScraper
from .configManager import Config_Manager
from .auto_update import AutoUpdate
from .logger import Logger
from .pretty_print import *


# Here are all modules for the sites that are supported
from .e621 import E621
from .e926 import E926
from .rule34 import RULE34
from .proxyscraper import ProxyScraper
from .furbooru import FURBOORU
from .e926 import E926
from .multporn import Multporn
from .yiffer import Yiffer
from .luscious import Luscious