mirror of http://git.nowherejezfoltodf4jiyl6r56jnzintap5vyjlia7fkirfsnfizflqd.onion/nihilist/darknet-lantern.git
update roadmap + start working on simplex
parent 760ee9d1b9
commit 5f7e87a2e1

 README.md | 20 ++++++++++----------
 1 file changed, 10 insertions(+), 10 deletions(-)
@@ -102,20 +102,12 @@ V1.0.2:
 
 Future Versions:
 ```
 
-V1.0.3:
-- py: option 4: at the end of the synchronization phase, iterate over (your own) unverified links that do NOT have a description, and find the first description in other participants' verified.csv file to put in there, to enrich your unverified list (and skip if nobody wrote a description for it)
-- py+csv: expand on the participants trust levels, add 3 columns to be filled with -1, 0 or 1 (verified.csv, blacklist.csv, webring-participants.csv) to customize your instance's behavior in regards to other participants
-- py: in option 4, make sure that new webring participants (that are listed from other webrings) are automatically added on your own instance as well, in case you trust an instance's list of webring participants (opt-in only)
-
-V1.0.4:
-- py: add an optional way to run lantern.py without any manual inputs by passing arguments (e.g. python3 lantern.py 1 name desc link "description") or simply (python3 lantern.py 4) to synchronize links --> for all options! either manual lantern.py or prompt-less lantern.py with arguments
-- docker: figure out how to dockerize the darknet lantern project while maintaining the onion-only requirement (c0mmando is on it, will merge it when he finishes)
-
-V1.1.0 (SimpleX chatrooms and servers uptime ):
+V1.1.0 (SimpleX chatrooms and servers uptime): (WIP)
+- V1.1.0: find a way to check if a simplex invite link is still joinable without joining it every time (it should only join it ONCE) -> make it remember that this invite link equates to this chatroom name; then, if the chatroom name is already known, instead of trying to join it, look for a "this chatroom has been deleted" message, and if that message is absent, assume the chatroom is still joinable. (The problem is that the bot should not join every chatroom every 3 hours just to check uptime; doing that pollutes the chatrooms.)
 - V1.1.1: using regex alone, create the functions "isSimpleXChatroomValid" and "IsSimplexServerValid"; they should return True for ALL the different syntaxes of simplex chatroom invite links and of smp and xftp servers (sketch below)
 
 - V1.1.2: uptime.py: make it able to check the uptime of 1) onion links, 2) simplex chatroom links (ONLY if it can query the simplex daemon on 127.0.0.1:3030), and 3) simplex smp and xftp servers
 
 
 V1.2.0 SimpleX Crawler:
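For the V1.1.1 item above, a minimal sketch of what the two validators could look like. The patterns are assumptions covering only the common syntaxes (the https://simplex.chat/contact#/ invite form, the simplex:/ short form, and smp:// / xftp:// server addresses); the roadmap asks for ALL syntaxes, which would need more variants:

```
import re

# invite links: the https://simplex.chat/contact#/ form or the simplex:/ short form
CHATROOM_RE = re.compile(
    r'^(https://simplex\.chat|simplex:)/(contact|invitation)#/\?.+$'
)
# server addresses: fingerprint[:password]@host[:port][,onion-host]
SERVER_RE = re.compile(
    r'^(smp|xftp)://[A-Za-z0-9_+=\-]+(:[^@]+)?'
    r'@[a-z0-9.\-]+(:[0-9]+)?(,[a-z0-9.\-]+)*$'
)

# function names (including their inconsistent casing) come from the roadmap
def isSimpleXChatroomValid(url: str) -> bool:
    return CHATROOM_RE.match(url.strip()) is not None

def IsSimplexServerValid(url: str) -> bool:
    return SERVER_RE.match(url.strip()) is not None

assert isSimpleXChatroomValid("https://simplex.chat/contact#/?v=2&smp=smp%3A%2F%2Fabc")
assert IsSimplexServerValid("smp://0YIesKkVHJLZXiuKdoC7@smp.example.com,abcdef.onion")
```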
@@ -125,12 +117,20 @@ V1.2.0 SimpleX Crawler:
 - V1.2.2: crawler.py: make the script categorize the onion links into "onion websites", the simplex chatroom invite links into "simplex chatrooms", and the simplex smp and xftp server links into "simplex servers" categories, AND in unverified.csv directly
 
+V1.2.4+ Webring Participants expansions:
+- V1.2.4: py: option 4: at the end of the synchronization phase, iterate over (your own) unverified links that do NOT have a description, and find the first description in other participants' verified.csv file to put in there, to enrich your unverified list (and skip if nobody wrote a description for it) (sketch below)
+- V1.2.5: py: in option 4, make sure that new webring participants (that are listed from other webrings) are automatically added on your own instance as well, in case you trust an instance's list of webring participants (opt-in only)
+- V1.2.6: py+csv: expand on the participants trust levels, add 3 columns to be filled with -1, 0 or 1 (verified.csv, blacklist.csv, webring-participants.csv) to customize your instance's behavior in regards to other participants (sketch below)
+
 V1.3.0 Onion Crawler:
 - V1.3.0: crawler.py: make the script iterate over every onion link in verified.csv, and from the page itself find every other a-href html/php/txt file on that link directly (recursively); however, it should have a limit to prevent crawling endlessly (make it configurable; for now it should crawl up to 10 sub-pages per onion site by default) (sketch below)
 - V1.3.1: crawler.py: make it download those webpages into a temporary folder "onioncrawling/{onionwebsitename1.onion,onionwebsitename2.onion}/{index.html,links.php}"; once a website has been crawled, make it delete the entire folder and mark it as crawled in onion-crawl.csv (columns: link (http://blahlbahadazdazaz.onion), crawled (y/n))
 - V1.3.2: crawler.py: in each crawled html/php/txt file, make it find every simplex chatroom link, simplex server link, and every onion link.
 - V1.3.3: crawler.py: with every link found, make sure it is properly categorized just like in V1.2.2, directly into unverified.csv
 
+V1.3.5+ Fully automatable lantern.py + docker:
+- V1.3.5: py: add an optional way to run lantern.py without any manual inputs by passing arguments (e.g. python3 lantern.py 1 name desc link "description") or simply (python3 lantern.py 4) to synchronize links --> for all options! either manual lantern.py or prompt-less lantern.py with arguments (sketch below)
+- V1.3.6: docker: figure out how to dockerize the darknet lantern project while maintaining the onion-only requirement (c0mmando is on it, will merge it when he finishes)
 
 V1.4.0+ PGP support:
 - csv+php+py: implement PGP support to list public pgp keys for verified websites
 
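For the V1.2.4 item, a minimal sketch of the enrichment pass, assuming "URL" and "Description" column headers and one directory per participant (both are assumptions, not the actual lantern.py schema):

```
import csv
from pathlib import Path

def first_description(url: str, participant_dirs: list) -> str:
    """Return the first non-empty description any participant wrote
    for this url in their verified.csv, or '' if nobody described it."""
    for pdir in participant_dirs:
        verified = Path(pdir) / "verified.csv"
        if not verified.exists():
            continue
        with verified.open(newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                if row.get("URL") == url and row.get("Description", "").strip():
                    return row["Description"].strip()
    return ""

def enrich_unverified(unverified_csv: str, participant_dirs: list) -> None:
    """Fill missing descriptions in our own unverified.csv in place,
    skipping links that no participant has described."""
    path = Path(unverified_csv)
    with path.open(newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        fields, rows = reader.fieldnames, list(reader)
    for row in rows:
        if not row.get("Description", "").strip():
            row["Description"] = first_description(row.get("URL", ""), participant_dirs)
    with path.open("w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(rows)
```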
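For V1.2.6, one way the three -1/0/1 trust columns could be read; the column names below are hypothetical, only the -1/0/1 semantics come from the roadmap (-1 = ignore that participant's file, 0 = neutral, 1 = trust it):

```
import csv

def trust_level(participant_row: dict, column: str) -> int:
    """Read one trust column, clamping anything unexpected to 0 (neutral)."""
    try:
        value = int(participant_row.get(column, 0) or 0)
    except ValueError:
        return 0
    return value if value in (-1, 0, 1) else 0

with open("webring-participants.csv", newline="", encoding="utf-8") as f:
    for participant in csv.DictReader(f):
        # "trust_verified" and "Name" are illustrative column names
        if trust_level(participant, "trust_verified") == 1:
            print(f"merging {participant.get('Name')}'s verified.csv as trusted")
        elif trust_level(participant, "trust_verified") == -1:
            print(f"ignoring {participant.get('Name')}'s verified.csv")
```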
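For V1.3.0, a sketch of the bounded crawl. It assumes a local Tor SOCKS proxy on the default 9050 port and the requests[socks] and beautifulsoup4 packages; none of this is confirmed by the commit itself:

```
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PROXIES = {"http": "socks5h://127.0.0.1:9050",
           "https": "socks5h://127.0.0.1:9050"}
MAX_PAGES = 10  # roadmap default: crawl at most 10 sub-pages per onion site

def crawl_site(start_url: str) -> set:
    """Breadth-first crawl of a single onion site, capped at MAX_PAGES;
    returns every link harvested along the way."""
    seen, queue, harvested = set(), [start_url], set()
    while queue and len(seen) < MAX_PAGES:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            page = requests.get(url, proxies=PROXIES, timeout=30)
        except requests.RequestException:
            continue  # dead page: skip it, keep crawling the rest
        for a in BeautifulSoup(page.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            harvested.add(link)
            # only recurse into html/php/txt pages on the same site
            if link.startswith(start_url) and link.endswith((".html", ".php", ".txt")):
                queue.append(link)
    return harvested
```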
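For V1.3.5, a sketch of the prompt-less dispatch on sys.argv; the three handlers are placeholders standing in for lantern.py's real menu options, not the actual internals:

```
import sys

def add_link(*fields):          # option 1: add an entry
    print("adding:", fields)

def synchronize_links():        # option 4: webring synchronization
    print("synchronizing links with webring participants...")

def interactive_menu():         # classic prompt-driven behavior
    print("no arguments given, falling back to the usual prompts")

def main() -> None:
    if len(sys.argv) > 1:
        option, args = sys.argv[1], sys.argv[2:]
        if option == "1":       # python3 lantern.py 1 name desc link "description"
            add_link(*args)
        elif option == "4":     # python3 lantern.py 4
            synchronize_links()
        else:
            sys.exit(f"option {option} not wired up in this sketch")
    else:
        interactive_menu()

if __name__ == "__main__":
    main()
```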