The Omniverse Library – Knowledge For Life Volume I


The Omniverse Library:
A diverse reading list spanning several topics.
The Omniverse Library boasts an extensive collection of resources covering a wide range of subjects, including science, history, philosophy, and the occult. Users can access a plethora of articles, books, research papers, manuscripts, and multimedia content curated from reputable sources worldwide.

Continuous Enrichment: The Omniverse Library is a dynamic platform continually enriched with new additions and updates. With regular contributions from experts, scholars, and content creators, the library remains a vital source of knowledge, fostering intellectual growth and exploration in an ever-evolving world.

Join the Quest for Knowledge: Embark on a journey of discovery and enlightenment with The Omniverse Library—an unparalleled digital repository where the boundaries of human understanding are transcended, and the pursuit of truth knows no bounds.

American & World History, Science, Philosophy, The Occult, Survival & Of Course.. some Miscreant Materials.
Carl Sagan, Isaac Newton, Nikola Tesla, Sun Tzu, Aleister Crowley, Karl Marx, Anarchist Cookbook, Bushcraft




PHP Header Request Spoofing: IP Address, User Agent, Geo-Location


Random HTTP Request Generator – “generator.php”

This generates the Header Request Information to be sent to a Destination URL.
For Testing Purposes Only – Some Files Have Been Excluded.
The Destination URL tracks incoming HTTP Requests and filters them for “bad data” or “Spoofed Requests” such as the requests generated here.
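
generator.php itself is among the excluded files, but for a rough idea of what a spoofed request looks like, here's a minimal shell sketch using curl. The header values, user agent, and destination URL below are made-up examples, not what the actual generator sends:

curl -s "http://destination.example/track.php" \
     --header "X-Forwarded-For: 203.0.113.42" \
     --header "Accept-Language: de-DE" \
     --user-agent "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" \
     --referer "http://www.example.com/"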

FFmpeg Video and Photo Software

Timed Photo Capture From Multiple Cameras And Archiving Script

This is pretty basic and I like it that way.
It uses ffmpeg to capture a photo from my laptop's integrated webcam and from the USB webcam plugged into it, then adds the photos to an archive that grows with each run.
Part of a bigger project.

#!/bin/bash
# Set date for file naming
date=$(date +"%Y-%m-%d_%H%M%S")
    # Take photo using Integrated Webcam
      ffmpeg -f v4l2 -video_size 1280x720 -i /dev/video0 -frames:v 1 "int.$date.jpg"

    # Take photo using USB Webcam
      ffmpeg -f v4l2 -video_size 1280x720 -i /dev/video1 -frames:v 1 "usb.$date.jpg"

    # Add all .jpg files to payload.zip
      zip payload.zip ./*.jpg

    # Remove all .jpg files now
      rm ./*.jpg

    # Set time between photos
      sleep 10

    # Exit and start over
./"$(basename "$0")" && exit
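
If you're not sure which camera lives at which /dev/video* node, v4l2-ctl (from the v4l-utils package, assuming you have it installed) will list them:

v4l2-ctl --list-devices
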
Netcat File Transfer / Chat Utility: Send & Receive Files

Netcat Scheduled Server / Client File Transfer Script

Using Netcat may be “Old School”, but so am I, so I love using Netcat for simple tasks or just chatting without Big Brother paying too much attention. I love using Bold Text too.

These are two separate scripts: “server.sh”, for use on a server (home PC/Pi/laptop or any server that lets you run Netcat), and “client.sh”, which you can use on your Android, laptop, etc. from a mobile location.
Of course you're going to have to set permissions and run them. I highly suggest editing out the sleep function and using cron if you're savvy, as this is really meant to update files from things like remote sensors, cameras, etc.; see the crontab sketch below.

*Edit the IP address to your server in client.sh.
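
If you do go the cron route, a crontab entry along these lines (the path is just an example) runs the client every 10 minutes with no sleep loop at all:

*/10 * * * * /home/pi/client.sh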

server.sh

#!/bin/bash
clear
    echo "Server Running."
        mkdir -p incoming
    date="$(date +'%Y-%m-%d_%H-%M')"
    file="incoming/payload.file"
# Set the Server's Port To Listen On
    nc -l 1234 > "$file"
        mv "$file" "incoming/$date.payload"
    echo "File Received."
    sleep 10
./"$(basename "$0")" && exit

client.sh

#!/bin/bash
clear
mkdir -p outgoing
    echo "Client Running."
        file="outgoing/payload.file"
# For Demo Only
    touch "$file"
    echo "Some Data" >> "$file"
# Set The Server IP and Port To Connect To
    nc -w 3 192.168.1.XXX 1234 < "$file"
    echo "File Sent."
    sleep 60
./"$(basename "$0")" && exit
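
And since I mentioned chatting: the no-frills version is just a listener on one machine and a connection from the other (the IP and port below are examples). Whatever either side types shows up on the other end:

# On the first machine:
nc -l 1234
# On the second machine:
nc 192.168.1.10 1234
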
BashKat Web Scraper

BashKat Web Scraping Utility Script

BashKat is pretty straightforward and really easy to use.
I made sure to add some “cute” to it with the emojis.
This bot will scrape from user input or from a file of URLs (example: urls.txt) using wget, and it's Super Fun when using Proxychains.
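
Assuming you save it as bashkat.sh (name it whatever you like), running it straight or through Proxychains looks like this:

chmod +x bashkat.sh
./bashkat.sh
# Or route every request through your proxychains config:
proxychains ./bashkat.sh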


#!/usr/bin/env bash
# BashKat Version 1.0.2
# K0NxT3D

# Variables
BotOptions="Url File Quit"

# Welcome Banner
clear
printf "✨ BashKat 1.0 ✨\nScrape Single URL/IP or Multiple From File.\n\n" && sleep 1

# Bot Options Menu
select option in $BotOptions; do

# Single URL Scrape
   if [ "$option" = "Url" ];
    then
      printf "URL To Scrape: "
       read scrapeurl
     mkdir -p data/
    wget -P data/ \
     -4 \
     -w 0 \
     -t 3 \
     -rkpN -e robots=off \
     --header="Accept: text/html" \
     --user-agent="BashKat/1.0 (BashKat 1.0 Web Scraper Utility +http://www.bashkat.bot/)" \
     --referer="http://www.bashkat.bot" \
     --random-wait \
     --recursive \
     --no-clobber \
     --page-requisites \
     --convert-links \
     --restrict-file-names=windows \
     --domains $scrapeurl \
     --no-parent \
         $scrapeurl

      printf "🏁Scrape Complete.\nHit Enter To Continue.👍"
       read anykey
./$(basename $0) && exit

  elif [ "$option" = "File" ];
   then
      printf "Path To File: "
       read filepath
     while IFS= read -r scrapeurl
      do
     mkdir -p data/
    wget -P data/ \
     -4 \
     -w 0 \
     -t 3 \
     -rkpN -e robots=off \
     --header="Accept: text/html" \
     --user-agent="BashKat/1.0 (BashKat 1.0 Web Scraper Utility +http://www.bashkat.bot/)" \
     --referer="http://www.bashkat.bot" \
     --random-wait \
     --recursive \
     --no-clobber \
     --page-requisites \
     --convert-links \
     --restrict-file-names=windows \
     --domains $scrapeurl \
     --no-parent \
         $scrapeurl 
     done < "$filepath"
      printf "🏁Scrape Complete.\nHit Enter To Continue.👍"
       read anykey
./$(basename $0) && exit

 elif [ "$option" = "Quit" ];
 then
   printf "Quitting🏳"
    sleep 1
     clear
      exit
# ERRORS
  else
   clear
    printf "❌"
    sleep 1
   ./"$(basename "$0")" && exit
  fi
 exit
done
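
For the File option, the input is just one URL or IP per line. An example urls.txt (the entries are placeholders) might look like:

example.com
https://example.org/docs/
203.0.113.7
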
Paparazzi ScreenShot Bot

Paparazzi Screenshot Script (Bot)

Paparazzi: a basic screenshot utility (bot) for collecting screenshots of IP addresses and/or URLs.
This script requires a personal API key, and you can get yours here: BrowShotAPI
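
Before wiring your key into the script, it's worth testing it with a one-off call. This mirrors the request the script makes, with YOUR_API_KEY standing in for your actual key:

curl -L "https://api.browshot.com/api/v1/simple?url=example.com&key=YOUR_API_KEY" -o test.png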

#!/usr/bin/env bash
# Paparazzi 1.0 - K0NxT3D 2021
# Website ScreenShot Utility
# Bash: ./bot

# Variables
    BotOptions="Url File Quit"
    images="./images"
    apikey="ENTER YOUR KEY HERE"

# Welcome Banner - Create Directories
    mkdir -p "$images"
    clear
    printf "Paparazzi 1.0 - K0NxT3D\n"
    sleep 1

# Bot Options Menu
    select option in $BotOptions; do

# Single URL/IP Scan
    if [ "$option" = "Url" ];
     then
      printf "IP/URL To Scan: "
    read url
        curl -L "https://api.browshot.com/api/v1/simple?url=$url&key=ENTER YOUR KEY HERE" -o $images/$url.png
      printf "Finished Scanning.\nHit Enter To Continue.."
    read anykey
./bot

# Multiple URL/IP Scan From File (example: urls.txt)
    elif [ "$option" = "File" ];
     then
      printf "Path To File: "
    read filepath
        while IFS= read -r url
        do
          # Sanitize the URL so it makes a safe file name
          fname="${url//[^A-Za-z0-9._-]/_}"
          curl -L "https://api.browshot.com/api/v1/simple?url=$url&key=$apikey" -o "$images/$fname.png"
        done < "$filepath"
    printf "Finished Scanning.\nHit Enter To Continue.."
 read anykey
./bot

# Quitter!!!
    elif [ "$option" = "Quit" ];
     then
        printf "Quitting!"
    sleep 1
clear
exit

# ERRORS
    else
    clear
        printf ""
    sleep 1
   bash ./bot
fi
exit
done