The Omniverse Library – Knowledge For Life Volume I

The Omniverse Library:
A diverse reading list spanning several topics.
The Omniverse Library boasts an extensive collection of resources covering a wide range of subjects, including science, history, philosophy, and the occult. Users can access a plethora of articles, books, research papers, manuscripts, and multimedia content curated from reputable sources worldwide.

Continuous Enrichment: The Omniverse Library is a dynamic platform continually enriched with new additions and updates. With regular contributions from experts, scholars, and content creators, the library remains a vital source of knowledge, fostering intellectual growth and exploration in an ever-evolving world.

Join the Quest for Knowledge: Embark on a journey of discovery and enlightenment with The Omniverse Library—an unparalleled digital repository where the boundaries of human understanding are transcended, and the pursuit of truth knows no bounds.

Categories: American & World History, Science, Philosophy, The Occult, Survival and, of course, some Miscreant Materials.
Featured: Carl Sagan, Isaac Newton, Nikola Tesla, Sun Tzu, Aleister Crowley, Karl Marx, Anarchist Cookbook, Bushcraft.

DSX "Pure SEO" Content Management System

DSX Version 7-1.2.5 (DS7) “Pure SEO” Content Management System. (Release Update V7-1.2.5)

While this CMS is considered “Black Hat”, it is what it is and it works.
Search engines have priorities in what ranks and what doesn’t, and anyone
who wants the Top Ten knows the most important things: your pages have to
load fast, your content has to be abundant and dense, and most of all you
need hypertext links.

DSX delivers on all aspects of the fast-ranking “Pure SEO” tactics I’ve developed
over the last 20+ years as a professional SEO expert, and I stand behind my work.
I’m offering DSX 7-1.2.5 at a very affordable price because it’s very small at this
point, which makes it relatively easy for you to build on it; or, if you’re patient,
you can wait for the next version with far more features.

Installation & Troubleshooting.
View Demo
https://wordpress.org/download/

DarkMatter Display Random Quotes Plugin For WordPress

The DarkMatter Display Random Quotes Plugin for WordPress is a very simple and straightforward plugin that appends a random quote to the bottom of WordPress posts.
You can edit the quotes.txt file (plugins/quote-of-the-day/quotes.txt) to add or remove your own quotes.
You can see it running below this post.
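
The plugin’s full source isn’t shown in this post, but the core of the idea fits in a few lines. Here’s a minimal sketch, assuming the quotes.txt path from the description; the function name qotd_append_random_quote is just an illustrative placeholder:

<?php
/*
Plugin Name: Quote Of The Day (minimal sketch)
Description: Appends a random quote from quotes.txt to each post.
*/

// Append a random line from quotes.txt to single-post content.
function qotd_append_random_quote($content) {
    if (!is_single()) {
        return $content;
    }
    $quotes = file(__DIR__ . '/quotes.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if (empty($quotes)) {
        return $content;
    }
    return $content . '<p><em>' . esc_html($quotes[array_rand($quotes)]) . '</em></p>';
}
add_filter('the_content', 'qotd_append_random_quote');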

Download: DarkMatter – Quote Of The Day WordPress Plugin

PHP Header Request Spoofing: IP Address, User Agent, Geo-Location

Generate Random HTTP Request

Random HTTP Request Generator – “generator.php”

This generates randomized HTTP request header information to be sent to a Destination URL.
For Testing Purposes Only – Some Files Have Been Excluded.
The Destination URL tracks incoming HTTP requests and filters them for “bad data” or
“spoofed requests” such as the ones generated here.
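
The actual generator.php isn’t reproduced here, but a minimal sketch of the idea looks like this, assuming PHP’s cURL extension; the user agents, locales, and destination address are illustrative placeholders:

<?php
// Pools of header values to draw from (illustrative placeholders).
$user_agents = array(
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
    'Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0',
    'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15',
);
$locales = array('en-US,en;q=0.9', 'de-DE,de;q=0.8', 'fr-FR,fr;q=0.7');

// Random IPv4 address for the spoofable forwarding headers.
$ip = rand(1, 254) . '.' . rand(0, 254) . '.' . rand(0, 254) . '.' . rand(1, 254);

$headers = array(
    'User-Agent: ' . $user_agents[array_rand($user_agents)],
    'Accept-Language: ' . $locales[array_rand($locales)],
    'X-Forwarded-For: ' . $ip,
    'Client-IP: ' . $ip,
);

// Send the spoofed request to the Destination URL (placeholder address).
$ch = curl_init('http://www.destination-url.test/');
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
echo $response;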

Server Status Monitor PHP Code

PHP Monitor for Active and Inactive Servers

This is a simple, no-frills way to monitor your servers.
You will need to create the file “urls.txt” in the same folder as the “active-servers.php” file.

Examples – urls.txt:
https://www.my-website.com/
http://localhost/
http://www.my-website.com/some/page.php

Navigate to the active-servers.php file (http://www.your-site.com/scripts/active-servers.php).
Online servers will be shown in Lime and offline servers in Red.

active-servers.php

<head>
<title>Active Servers</title>
</head>
<body style="background-color: #0c0c0c;">
<div>
<table width="100%" height="100%">
<tr>
<td>
<table style="margin-bottom: 100%; text-transform: uppercase;">
<?php
$fn = fopen("urls.txt", "r");
while (!feof($fn)) {
    $result = trim(fgets($fn));
    if ($result === '') { continue; } // skip blank lines
    $server = gethostbyname(parse_url($result, PHP_URL_HOST));
    // @ suppresses the warning on refused connections; the 3-second timeout keeps the page responsive
    if (@fsockopen($server, 80, $errno, $errstr, 3)) {
        echo '<tr><td><b><a style="color: Lime; text-decoration: none;" href="' . $result . '" target="viewer">' . $result . '</a></b></td></tr>' . "\r\n";
    } else {
        echo '<tr><td style="color: #a00000">' . $result . '</td></tr>' . "\r\n";
    }
}
fclose($fn);
?>
</table>
</td>
<td width="100%"><iframe align="right" class="viewpanel" src="" name="viewer" frameborder="0" width="100%" height="600px"></iframe></td>
</tr>
</table>
</div>
</body>

Spoofing Random Toys: MySQL, WordPress, Form Data Fields

Fake Email Generator: Create Random Email Addresses From Files

This is just a fun little toy that happened while working on MySQL automation.
The files used are first_names.txt, last_names.txt and domains.txt.
Reading a random line from each file in order builds the “Fake Email Address”; with array_rand() choosing the lines, each address is somewhat unique, as I’m only using 80,000 names (give or take a few hundred).

All Files: fake-email-generator.zip


<?php
// Source files: one entry per line.
$first_names = 'first_names.txt';
$last_names = 'last_names.txt';
$dom = 'domains.txt';

    // Pick a random first name.
    $firstname = file($first_names);
    $first = $firstname[array_rand($firstname)];

    // Pick a random last name.
    $lastname = file($last_names);
    $last = $lastname[array_rand($lastname)];

    // Pick a random domain ending from domains.txt.
    $comd = file($dom);
    $com = $comd[array_rand($comd)];

    // Strip whitespace/newlines and lowercase the parts.
    $first = strtolower(preg_replace('/\s+/', '', $first));
    $last = strtolower(preg_replace('/\s+/', '', $last));
    $com = preg_replace('/\s+/', '', $com);

    echo $first . "@" . $last . $com;

PHP, Shell & Bash: Website Security, Encryption, Decryption, Hashing, Encoding & Decoding
FFmpeg Video and Photo Software

Time Capturing Photos From Multiple Cameras And Archiving Script

This is pretty basic and I like it that way.
It uses ffmpeg to capture a photo from the integrated webcam on my laptop and from the USB webcam plugged in, then adds each round of photos to an archive.
Part of a bigger project.

#!/bin/bash
# Set date for file naming
date=$(date +"%Y-%m-%d_%H%M%S")
    # Take photo using Integrated Webcam
      ffmpeg -f v4l2 -video_size 1280x720 -i /dev/video0 -frames 1 "int.$date.jpg"

    # Take photo using USB Webcam
      ffmpeg -f v4l2 -video_size 1280x720 -i /dev/video1 -frames 1 "usb.$date.jpg"

    # Add all .jpg files to payload.zip (zip appends to an existing archive)
      zip payload.zip *.jpg

    # Remove the .jpg files now that they're archived
      rm *.jpg

    # Set time between photos
      sleep 10

    # Start over (exec restarts the script without stacking child processes)
exec "$0"

Cryptography & Cryptology: OpenSSL, Base64, MD5, Security

OpenSSL Basic Encryption Script With Random Password Generation

Example script using OpenSSL AES-256 with salt and a randomly generated password.
It’s the little things.

#!/bin/bash
clear
echo "Input String:"
    read input
        # 1024-character alphanumeric password read from /dev/urandom
        pass=$(tr -dc 'a-zA-Z0-9' < /dev/urandom | fold -w 1024 | head -n 1)
        encrypt="$(echo "$input" | openssl aes-256-cbc -pbkdf2 -iter 20000 -salt -a -e -k "$pass")"
        decrypt="$(echo "$encrypt" | openssl aes-256-cbc -pbkdf2 -iter 20000 -salt -a -d -k "$pass")"
    echo "Encrypted String: $encrypt"
    echo "Decrypted String: $decrypt"
    echo "Hit Any Key.."
  read anykey
exec "$0"

BashKat Web Scraper

BashKat Web Scraping Utility Script

BashKat is pretty straightforward and really easy to use.
I made sure to add some “cute” to it with the emojis.
This bot will scrape from user input or from a file of URLs (example: urls.txt) using wget, and it’s Super Fun when using Proxychains.


#!/usr/bin/env bash
# BashKat Version 1.0.2
# K0NxT3D

# Variables
BotOptions="Url File Quit"

# Welcome Banner
clear
printf "✨ BashKat 1.0 ✨\nScrape Single URL/IP or Multiple From File.\n\n" && sleep 1

# Bot Options Menu
select option in $BotOptions; do

# Single URL Scrape
   if [ "$option" = "Url" ];
    then
      printf "URL To Scrape: "
       read scrapeurl
     mkdir -p data/
    wget -P data/ \
     -4 \
     -w 0 \
     -t 3 \
     -rkpN -e robots=off \
     --header="Accept: text/html" \
     --user-agent="BashKat/1.0 (BashKat 1.0 Web Scraper Utility +http://www.bashkat.bot/)" \
     --referer="http://www.bashkat.bot" \
     --random-wait \
     --recursive \
     --no-clobber \
     --page-requisites \
     --convert-links \
     --restrict-file-names=windows \
     --domains $scrapeurl \
     --no-parent \
         $scrapeurl

      printf "🏁Scrape Complete.\nHit Enter To Continue.👍"
       read anykey
./$(basename $0) && exit

  elif [ "$option" = "File" ];
   then
      printf "Path To File: "
       read filepath
     while IFS= read -r scrapeurl
      do
     mkdir -p data/
    wget -P data/ \
     -4 \
     -w 0 \
     -t 3 \
     -rkpN -e robots=off \
     --header="Accept: text/html" \
     --user-agent="BashKat/1.0 (BashKat 1.0 Web Scraper Utility +http://www.bashkat.bot/)" \
     --referer="http://www.bashkat.bot" \
     --random-wait \
     --recursive \
     --no-clobber \
     --page-requisites \
     --convert-links \
     --restrict-file-names=windows \
     --domains $scrapeurl \
     --no-parent \
         $scrapeurl 
     done < "$filepath"
      printf "🏁Scrape Complete.\nHit Enter To Continue.👍"
       read anykey
./$(basename $0) && exit

 elif [ "$option" = "Quit" ];
 then
   printf "Quitting🏳"
    sleep 1
     clear
      exit
# ERRORS
  else
   clear
    printf "❌"
    sleep 1
   exec "$0"
  fi
 exit
done