
DarkMatter Display Random Quotes Plugin For WordPress

The DarkMatter Display Random Quotes Plugin For WordPress is a simple, straightforward plugin that appends a random quote to the bottom of WordPress posts.
You can edit the quotes.txt file (plugins/quote-of-the-day/quotes.txt) to add or remove quotes of your own.
You can see it running below this post.

Download: DarkMatter – Quote Of The Day WordPress Plugin
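
The plugin's source isn't reproduced in this post, but the core idea is just a filter on the_content that reads quotes.txt and appends a random line to each post. Below is a minimal sketch of that approach; the function names and markup are illustrative, not copied from the actual plugin.

<?php
/*
Plugin Name: Quote Of The Day (sketch)
Description: Appends a random quote from quotes.txt to the bottom of each post.
*/

function qotd_append_random_quote( $content ) {
    // Only touch single posts on the front end.
    if ( is_admin() || ! is_single() ) {
        return $content;
    }

    // quotes.txt lives next to the plugin file, one quote per line.
    $file = plugin_dir_path( __FILE__ ) . 'quotes.txt';
    if ( ! is_readable( $file ) ) {
        return $content;
    }

    $quotes = file( $file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES );
    if ( empty( $quotes ) ) {
        return $content;
    }

    // Pick one quote at random and append it below the post content.
    $quote = $quotes[ array_rand( $quotes ) ];
    return $content . '<blockquote class="qotd">' . esc_html( $quote ) . '</blockquote>';
}
add_filter( 'the_content', 'qotd_append_random_quote' );

Adding or removing lines in quotes.txt changes the pool the plugin draws from; no other configuration is needed.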

PHP Header Request Spoofing IP Address User Agent Geo-Location

Generate Random HTTP Request

Random HTTP Request Generator – “generator.php”

This generates the header request information to be sent to a destination URL.
For testing purposes only – some files have been excluded.
The destination URL tracks incoming HTTP requests and filters them for “bad data” or
“spoofed requests” such as the requests generated here.
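
generator.php itself is among the excluded files, but the general technique is easy to sketch: pick a random User-Agent and a random IP for the X-Forwarded-For / Client-IP headers, then fire the request at the destination with cURL. The header values and destination URL below are placeholders, not the excluded code.

<?php
// Sketch: send an HTTP request with randomized, spoofed-looking headers.

$userAgents = array(
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36',
    'Mozilla/5.0 (X11; Linux x86_64; rv:121.0) Gecko/20100101 Firefox/121.0',
    'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Safari/605.1.15',
);

// Random public-looking IPv4 address for the forwarded-for style headers.
$fakeIp = mt_rand(1, 223) . '.' . mt_rand(0, 255) . '.' . mt_rand(0, 255) . '.' . mt_rand(1, 254);

$headers = array(
    'User-Agent: ' . $userAgents[array_rand($userAgents)],
    'X-Forwarded-For: ' . $fakeIp,
    'Client-IP: ' . $fakeIp,
    'Accept-Language: en-US,en;q=0.9',
);

$destination = 'http://www.example.com/tracker.php'; // placeholder destination URL

$ch = curl_init($destination);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$response = curl_exec($ch);
curl_close($ch);

echo $response;

Requests built this way are exactly the kind of traffic the destination URL is set up to catch and filter out.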

Server Status Monitor PHP Code

PHP Monitor Active and Inactive Servers

This is a simple, no-frills way to monitor your servers.
You will need to create the file “urls.txt” in the same folder as the “active-servers.php” file.

Examples – urls.txt:
https://www.my-website.com/
http://localhost/
http://www.my-website.com/some/page.php

Navigate to the active-servers.php file (for example: http://www.your-site.com/scripts/active-servers.php).
Online servers will be shown in Lime and offline servers in Red.

active-servers.php

<head>
<title>Active Servers</title>
</head>
<body style="background-color: #0c0c0c;">
<div>
<table width="100%" height="100%">
<tr>
<td>
<table style="margin-bottom: 100%;text-transform: uppercase;">
<?php
$fn = fopen("urls.txt", "r");
while (!feof($fn)) {
$result = trim(fgets($fn));
if ($result == "") { continue; } // skip blank lines
$server = gethostbyname(parse_url($result, PHP_URL_HOST));
// Try port 80 with a short timeout: online servers are Lime links, offline servers are Red.
if ($socket = @fsockopen($server, 80, $errno, $errstr, 5)) {
fclose($socket);
echo (' <tr><td><b><a style="color: Lime; text-decoration: none;" href="'.$result.'" target="viewer">'.$result.'</a></b></td></tr>'."\r\n");
} else {
echo (' <tr><td style="color: #a00000">'.$result.'</td></tr>'."\r\n");
}
} fclose($fn); ?>

</table>
</td>
<td width="100%"><iframe align="right" class="viewpanel" src="" name="viewer" frameborder="0" width="100%" height="600px"></iframe></td>
</tr>
</table>
</div>
</body>

Spoofing Random Toys MySQL WordPress Form Data Fields

Fake Email Generator Create Random Email Addresses From Files

This is just a fun little toy that happened while working on MySQL Automation.
The files used are first_names.txt, last_names.txt and domains.txt.
Reading a random line from each file in order creates the “Fake Email Address”; using array_rand() on each file’s contents, each email address is somewhat unique, as I’m only using 80,000 names (give or take a few hundred).

All Files: fake-email-generator.zip


#!/usr/bin/env php
<?php
// Source files: one entry per line.
$first_names = 'first_names.txt';
$last_names = 'last_names.txt';
$dom = 'domains.txt';

    // Pick a random first name.
    $firstname = file($first_names);
    $first = $firstname[array_rand($firstname)];

        // Pick a random last name.
        $lastname = file($last_names);
        $last = $lastname[array_rand($lastname)];

    // Pick a random domain ending (e.g. ".com").
    $comd = file($dom);
    $com = $comd[array_rand($comd)];

        // Strip whitespace/newlines and lowercase the name parts.
        $first = strtolower(preg_replace('/\s+/', '', $first));
        $last = strtolower(preg_replace('/\s+/', '', $last));
        $com = preg_replace('/\s+/', '', $com);

    // first@last + domain, e.g. jane@smith.com
    echo $first."@".$last.$com;

PHP Shell Bash Website Security Encryption Decryption Hash Encoding Decoding

BashKat Web Scraper

BashKat Web Scraping Utility Script

BashKat is pretty straightforward and really easy to use.
I made sure to add some “cute” to it with the emojis.
This bot will scrape a single URL from user input, or multiple URLs from a file (example: urls.txt), using wget – and it’s super fun when used with Proxychains.


#!/usr/bin/env bash
# BashKat Version 1.0.2
# K0NxT3D

# Variables
BotOptions="Url File Quit"

# Welcome Banner
clear
printf "✨ BashKat 1.0 ✨\nScrape Single URL/IP or Multiple From File.\n\n" && sleep 1

# Bot Options Menu
select option in $BotOptions; do

# Single URL Scrape
   if [ "$option" = "Url" ];
    then
      printf "URL To Scrape: "
       read scrapeurl
     mkdir -p data/
    wget -P data/ \
     -4 \
     -w 0 \
     -t 3 \
     -rkpN -e robots=off \
     --header="Accept: text/html" \
     --user-agent="BashKat/1.0 (BashKat 1.0 Web Scraper Utility +http://www.bashkat.bot/)" \
     --referer="http://www.bashkat.bot" \
     --random-wait \
     --recursive \
     --no-clobber \
     --page-requisites \
     --convert-links \
     --restrict-file-names=windows \
     --domains $scrapeurl \
     --no-parent \
         $scrapeurl

      printf "🏁Scrape Complete.\nHit Enter To Continue.👍"
       read anykey
./$(basename $0) && exit

  elif [ "$option" = "File" ];
   then
      printf "Path To File: "
       read filepath
     while IFS= read -r scrapeurl
      do
     mkdir -p data/
    wget -P data/ \
     -4 \
     -w 0 \
     -t 3 \
     -rkpN -e robots=off \
     --header="Accept: text/html" \
     --user-agent="BashKat/1.0 (BashKat 1.0 Web Scraper Utility +http://www.bashkat.bot/)" \
     --referer="http://www.bashkat.bot" \
     --random-wait \
     --recursive \
     --no-clobber \
     --page-requisites \
     --convert-links \
     --restrict-file-names=windows \
     --domains $scrapeurl \
     --no-parent \
         $scrapeurl 
     done < "$filepath"
      printf "🏁Scrape Complete.\nHit Enter To Continue.👍"
       read anykey
./$(basename $0) && exit

 elif [ "$option" = "Quit" ];
 then
   printf "Quitting🏳"
    sleep 1
     clear
      exit
# ERRORS
  else
   clear
    printf "❌"
    sleep 1
   ./"$(basename "$0")" && exit
  fi
 exit
done

Paparazzi ScreenShot Bot

Paparazzi Screenshot Script (Bot)

Paparazzi: a basic screenshot utility (bot) for collecting screenshots of IP addresses and/or URLs.
This script requires a personal API key; you can get yours here: BrowShotAPI

#!/usr/bin/env bash
# Paparazzi 1.0 - K0NxT3D 2021
# Website ScreenShot Utility
# Bash: ./bot

# Variables
    BotOptions="Url File Quit"
    images="./images"

# Welcome Banner - Create Directories
    mkdir -p $images
    clear
    printf "Paparazzi 1.0 - K0NxT3D\n"
    sleep 1

# Bot Options Menu
    select option in $BotOptions; do

# Single URL/IP Scan
    if [ "$option" = "Url" ];
     then
      printf "IP/URL To Scan: "
    read url
        curl -L "https://api.browshot.com/api/v1/simple?url=$url&key=ENTER YOUR KEY HERE" -o $images/$url.png
      printf "Finished Scanning.\nHit Enter To Continue.."
    read anykey
./bot

# Multiple URL/IP Scan From File (example: urls.txt)
    elif [ "$option" = "File" ];
     then
      printf "Path To File: "
    read filepath
        while IFS= read -r url
        do
          curl -L "https://api.browshot.com/api/v1/simple?url=$url&key=ENTER YOUR KEY HERE" -o $images/$url.png
        done < "$filepath"
    printf "Finished Scanning.\nHit Enter To Continue.."
 read anykey
./bot

# Quitter!!!
    elif [ "$option" = "Quit" ];
     then
        printf "Quitting!"
    sleep 1
clear
exit

# ERRORS
    else
    clear
        printf ""
    sleep 1
   bash ./bot
fi
exit
done