For example, if you wanted to scan all files in the current directory and all subdirectories for any calls to base64_decode, you could do something like this:
Find all files, then execute grep on each of them, printing matching lines with filenames and line numbers, and finally write the output to resultb64.txt.
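Something along these lines does the job (assuming GNU find and grep):

```shell
# scan every file under the current directory for base64_decode calls,
# printing filename and line number (-H and -n) for each match
find . -type f -exec grep -Hn "base64_decode" {} \; > resultb64.txt
```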
Another twist on this is to filter the filetypes a bit:
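For instance, restricting the scan to PHP files (the extension here is just an example):

```shell
# same scan, but only inside .php files
find . -type f -name "*.php" -exec grep -Hn "base64_decode" {} \; > resultb64.txt
```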
Lastly, if we wanted to find and replace (with nothing) a string:
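A sketch using GNU sed's in-place flag; "badstring" is a placeholder for whatever you want stripped, and it's worth taking a backup first since this edits the files directly:

```shell
# remove every occurrence of the placeholder string from all .php files
find . -type f -name "*.php" -exec sed -i 's/badstring//g' {} \;
```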
Needed to audit some Apache logs, installed Scalp, grabbed the XML, and it promptly puked:
Seems there is some issue with the regex in the XML file. I found this handy thread which outlines the fixes: code.google.com/p/apache-scalp/issues/de… and another person posted an XML with all the fixes: pastebin.com/uDziqcD5
Backup of the XML is below just in case pastebin goes down: default_filter (rename to .xml)
Folder /public_html looks like this:
I need to clone all the files and folders (with a couple of exceptions) in this directory into the /public_html/dev folder. We need to exclude the dev/ folder as it is the destination, and also want to exclude the dev2/ folder.
Rsync makes this easy:
In my scenario, something like the following gets the job done:
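The command ends up looking roughly like this (paths per the layout above; the trailing slashes matter to rsync):

```shell
# clone /public_html into /public_html/dev, skipping dev/ and dev2/
rsync -av --exclude 'dev' --exclude 'dev2' /public_html/ /public_html/dev/
```

The --exclude patterns are matched relative to the source directory, so 'dev' both skips the destination and prevents rsync from recursing into it.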
Here’s a nice one liner to generate a private key and csr:
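Something like this (the key size, file names, and subject are placeholders; drop -subj to be prompted for the details interactively):

```shell
# generate a 2048-bit RSA key and CSR in one shot, no passphrase (-nodes)
openssl req -new -newkey rsa:2048 -nodes \
  -subj "/CN=example.com" \
  -keyout example.com.key -out example.com.csr
```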
Generates the key and the csr in one shot.
I came across this today and had to share. The latest version of Process Explorer has native integration with VirusTotal. This means you can have Process Explorer analyze the running processes and compare them against the VirusTotal database.
- Select options -> Check VirusTotal.com to initiate a scan of the processes. The VirusTotal column will populate with scores for the process.
- Click a score to be taken to the detailed results on VirusTotal.
How It Works:
- Creates a SHA-256 hash of the file.
- Submits the hash to VirusTotal.
- The hash is looked up in VirusTotal’s database, and the results are displayed in Process Explorer.
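The first step is easy to reproduce by hand; for example:

```shell
# the kind of SHA-256 hash Process Explorer submits (the file is arbitrary)
sha256sum /bin/ls
```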
I have two drives, da3 and da4, which are showing multipath errors. In order to find the physical drives:
Drive serials can also be obtained via:
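The exact commands didn't survive here, but on a box with FreeBSD-style da* device names something like this is typical:

```shell
# map da3/da4 to physical devices, then read a serial number
camcontrol devlist
smartctl -i /dev/da3 | grep -i 'serial'
```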
As a follow-up to my previous note, I am adding an additional one-liner that is extremely helpful.
Obviously, you can adjust the -n param in the head command to return the top 20 for example.
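The one-liner itself isn't shown here, so as a purely hypothetical stand-in, a disk-usage pipeline with the same shape:

```shell
# hypothetical example: ten largest entries in the current directory;
# change head -n 10 to head -n 20 for the top 20
du -sk -- * | sort -rn | head -n 10
```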
Backing up the account
– Log in via SSH as the root user
– Run the pkgacct command to package the account into a tar file:
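For example, for an account named myuser (a placeholder):

```shell
# package the account into a cpmove tarball
/scripts/pkgacct myuser
```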
The account backup will be created in the current working directory.
Restoring the account
– Log in via SSH as the root user
– Run the restorepkg command to restore the cpbackup archive. Make sure you are in the same directory as the backup file:
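For example (the filename is a placeholder):

```shell
# restore the account from the cpmove tarball
/scripts/restorepkg cpmove-myuser.tar.gz
```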
If the account already exists, you may need to force the restore:
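That looks something like:

```shell
# overwrite the existing account
/scripts/restorepkg --force cpmove-myuser.tar.gz
```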
There are also options to specify the account’s IP address, etc.
Today I needed to begin migrating databases from my live server to the new dedicated database server. The first step is to identify the tables to be moved.
Log in to the server as normal.
Then list out the tables to find the one(s) we need to move:
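For example (credentials are placeholders):

```shell
# list the databases on this server
mysql -u root -p -e 'SHOW DATABASES;'
```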
For this example, we will move the “test” database. The next step is to create the new empty test database on the remote server:
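Something like (the remote hostname and credentials are placeholders):

```shell
# create the empty database on the new server
mysql -h db2.example.com -u root -p -e 'CREATE DATABASE test;'
```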
The next step is to use the mysqldump command to extract the contents of the current test database and pipe it to the new server database:
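A sketch of the dump-and-pipe (hostname and credentials are placeholders; in practice put the passwords in ~/.my.cnf, since both ends of the pipe will want one):

```shell
# dump the local test database straight into the remote copy
mysqldump -u root -p test | mysql -h db2.example.com -u root -p test
```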
Once this command completes, the database will be on both servers.
I typically use the dd utility on Linux to create large files for testing various things. Today I needed to create a large file on a Windows machine. One option is to use dd on a Linux box to create the file and then scp it over to the Windows box. This would work, but is a bit inefficient. After a bit of research I found that fsutil.exe will create files. Below is the same command in dd and fsutil:
Syntax is: fsutil file createnew <filename> <length in bytes>
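Side by side, for a 1 GiB file:

```shell
# Linux: write a 1 GiB file of zeros
dd if=/dev/zero of=testfile.bin bs=1M count=1024
# Windows equivalent (cmd.exe):
#   fsutil file createnew testfile.bin 1073741824
```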
IOPS = (MBps Throughput / KB per IO) * 1024
MBps = (IOPS * KB per IO) /1024
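A quick sanity check of both directions:

```shell
# 100 MBps at 8 KB per IO: (100 / 8) * 1024 = 12800 IOPS
echo $(( 100 * 1024 / 8 ))
# and back: (12800 * 8) / 1024 = 100 MBps
echo $(( 12800 * 8 / 1024 ))
```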
UPDATE 11/17/15: Another nice command to auto purge old kernels is: sudo apt-get autoremove
Also, removing old kernels is easy with sudo dpkg -r linux-image-3.2.0-83-generic
Recently I wanted to install a new package on an Ubuntu server. Typically this is as simple as issuing an apt-get install:
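That is, nothing more than (the package name is a placeholder):

```shell
# any ordinary package install
sudo apt-get install somepackage
```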
But this time around, I got an interesting error:
I started poking around and found that the /boot partition is full:
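For example:

```shell
# filesystem usage; the /boot line was at 100% in this case
df -h
```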
Ok, that is starting to make a bit more sense now… so we need to purge old kernel packages to free up space on the /boot partition. The first step is to identify the kernel version we are currently on so we do not delete that. Secondly, it was recommended to me that you keep the oldest kernel as it was the one the system was installed with. We can see the kernel version with:
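The running kernel is reported by:

```shell
# currently running kernel release, e.g. 3.0.0-25-server
uname -r
```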
So we are on 3.0.0-25-server and need to make sure not to delete that. A handy command to get a list of all the kernels you are not using is:
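A common variant of that command (assuming Debian/Ubuntu's dpkg) is:

```shell
# installed kernel image packages, minus the one currently running
dpkg -l 'linux-image-*' | awk '/^ii/ { print $2 }' | grep -v "$(uname -r)"
```

Eyeball the list before acting on it, since meta-packages can appear here too.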
I attempted to remove the old kernels the “nice” way, by letting apt handle the removal:
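Something along the lines of:

```shell
# purge one of the unused kernels via apt (the version is illustrative)
sudo apt-get purge linux-image-3.0.0-16-server
```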
But it failed. Following the above instructions to run sudo apt-get -f install, it failed saying that there was not enough disk space on /boot (duh!). So much for being nice.
Inside the /boot partition there are several types of files: abi-kernel-version, config-kernel-version, initrd.img-kernel-version, System.map-kernel-version, vmcoreinfo-kernel-version, and vmlinuz-kernel-version. There will be one of each for every kernel version you have installed, for example vmlinuz-3.0.0-28-server. Leaving the earliest kernel version and the version I am running (reported by uname -a), I moved the other kernel files off to another location where there was ample space. It looked something like this:
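The move itself was roughly this (the destination is a placeholder; any partition with room works):

```shell
# relocate kernel 3.0.0-16's files out of /boot
mkdir -p /root/boot-archive
mv /boot/*-3.0.0-16-server /root/boot-archive/
```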
As you can see this is moving the files for kernel version 3.0.0-16 off the /boot partition.
If the boot partition just needed a bit of space freed up, you can now likely use apt-get to purge the other kernels “cleanly”. What I mean by that is apt-get also removes the kernel files from /lib/modules. You could do this by hand as well. I am not sure if it does anything beyond cleaning up /boot and /lib/modules, but I do not believe it does.
Ran into an issue where I wanted to do a mysqldump of a database in order to transfer it to a new server.
This failed saying that three of the tables were corrupted. I ran the mysqlcheck utility to see if it could be repaired:
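Something like (the database name is a placeholder):

```shell
# check every table in the database for corruption
mysqlcheck -c -u root -p dbname
```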
It outputted the following errors among checking the rest of the tables successfully:
I strongly suspected that these tables were remnants of an old software version or something along those lines.
I tried to re-run the command, telling it to repair the tables. It kicked out the same errors about the tables not being found.
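That attempt was roughly:

```shell
# repair instead of just check
mysqlcheck -r -u root -p dbname
```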
I went ahead and issued a DROP statement for each of the three tables, as I suspected they were unused and left over from a previous upgrade.
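Each drop was along these lines (the table name is a placeholder):

```shell
mysql -u root -p dbname -e 'DROP TABLE bad_table;'
```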
After each of the drop statements, MySQL reported an error that it was unable to delete the table because it could not be found. I re-ran mysqlcheck and found that the tables had in fact been removed, and it reported no issues. I was then able to re-run my mysqldump command and finish extracting the database.
Sometimes a situation pops up where I need to retrieve the contents of a DNS zone but do not have access to the DNS servers. This is typically a situation where a client’s IT person is no longer available and before moving the name servers, we need to recreate the DNS zone on our name servers. If the current host of the DNS zone cannot release the records, we have a few options.
1. Try a zone transfer. I previously wrote about this. It is highly unlikely to work, but if the DNS server is poorly configured it is a possibility; it works rarely, but when it does it is the most accurate option.
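The attempt is a single dig command (the zone and server names are placeholders):

```shell
# request a full zone transfer from the authoritative server
dig axfr example.com @ns1.example.com
```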
2. Brute force the zone. It sounds bad, but the reality is that most sysadmins don’t log or throttle DNS requests, so with a decent enough dictionary of words it is possible to enumerate a large majority of the DNS zone. I have mirrored the zip file containing bfdomain and the dictionaries here. (original source)
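The idea reduces to a simple loop; a minimal sketch (wordlist.txt and example.com are placeholders; the bfdomain script does this with more polish):

```shell
# try each dictionary word as a hostname and report the ones that resolve
while read -r word; do
  ip=$(dig +short A "$word.example.com" | head -n 1)
  [ -n "$ip" ] && echo "$word.example.com $ip"
done < wordlist.txt
```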
UPDATE: One other thing I noticed later is that this seemingly only captures A records, so things like the MX record would not be tried. The Python script could easily be modified to add this functionality. Also, the nmap version of this may already do it.
More dictionaries and wordlists: