Exim – Find a list of the most commonly used scripts

Filed under Server Admin.

I had to deal with a malicious script today that had been inserted into a website and was sending out spam. Typically I have a few tools I run, but I couldn't locate this particular infection, so it was time to take another angle: the Exim logs. The following one-liner shows the most frequently used working directories of scripts that sent mail, which narrows down the suspects substantially.

grep cwd /var/log/exim_mainlog | grep -v /var/spool | awk -F"cwd=" '{print $2}' | awk '{print $1}' | sort | uniq -c | sort -n
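That pipeline is easier to digest spread across lines. Here is a commented version wrapped in a small function (the function name is my own) so it can be pointed at a copy of the log; /var/log/exim_mainlog is the usual cPanel location:

```shell
# Annotated version of the one-liner above.
# usage: top_script_dirs /var/log/exim_mainlog
top_script_dirs() {
    grep cwd "$1" |                 # keep lines that record a working directory
        grep -v /var/spool |        # ignore Exim's own spool activity
        awk -F"cwd=" '{print $2}' | # strip everything before cwd=
        awk '{print $1}' |          # keep just the directory path
        sort | uniq -c |            # count occurrences per directory
        sort -n                     # ascending: the busiest suspects end up last
}
```

The directories at the bottom of the output, with the highest counts, are where to start looking for the spam script.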

Linux — Finding top n large files

Filed under Linux, Server Admin.

As a follow-up to my previous note, here is an additional one-liner that is extremely helpful.

du -a /path | sort -n -r | head -n 10

Obviously, you can adjust the -n parameter of the head command to return, say, the top 20 instead.
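If your coreutils are recent enough, sort -h understands the size suffixes that du -h emits, which makes the listing far easier to read. A small sketch, assuming GNU coreutils (the function name is my own):

```shell
# Top-N largest files and directories under a path, human-readable sizes.
# Assumes GNU coreutils: du -h emits suffixes, sort -h orders 4.0K < 3M < 1G.
largest() {   # usage: largest /path [count]
    du -ah "$1" | sort -rh | head -n "${2:-10}"
}
```

For example, `largest /var 20` lists the 20 biggest entries under /var.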

Quickly renaming multiple file extensions with bash

Filed under Uncategorized.

I needed to quickly rename a bunch of file extensions in a directory. This one-liner made quick work of it:

for old in *.JPG; do mv "$old" "$(basename "$old" .JPG).jpg"; done

basename, when given a file name and an extension, strips the extension and prints what's left:

user@server$ basename derp.txt .txt
derp

We then tack the desired extension (.jpg in the example) onto that output, and we're done.
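As an aside, bash parameter expansion can strip the suffix without calling basename at all. A sketch of the same idea as a reusable function (the function name is my own):

```shell
# Rename every *.OLDEXT file in the current directory to *.NEWEXT.
# ${old%.$1} strips the old suffix via parameter expansion; mv avoids
# leaving the originals behind the way cp would.
rename_ext() {   # usage: rename_ext JPG jpg
    for old in *."$1"; do
        [ -e "$old" ] || continue          # nothing matched the glob
        mv -- "$old" "${old%."$1"}.$2"
    done
}
```

Quoting "$old" also keeps the loop safe for file names containing spaces, which the original one-liner was not.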

Linux: Find files greater than n size

Filed under Linux, Server Admin.

Recently I had an issue where I needed to clean up some disk utilization on a Linux server. To find the larger files, I used the following find command from the root directory I wanted to recurse through:

find . -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'

As you can see, the -size switch sets the minimum size to match at roughly 50 MB (50000k). Another issue I ran into was deleting a large number of files at once using something like:

rm -rf /var/my/path/*

“Argument list too long” was the error. The shell expands the * glob before calling rm, and the resulting argument list exceeds the kernel's limit. I found that there are a variety of methods to solve this, from using loops to split the files into smaller groups, to recompiling the kernel. One of the simplest is to let the find command delete the files it finds:

find /var/my/path/ -type f -delete

The match can also be tightened with a name pattern so it does not delete every file in the path:

find /var/my/path/ -name "filename*" -delete
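Another common workaround is to hand find's results to rm in batches via xargs, which splits the list into as many rm invocations as needed and so never hits the argument-list limit. A sketch, assuming GNU findutils (-print0/-0 for NUL delimiters, -r to skip an empty list; the function name is my own):

```shell
# Delete files matching NAME_PATTERN under PATH without tripping the
# "Argument list too long" error: xargs batches the rm calls.
batch_rm() {   # usage: batch_rm /var/my/path 'filename*'
    find "$1" -type f -name "$2" -print0 | xargs -0 -r rm -f
}
```

The NUL delimiters keep file names containing spaces or newlines safe.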

PlayTerm — Watch the linux guru at work

Filed under Linux.

PLAYTERM is intended to raise the skills of terminal CLI users, let them share those skills, and inspire others.
PLAYTERM wants to push forward a new way of education, because terminal sessions are language-independent, extremely educative & entertaining.

Basically, PlayTerm lets you ‘look over’ a guru’s shoulder, helping you learn the concepts shown in each video.

It may sound strange, but even though CLI stuff sounds like an isolated environment, it is an extremely social playground. Since the eighties, billions of users have helped each other improve their skills, to get things done faster.
However, there was never a playground for sharing this live, only peeking over the shoulder of your neighbour (or a *N*X screen -x session).

PLAYTERM wants to restore the actual ‘live’ feeling once established in the BBS scene, where a system operator could ‘take over’ a user's session and show him the way around, or teach him a new programming language.


Linux: Find base64_decode in files

Filed under Linux.

Bad hax0rs! base64_decode is frequently used by hackers to obfuscate their malicious code when they hijack a site. This quick BASH one-liner finds files containing this evil function and lists them out:

find . -name '*.php' | while read -r FILE; do if grep -q 'eval(base64_decode' "$FILE"; then echo "$FILE" >> infectedfiles; else echo "$FILE" >> notinfected; fi; done
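With GNU grep, the whole scan fits in a single command: -r recurses, -l prints each matching file name once, and --include (a GNU extension) limits the search to PHP files. A sketch (the function name is my own):

```shell
# List PHP files under a path that contain the eval(base64_decode signature.
# Assumes GNU grep for --include; the parenthesis is literal in a basic regex.
find_infected() {   # usage: find_infected /path/to/docroot
    grep -rl --include='*.php' 'eval(base64_decode' "${1:-.}"
}
```

Redirect the output to a file (`find_infected . > infectedfiles`) to get the same list the one-liner produces.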

Linux: Find files modified between dates

Filed under BASH, Linux.

I found a handy technique to find files modified between two specific dates. In essence, we touch two temp files, setting their modification times to the bounds of the range we want to find:

touch temp -t 201104141130
touch ntemp -t 201104261630

Note the timestamp format is yyyymmddhhmm (touch -t takes [[CC]YY]MMDDhhmm[.ss]).

Then we run the find command:

find /path/to/folder/ -cnewer temp -and ! -cnewer ntemp
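GNU find can compare against a date string directly, which skips the temp files entirely. Note that -newermt tests the modification time, whereas -cnewer above tests the inode change time. A sketch, assuming GNU findutils (the function name is my own):

```shell
# Files under PATH modified between two dates, using GNU find's -newermt,
# which accepts a date string instead of a reference file.
modified_between() {   # usage: modified_between /path '2011-04-14 11:30' '2011-04-26 16:30'
    find "$1" -type f -newermt "$2" ! -newermt "$3"
}
```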


BASH: Recursively finding files

Filed under Linux.

I recently wanted to locate a specific file recursively within my directory tree. A simple way to do this is:

find . -name .htaccess -print

In this example we are searching for a .htaccess file, but this could be changed to any file name, or even a wildcard pattern like *.php.
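One detail worth remembering with the wildcard form: quote the pattern so the shell passes it to find unexpanded instead of globbing it against the current directory. A small wrapper (the function name is my own) to illustrate:

```shell
# Recursively list files under PATH matching a shell-style name pattern.
# The caller must quote the pattern ('*.php'), or the shell expands it first.
find_by_pattern() {   # usage: find_by_pattern /path '*.php'
    find "$1" -name "$2" -print
}
```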

Monitoring a Linux process by PID and sending an e-mail notification upon failure

Filed under BASH, Email, Linux, Programming.

One of my clients needs their vendor to be alerted when their Backup Exec service crashes. I wrote the following quick-n-dirty shell script, intended to be run as a cron job.

#!/bin/bash
# Process Monitor
# Send e-mail alerts when service goes down
# -------------------------------------------------------------------------
# Author: Nathan Riley
# -------------------------------------------------------------------------
SUBJECT="Backup Exec Agent Failure"
EMAIL="vendor@example.com"            # alert recipient (placeholder address)
EMAILMESSAGE="/tmp/emailmessage.txt"  # scratch file for the message body

# path to pgrep command
PGREP="/usr/bin/pgrep"

# Daemon name (beremote is the Backup Exec remote agent; adjust as needed)
DAEMON="beremote"

# find daemon pid
$PGREP "$DAEMON" > /dev/null

if [ $? -ne 0 ]; then # if daemon not running
    # Generate email message body
    echo "This is servername at location. The Backup Exec service is no longer running." > "$EMAILMESSAGE"
    # send email alert
    /usr/bin/mail -s "$SUBJECT" "$EMAIL" < "$EMAILMESSAGE"
fi

And the cron line would be something like this:

*/5 * * * * root /root/restart.sh  >/dev/null 2>&1