Recursively finding strings in files

Posted by & filed under Linux, Server Admin.

For example, if you wanted to scan all files in the current directory and all subdirectories for any calls to base64_decode, you could do something like this:

find . -type f -exec grep -A 2 -B 2 -H -i -n "base64_decode" {} + > resultb64.txt

Find all files, then execute grep on them, printing matching lines (with two lines of context before and after), filenames, and line numbers; finally, write the output to resultb64.txt.
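With GNU grep, -r makes grep recurse on its own, so the same scan can be sketched without find at all. A minimal demo in a throwaway directory (the file names here are invented for illustration):

```shell
# Throwaway sandbox; evil.php is a made-up example file
tmp=$(mktemp -d); out=$(mktemp)
mkdir -p "$tmp/sub"
echo 'eval(base64_decode("x"));' > "$tmp/sub/evil.php"
# -r recurses by itself; -H/-n/-i/-A/-B behave exactly as above
grep -r -A 2 -B 2 -H -i -n "base64_decode" "$tmp" > "$out"
```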

Another twist on this is to filter the filetypes a bit. Note the parentheses: without them, find's implicit -and binds tighter than -or, so the -exec would only run on the *.js files:

find . \( -name "*.html" -o -name "*.php" -o -name "*.js" \) -exec grep -A 2 -B 2 -H -i -n "base64_decode" {} + > resultb64.txt

Lastly, if we wanted to find and replace (with nothing) a string, again grouping the -name tests in parentheses:

find ./ \( -name "*.html" -o -name "*.php" \) -exec sed -i 's#STRING TO FIND##g' '{}' \;
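An in-place sed across a whole tree is hard to undo, so it can be worth giving GNU sed's -i a suffix to keep backups. A sketch using a made-up file, with the same placeholder string:

```shell
# Sandbox demo; page.html is an invented example file
tmp=$(mktemp -d)
echo 'hello STRING TO FIND world' > "$tmp/page.html"
# -i.bak edits in place but leaves page.html.bak behind as a backup
find "$tmp" \( -name "*.html" -o -name "*.php" \) -exec sed -i.bak 's#STRING TO FIND##g' '{}' \;
```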

Linux: Counting the number of lines inside multiple files

Posted by & filed under BASH, Programming.

Recently I needed to recursively count the number of lines of code in files of a specific type. In this instance I wanted to count the number of lines of code in my PHP files. The command below worked flawlessly. In addition to breaking down the line count per file, it gives an overall total at the end as well (one total per xargs batch, if the file list is very long).

find . -name '*.php' | xargs wc -l
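One caveat: the plain pipe splits file names on whitespace, so paths containing spaces get miscounted. A NUL-delimited variant (GNU find and xargs) avoids that; a quick sketch in a sandbox:

```shell
# Sandbox with a deliberately awkward file name
tmp=$(mktemp -d); out=$(mktemp)
printf 'line1\nline2\n' > "$tmp/my file.php"
# NUL-separated names survive the pipe even with spaces in paths
find "$tmp" -name '*.php' -print0 | xargs -0 wc -l > "$out"
```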

Linux: Find files greater than n size

Posted by & filed under Linux, Server Admin.

Recently I had an issue where I needed to clean up some disk utilization on a Linux server. In order to find a list of larger files, I used the following find command from the root directory I wanted to recurse through:

find . -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
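A related sketch that sorts the matches so the biggest files come first (this assumes GNU du and sort for the -h and -rh options):

```shell
# Sandbox with one file over the 50,000 KiB threshold and one under it
tmp=$(mktemp -d); out=$(mktemp)
head -c 60000000 /dev/zero > "$tmp/big.bin"    # ~60 MB, over the threshold
head -c 1000 /dev/zero > "$tmp/small.bin"      # well under it
# du -h prints human-readable sizes; sort -rh orders them largest first
find "$tmp" -type f -size +50000k -exec du -h {} + | sort -rh > "$out"
```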

As you can see, the -size switch is setting the minimum size to find at roughly 50 MB (50,000 KiB). Another issue that I ran into was deleting a large number of files at once using something like:

rm -rf /var/my/path/*

“Argument list too long” was the error. The shell expands the * into more arguments than the kernel allows on a single command line (the ARG_MAX limit), so rm never even starts. I found that there are a variety of methods to solve this, from using loops to split the files into smaller groups, to recompiling the kernel. One of the simplest is to use the find command to delete the files it finds:

find /var/my/path/ -type f -delete

(-type f restricts the deletion to regular files, so the directories themselves are left alone.)

The list of files to get deleted can also be tuned so it does not delete all the files in the path:

find /var/my/path/ -name "filename*" -delete
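If your find lacks -delete, piping NUL-separated names into xargs sidesteps the argument-list limit the same way; a sketch with invented file names:

```shell
# Sandbox: two files matching the pattern, one that should survive
tmp=$(mktemp -d)
touch "$tmp/filename1" "$tmp/filename2" "$tmp/keepme"
# xargs splits the NUL-separated list into as many rm calls as needed
find "$tmp" -name "filename*" -print0 | xargs -0 rm -f
```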

Linux: Find base64_decode in files

Posted by & filed under Linux.

Bad hax0rs! base64_decode is used by hackers frequently when they hijack a site to obfuscate their malicious code. This quick BASH one-liner will find files containing this evil function and lists them out:

find . -name '*.php' | while read -r FILE; do if grep -q 'eval(base64_decode' "$FILE"; then echo "$FILE" >> infectedfiles; else echo "$FILE" >> notinfected; fi; done
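With GNU grep, the same split can be sketched more compactly: -l lists only matching file names, -L lists the non-matching ones, and --include filters by pattern. A demo with made-up files:

```shell
# Sandbox: one "infected" file, one clean one
tmp=$(mktemp -d)
echo 'eval(base64_decode("payload"));' > "$tmp/hacked.php"
echo '<?php echo "clean";' > "$tmp/clean.php"
# -rl lists matching file names only; -rL lists the non-matching ones
grep -rl 'eval(base64_decode' --include='*.php' "$tmp" > "$tmp/infectedfiles"
grep -rL 'eval(base64_decode' --include='*.php' "$tmp" > "$tmp/notinfected"
```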

Linux: Find files modified between dates

Posted by & filed under BASH, Linux.

I found a handy technique to find files modified between two specific dates. In essence, we touch two temp files, setting their modified dates to the bounds of the range we want to find:

touch -t 201104141130 temp
touch -t 201104261630 ntemp

Note the timestamp format is yyyymmddhhmm.

Then we run the find command:

find /path/to/folder/ -cnewer temp -and ! -cnewer ntemp
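GNU find can also take the timestamps inline via -newermt, skipping the temp files entirely (note this compares modification time, where -cnewer compares change time); a sketch in a sandbox:

```shell
# Sandbox: one file inside the date window, one before it
tmp=$(mktemp -d); out=$(mktemp)
touch -t 201104201200 "$tmp/inrange"   # falls inside the window
touch -t 201101011200 "$tmp/toolold"   # before it
# -newermt takes a date string directly; ! negates the upper bound
find "$tmp" -type f -newermt "2011-04-14 11:30" ! -newermt "2011-04-26 16:30" > "$out"
```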


BASH: Recursively finding files

Posted by & filed under Linux.

I recently wanted to locate a specific file recursively within my directory tree. A simple way to do this is:

find . -name .htaccess -print

In this example, we would be searching for a .htaccess file, but this could be changed to any file name, or even a wildcard pattern like '*.php' (quote it so the shell doesn't expand the wildcard before find sees it).
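A quick sketch of the wildcard form in a throwaway directory (the file names are invented):

```shell
# Sandbox with a nested PHP file and a .htaccess
tmp=$(mktemp -d); out=$(mktemp)
mkdir -p "$tmp/deep"
touch "$tmp/deep/index.php" "$tmp/deep/.htaccess"
# the quotes keep the shell from expanding *.php before find sees it
find "$tmp" -name '*.php' -print > "$out"
```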