Tunnelling SSH/SCP through intermediate host when two hosts can’t directly communicate

Posted by & filed under Linux.

Scenario:

We need to scp a file between two hosts. The problem is that the two hosts (A & C) cannot communicate directly. We can solve this using an SSH tunnel and an intermediate host (B) that can communicate with both. This also means the command on host B needs to run first, then the scp command on host A:


Host A (source)

This will scp to localhost on port 3000, which is actually our tunnel to host C; /destination_file is the path on host C.
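Run on host A, it looks something like this (the user name and source path are placeholders; the user account is one on host C, since that is where the tunnel terminates):

scp -P 3000 /path/to/source_file user@localhost:/destination_file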

Host B (intermediate)
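Run on host B first, a reverse forward along these lines (user names and host names are placeholders for your environment). It makes port 3000 on host A tunnel through B to port 22 on host C:

ssh -R 3000:hostC:22 user@hostA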

Host C (destination)
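Nothing needs to run on host C itself; it only has to accept SSH connections coming from host B.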


Also, if you have spaces in the paths, make sure to escape the space with \ e.g.
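Something like this, with hypothetical paths:

scp -P 3000 /my\ folder/source\ file user@localhost:/destination_file

Keep in mind the remote side of an scp path is also interpreted by the remote shell, so remote paths with spaces may need quoting on top of the backslash.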


Linux – Unable to boot due to missing drive in fstab

Posted by & filed under Linux, Server Admin.

I had an old server I brought up and it was unable to complete its boot due to a missing drive in fstab. Editing the fstab in recovery mode is not an option since the filesystem gets flagged as read-only.

In order to make the FS writable, and therefore be able to successfully edit the fstab, the following command will remount the FS in read/write mode:
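mount -o remount,rw /

This remounts the root filesystem read/write so fstab can be edited and saved.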


Slow DNS Resolution on Ubuntu Linux Server 14.04 LTS

Posted by & filed under Linux, Server Admin.

This all started with WordPress timeouts. I was trying to activate some premium plugins, and the license activation was timing out. I started doing some digging and found they use the WordPress core library WP_Http, which in turn uses curl to make the request. I wrote my own code to use WP_Http and it failed in the same way with a timeout. I added a timeout parameter to the wp_remote_get() call, and it was able to complete without timing out. I then used an IP address in place of the domain name and it worked without the need for the timeout parameter.

With that info in hand, I decided it must be on the server. I started doing some tests:
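The tests went roughly along these lines, with example.com and the address below standing in for the real site:

time curl -sS -o /dev/null http://example.com/
# the name-based request stalled for about 5 seconds
time curl -sS -o /dev/null http://203.0.113.10/
# going straight at an IP returned immediately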

I then did the same test from another server that uses the same DNS servers in resolv.conf; it resolved immediately, which pointed at this particular host rather than at the nameservers themselves.

After much googling, I found a number of suggested solutions:

  • Disable IPv6
  • Ensure /etc/nsswitch.conf is set correctly (hosts: files dns)

Neither of these worked for me. Finally, I added the following directive into my resolv.conf and it fixed the issue!
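The directive, added to /etc/resolv.conf:

options single-request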

Apparently, this is actually somewhat related to IPv6. From the resolv.conf manpage:
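single-request (since glibc 2.10)
    Sets RES_SNGLKUP in _res.options. By default, glibc performs IPv4 and IPv6 lookups in parallel since version 2.9. Some appliance DNS servers cannot handle these queries properly and make the requests time out. This option disables the behavior and makes glibc perform the IPv6 and IPv4 requests sequentially (at the cost of some slowdown of the resolving process).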

Now, I get good response times when I curl:
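Again with the stand-in domain:

time curl -sS -o /dev/null http://example.com/
# completes right away, no more 5 second stall on the lookup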

It looks like the resolver sends parallel requests, fails to see the IPv6 response, waits 5 seconds, and then sends sequential requests because it thinks the nameserver is broken. Adding options single-request makes glibc send the requests sequentially from the start, so nothing times out.

I found some good info and hints on this issue here: https://bbs.archlinux.org/viewtopic.php?id=75770

Lastly, to bring this whole thing full circle, the WordPress plugins are now able to get out and communicate successfully. Woohoo!

OpenSSL – Extracting a crt and key from a pkcs12 file

Posted by & filed under Linux, Server Admin.

Quick and dirty way to pull out the key and crt from a pkcs12 file:
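Something along these lines, where bundle.p12 is a placeholder for your pkcs12 file:

openssl pkcs12 -in bundle.p12 -nocerts -out site.key
openssl pkcs12 -in bundle.p12 -clcerts -nokeys -out site.crt

The first command pulls out the private key, the second the certificate; both will prompt for the import password.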

If you are using this for Apache and need to strip the passphrase from the private key so Apache does not ask for it each time it starts:
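openssl rsa -in site.key -out site-nopass.key

This prompts for the passphrase once and writes an unencrypted copy of the key; keep the file permissions on it tight.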

Recursively finding strings in files

Posted by & filed under Linux, Server Admin.

For example, if you wanted to scan all files in the current directory, and all sub directories for any calls to base64_decode, you could do something like this:
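Something like:

find . -type f -exec grep -Hn "base64_decode" {} \; > resultb64.txt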

Find all files, then execute grep on them, printing out matching lines, filenames, and line numbers; finally, write the output to resultb64.txt.

Another twist on this is to filter the filetypes a bit:
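For example, limiting the search to PHP files (the *.php pattern is just an example):

find . -type f -name "*.php" -exec grep -Hn "base64_decode" {} \; > resultb64.txt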

Lastly, if we wanted to find and replace (with nothing) a string:
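One way is sed's in-place editing (GNU sed; the string and file pattern are examples, so test on a copy first):

find . -type f -name "*.php" -exec sed -i 's/base64_decode//g' {} \;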

BASH: Copy files recursively, excluding directories

Posted by & filed under Linux, Server Admin.

Scenario:

Folder /public_html looks like this:
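For illustration, assume something like this; the actual file names are made up, only the two subfolders matter:

/public_html/index.php
/public_html/wp-content/
/public_html/dev/   (the destination)
/public_html/dev2/  (also to be excluded)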

I need to clone all the files and folders (with a couple of exceptions) in this directory into the /public_html/dev folder. We need to exclude the dev/ folder, as it is the destination, and we also want to exclude the dev2/ folder.

Rsync makes this easy:
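The general form (paths and exclude patterns are placeholders):

rsync -av --exclude 'dir_to_skip' /source/path/ /destination/path/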

In my scenario, something like the following gets the job done:
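rsync -av --exclude 'dev' --exclude 'dev2' /public_html/ /public_html/dev/

The exclude patterns are relative to the source directory, and excluding dev keeps rsync from copying the destination into itself.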


Finding failed multipath issues in FreeNAS

Posted by & filed under Linux, Server Admin.

I have two drives, da3 and da4, which are showing multipath errors. In order to find the physical drives:
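Something along these lines from the FreeNAS shell (FreeBSD underneath; da3 is the example device):

gmultipath status
smartctl -i /dev/da3

gmultipath status shows which disks belong to which multipath and their state, and smartctl -i prints the drive's identity block, including the serial number, so the physical drive can be located in the chassis.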

Drive serials can also be obtained via:
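For example, again assuming da3:

camcontrol inquiry da3 -S
geom disk list da3

The first prints just the serial; in the second, the ident field is the serial.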


Linux — Finding top n large files

Posted by & filed under Linux, Server Admin.

As a follow-up to my previous note, I am adding an additional one-liner that is extremely helpful.
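One way to write it, with the starting path as a placeholder:

du -a /var | sort -rn | head -n 10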

Obviously, you can adjust the -n param in the head command to return, for example, the top 20.

Enumerating a DNS zone by brute force

Posted by & filed under Linux, Server Admin.

Sometimes a situation pops up where I need to retrieve the contents of a DNS zone but do not have access to the DNS servers. This is typically a situation where a client’s IT person is no longer available and before moving the name servers, we need to recreate the DNS zone on our name servers. If the current host of the DNS zone cannot release the records, we have a few options.

1. Try a zone transfer. I previously wrote about this. It is highly unlikely to work, but if the DNS server is poorly configured it's a possibility, and when it does work it is the most accurate option.
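An attempt looks something like this (domain and nameserver are placeholders):

dig axfr example.com @ns1.example.com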

2. Brute force the zone. It sounds bad, but the reality is that most sysadmins don't log or throttle DNS requests, so with a decent dictionary of words it is possible to enumerate a large majority of the DNS zone. I have mirrored the zip file containing bfdomain and the dictionaries here. (original source)

UPDATE: One other thing I noticed later on is that this seemingly only captures A records, so things like the MX record would not be tried. The Python script could easily be modified to add this functionality. Also, the nmap version of this, shown below, may already do it.
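For reference, the nmap flavor is along these lines (example.com being a placeholder):

nmap --script dns-brute example.com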

More dictionaries and wordlists:
packetstormsecurity.org/Crackers/wordlis…
www.cotse.com/tools/wordlists1.htm
www.outpost9.com/files/WordLists.html
www.openwall.com/wordlists/
wordlist.sourceforge.net/
www.ai.uga.edu/ftplib/natural-language/m…
www.insidepro.com/dictionaries.php
www.room362.com/storage/saved/hugelist.t…

Linux: Find files greater than n size

Posted by & filed under Linux, Server Admin.

Recently I had an issue where I needed to clean up some disk utilization on a Linux server. In order to find a list of larger files, I used the following find command from the root directory I wanted to recurse through:
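Along these lines:

find . -type f -size +50M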

As you can see, the -size switch sets the minimum size to find at 50 MB. Another issue that I ran into was deleting a large number of files at once using something like:
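Something like this, with a placeholder path:

rm /path/to/files/*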

“Argument list too long” was the error. The shell expands the wildcard into an argument list larger than the kernel will pass to rm. I found that there are a variety of methods to solve this, from using loops to split the files into smaller groups, to recompiling the kernel. One of the simplest is to use the find command to delete the files it finds:
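Like so (placeholder path):

find /path/to/files -type f -exec rm {} \;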

The list of files to be deleted can also be tuned so it does not delete all the files in the path:
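For example, restricting by a name pattern (sess_* here is just an illustration):

find /path/to/files -type f -name "sess_*" -exec rm {} \;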

PlayTerm — Watch the linux guru at work

Posted by & filed under Linux.

PLAYTERM is intended to raise the skills of terminal CLI users, share their skills and inspire others.
PLAYTERM wants to push forward a new way of education, because terminal sessions are language-independent, extremely educative & entertaining.

Basically, PlayTerm lets you ‘look over’ a guru’s shoulder, helping you learn the concepts shown in the video.

It may sound strange, but even though CLI stuff sounds like an isolated environment, it is an extremely social playground. Since the eighties, billions of users have helped each other improve their skills, to get things done faster.
However, there was never a playground for sharing this live; you could only peek over the shoulder of your neighbour (or a *N*X screen -x session).

PLAYTERM wants to restore the actual ‘live’ feeling that was once established in the BBS scene, where the system operators could ‘take over’ a user’s session and show them the way around, or teach them a new programming language.

www.playterm.org/

Linux: Find base64_decode in files

Posted by & filed under Linux.

Bad hax0rs! base64_decode is frequently used by hackers to obfuscate their malicious code when they hijack a site. This quick BASH one-liner will find files containing this evil function and list them out:
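Something like the following, run from the site root:

grep -rl "base64_decode" .

-r recurses through the tree and -l prints just the names of matching files.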

Linux: Find files modified between dates

Posted by & filed under BASH, Linux.

I found a handy technique to find files modified between two specific dates. In essence, we touch two temp files, setting their modified dates to the bounds of the range we want to find:
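For example, with placeholder dates:

touch -t 201401010000 /tmp/start_date
touch -t 201401312359 /tmp/end_date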

Note the date format is yyyymmddhhmm.

Then we run the find command:
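Using the two marker files as the bounds:

find /path/to/search -type f -newer /tmp/start_date ! -newer /tmp/end_date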

Done!

BASH: Recursively finding files

Posted by & filed under Linux.

I recently wanted to locate a specific file recursively within my directory tree. A simple way to do this is:
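For example:

find . -name ".htaccess"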

In this example, we would be searching for a .htaccess file, but this could be changed to any file name, and even use wildcards like “*.php” (quote the pattern so the shell does not expand it).