Ubuntu: Unable to install/update packages; Full /boot partition

Posted by & filed under Server Admin.

UPDATE 11/17/15: Another nice command to automatically purge old kernels is: sudo apt-get autoremove


Also, removing an individual old kernel is easy with: sudo dpkg -r linux-image-3.2.0-83-generic



Recently I wanted to install a new package on an Ubuntu server. Typically this is as simple as issuing:

sudo apt-get install package-name

But this time around, I got an interesting error:

$ sudo apt-get install vsftpd
Reading package lists... Done
Building dependency tree
Reading state information... Done
You might want to run 'apt-get -f install' to correct these:
The following packages have unmet dependencies:
 linux-image-server : Depends: linux-image-3.0.0-28-server but it is not going to be installed
E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

I started poking around and found that the /boot partition is full:

$ df -h
Filesystem            Size  Used Avail Use% Mounted on
                       36G  8.0G   26G  24% /
udev                  2.0G  8.0K  2.0G   1% /dev
tmpfs                 793M  236K  793M   1% /run
none                  5.0M     0  5.0M   0% /run/lock
none                  2.0G     0  2.0G   0% /run/shm
/dev/sda1             228M  228M   0M  100% /boot

OK, that is starting to make a bit more sense now… we need to purge old kernel packages to free up space on the /boot partition. The first step is to identify the kernel version we are currently running so we do not delete it. Secondly, it was recommended to me to keep the oldest kernel, as it is the one the system was installed with. We can see the current kernel version with:

$ uname -a
Linux server 3.0.0-25-server #41-Ubuntu SMP Mon Aug 13 18:18:27 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

So we are on 3.0.0-25-server and need to make sure not to delete that. A handy command to get a list of all the kernels you are not using is:

dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
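To see what that filter is doing without touching a real package database, you can feed it a couple of sample dpkg -l lines. This is only a sketch: the package names and versions below are made up, and 3.0.0-25 stands in for whatever uname -r reports on your system.

```shell
# Sample `dpkg -l` output piped through the same sed filter used above.
# The second sed expression deletes the running kernel's line (3.0.0-25
# here); the s/// expression keeps just the package-name column.
printf 'ii  linux-image-3.0.0-16-server  3.0.0-16.29  Linux kernel image\nii  linux-image-3.0.0-25-server  3.0.0-25.41  Linux kernel image\n' \
  | sed '/^ii/!d;/3\.0\.0-25/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
# Only linux-image-3.0.0-16-server is printed
```

The real command simply substitutes your live dpkg -l output and your actual uname -r for the canned values.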

I attempted to remove old kernels the “nice” way — by letting apt handle the removal:

$ sudo apt-get -y purge linux-headers-3.0.0-12-server
Reading package lists... Done
Building dependency tree
Reading state information... Done
You might want to run 'apt-get -f install' to correct these:
The following packages have unmet dependencies:
 linux-image-server : Depends: linux-image-3.0.0-28-server but it is not going to be installed
E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

But it failed. Following the error's advice to run sudo apt-get -f install, that also failed, saying there was not enough disk space on /boot (duh!). So much for being nice.

Inside the /boot partition there are six “types” of files — abi-kernel-version, config-kernel-version, initrd.img-kernel-version, System.map-kernel-version, vmcoreinfo-kernel-version, and vmlinuz-kernel-version. There will be one of each for every kernel version you have installed. For example: vmlinuz-3.0.0-28-server. Leaving the earliest kernel version and the version I am running (reported by uname -a), I moved the other kernel files off to another location where there was ample space. It looked something like this:

$ sudo mv abi-3.0.0-16-server config-3.0.0-16-server initrd.img-3.0.0-16-server System.map-3.0.0-16-server vmcoreinfo-3.0.0-16-server vmlinuz-3.0.0-16-server /home/tnscweb/boot/

As you can see this is moving the files for kernel version 3.0.0-16 off the /boot partition.

If the boot partition just needed a bit of space freed up, you can now likely use apt-get to purge the other kernels “cleanly”. By that I mean apt-get also removes the kernel files from /lib/modules. You could do this by hand as well. As far as I can tell, it does not do anything beyond cleaning up /boot and /lib/modules.
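If you do go the by-hand route for /lib/modules, the operation is the same mv used above for the /boot files. A minimal sketch, demonstrated in a throwaway directory so it is safe to run anywhere — on a real server the source would be something like /lib/modules/3.0.0-16-server and the destination any partition with free space:

```shell
# Recreate the layout in a temp dir, then move the old kernel's module
# tree the same way the /boot files were moved. Version is an example.
sandbox=$(mktemp -d)
mkdir -p "$sandbox/lib/modules/3.0.0-16-server"
mkdir -p "$sandbox/backup"

# The actual operation: relocate the unused kernel's module directory.
mv "$sandbox/lib/modules/3.0.0-16-server" "$sandbox/backup/modules-3.0.0-16-server"

ls "$sandbox/backup"
rm -rf "$sandbox"
```

On the live system you would of course need sudo and should double-check against uname -r that you are not moving the running kernel's modules.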

Repairing a corrupted MySQL database table

Posted by & filed under Server Admin, Web Development.

Ran into an issue where I wanted to do a mysqldump of a database in order to transfer it to a new server.

mysqldump -u user -p shoppingcart > sqloutput.sql 

This failed saying that three of the tables were corrupted. I ran the mysqlcheck utility to see if it could be repaired:

mysqlcheck -u user -p shoppingcart

It output the following errors; the rest of the tables checked out fine:

Error    : Table 'shoppingcart.isc_coupon_locations' doesn't exist
error    : Corrupt
Error    : Table 'shoppingcart.isc_coupon_shipping_methods' doesn't exist
error    : Corrupt
Error    : Table 'shoppingcart.isc_coupon_usages' doesn't exist
error    : Corrupt

I strongly suspected that these tables were remnants of an old software version or something along those lines.

I re-ran the command, this time telling it to repair the tables, but it kicked out the same errors about the tables not being found:

mysqlcheck -u user -p shoppingcart --auto-repair --check --optimize --databases

I went ahead and issued a drop command for each of the three tables, as I suspected they were unused leftovers from a previous upgrade.

mysql -u user -p 
mysql> use shoppingcart;
mysql> drop table isc_coupon_locations;
mysql> drop table isc_coupon_shipping_methods;
mysql> drop table isc_coupon_usages;

After each of the drop statements, MySQL reported an error that it was unable to delete the table because it could not find it. I re-ran mysqlcheck and found that the drops actually did remove them, and it reported no issues. I was then able to re-run my mysqldump command and finished extracting the database.


Posted by & filed under Programming.

StatsD is a simple NodeJS daemon (and by “simple” I really mean simple — NodeJS makes event-based systems like this ridiculously easy to write) that listens for messages on a UDP port. (See Flickr’s “Counting & Timing” for a previous description and implementation of this idea, and check out the open-sourced code on github to see our version.) It parses the messages, extracts metrics data, and periodically flushes the data to graphite (which I previously wrote about here).

Github: github.com/etsy/statsd
The Etsy blog post: codeascraft.etsy.com/2011/02/15/measure-…

Mozilla WebMaker Tools (Popcorn and XRay)

Posted by & filed under Software.

Popcorn Maker makes it easy to enhance, remix and share web video. Use your web browser to combine video and audio with content from the rest of the web — from text, links and maps to pictures and live feeds.

The X-Ray Goggles make it easy to see and mess around with the building blocks that make up the web. Activate the Goggles to inspect the code behind any web page, from the New York Times to your own blog. Then remix elements with a single click, swapping in your own text, images and more.

It’s like editing a site’s markup with the developer tools, but it works better and is easier to use. I’m planning on trying this out for prototyping new changes.
More: webmaker.org/en-US/

New motor build… motor assembly.

Posted by & filed under 93 Hatch Blog.

Got the right rod bearings!

Pistons in the hole! O-ring installed!

Shannon Gordon making it happen.

Shiny bits!

Nice and clean.

Head on, timing belt installed.

Express delivery thanks again to Shannon.

Linux: Counting the number of lines inside multiple files

Posted by & filed under BASH, Programming.

Recently I needed to recursively count the lines of code in files of a specific type — in this case, my PHP files. The command below worked flawlessly. In addition to breaking down the line count per file, it gives an overall total at the end as well.

find . -name '*.php' | xargs wc -l
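One caveat worth noting: plain xargs splits on whitespace, so file names containing spaces break the pipeline. A null-delimited variant is safer (this assumes your find and xargs support -print0/-0, which the GNU versions do):

```shell
# Create a couple of throwaway PHP files, one with a space in its name,
# and count their lines with a whitespace-safe find/xargs pipeline.
dir=$(mktemp -d)
printf '<?php\necho 1;\n' > "$dir/a.php"
printf '<?php\necho 2;\n' > "$dir/my page.php"

find "$dir" -name '*.php' -print0 | xargs -0 wc -l

rm -rf "$dir"
```

The per-file counts plus the trailing total line come out exactly as with the original command; the only change is how file names travel between find and xargs.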

The effects of hurricane Sandy from a network point of view

Posted by & filed under Networking.

I took a look at Internet Traffic report while the storm was hitting the upper northeast, and the results weren’t surprising. North American packet loss went up ~5%, while the traffic index went down. Again, not surprising, but cool to see nonetheless.

Enumerating a DNS zone by brute force

Posted by & filed under Linux, Server Admin.

Sometimes a situation pops up where I need to retrieve the contents of a DNS zone but do not have access to the DNS servers. This is typically a situation where a client’s IT person is no longer available and before moving the name servers, we need to recreate the DNS zone on our name servers. If the current host of the DNS zone cannot release the records, we have a few options.

1. Try a zone transfer. I previously wrote about this. It is highly unlikely to work, but if the DNS server is poorly configured, it is a possibility. It rarely succeeds, but when it does, it is the most accurate option.

dig -t AXFR @dns.server.domains.is.on.com domain.name.to.dump.com

2. Brute force the zone. It sounds bad, but the reality is that most sysadmins do not log or throttle DNS requests, so with a decent dictionary of words it is possible to enumerate a large majority of the DNS zone. I have mirrored the zip file containing bfdomain and the dictionaries here. (original source)

python bfdomain.py domain-to-test.com dictionaries/hostnames-lite.txt
[*]-Using dictionairy: dictionaries/hostnames-lite.txt (Loaded 1399 words)
 |-mail (line: 690) ==> ('mail.domain-to-test.com', [], [''])
 |-webmail (line: 1294) ==> ('webmail.domain-to-test.com', [], [''])
 |-welcome (line: 1311) ==> ('welcome.domain-to-test.com', [], [''])
 |-www (line: 1362) ==> ('domain-to-test.com', ['www.domain-to-test.com'], [''])
[*]-Total assets found: 4

UPDATE: One other thing I noticed later on is that this seemingly only captures A records, so records such as MX are never tried. The Python script could easily be modified to add this functionality. The nmap version of this may already do so.

More dictionaries and wordlists: