Get BIOS/Motherboard Info from within Linux

Submitted by jbreland on Wed, 05/05/2010 - 19:31

It's possible to read the BIOS version and motherboard information (plus more) from a live Linux system using dmidecode. This utility "reports information about your system's hardware as described in your system BIOS according to the SMBIOS/DMI standard (see a sample output). This information typically includes system manufacturer, model name, serial number, BIOS version, asset tag as well as a lot of other details of varying level of interest and reliability depending on the manufacturer." It can be handy if you want to check the BIOS version of your desktop and you're too lazy to reboot, but it's far more useful when trying to get information about production servers that you simply cannot take down.

Simply run dmidecode (as root) to get a dump of all available information. You can specify --string or --type to filter the results. The dmidecode man page is quite thorough, so I won't rehash it here.
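
For example, to dump just the BIOS section, or to pull a single value such as the BIOS version (the keyword names used here are the ones listed in the man page):

# Dump only the BIOS information:
sudo dmidecode -t bios

# Or grab a single value, such as the BIOS version:
sudo dmidecode -s bios-version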

One extremely useful application that may not be immediately obvious is the ability to pull the system serial number. Let's say you need to call support for a particular server that can't be taken down, or that you may not even have physical access to. A vendor like Dell will always want the system serial number, and as long as you can log in to the server you can obtain the serial number with dmidecode -s system-serial-number. This has saved me on a couple of occasions with remotely hosted servers.

A lot more information is available through dmidecode, so I definitely encourage you to check it out. To wrap things up, I'll leave you with this obnoxiously long alias:

alias bios='[ -f /usr/sbin/dmidecode ] && sudo -v && echo -n "Motherboard" && sudo /usr/sbin/dmidecode -t 1 | grep "Manufacturer\|Product Name\|Serial Number" | tr -d "\t" | sed "s/Manufacturer//" && echo -ne "\nBIOS" && sudo /usr/sbin/dmidecode -t 0 | grep "Vendor\|Version\|Release" | tr -d "\t" | sed "s/Vendor//"'

This will spit out a nicely formatted summary of the BIOS and motherboard information, using sudo so it can be run as a normal user. Example output:

$ bios
Motherboard: Dell Inc.
Product Name: Latitude D620
Serial Number: XXXXXXXX
 
BIOS: Dell Inc.
Version: A10
Release Date: 05/16/2008

Enjoy.

Generic Method to Determine Linux (or UNIX) Distribution Name

Submitted by jbreland on Wed, 05/05/2010 - 01:58

A while back I had a need to programmatically determine which Linux distribution is running in order to have some scripts do the right thing depending on the distro. Unfortunately, there doesn't appear to be one completely foolproof method of doing so. What I ended up with is a combination of techniques: querying the LSB utilities, checking distro release info files, and falling back to kernel info from uname. It'll take the most specific distro name it can find, falling back to generic Linux if necessary. It'll also identify UNIX variants, such as Solaris or AIX.

Here's the code:

# Determine OS platform
UNAME=$(uname | tr "[:upper:]" "[:lower:]")
# If Linux, try to determine specific distribution
if [ "$UNAME" == "linux" ]; then
    # If available, use LSB to identify distribution
    if [ -f /etc/lsb-release -o -d /etc/lsb-release.d ]; then
        export DISTRO=$(lsb_release -i | cut -d: -f2 | sed s/'^\t'//)
    # Otherwise, use release info file
    else
        export DISTRO=$(ls -d /etc/[A-Za-z]*[_-][rv]e[lr]* | grep -v "lsb" | cut -d'/' -f3 | cut -d'-' -f1 | cut -d'_' -f1)
    fi
fi
# For everything else (or if above failed), just use generic identifier
[ "$DISTRO" == "" ] && export DISTRO=$UNAME
unset UNAME

I include this code in my ~/.bashrc file so that it always runs when I log in and sets the $DISTRO variable to the appropriate distribution name. I can then use that variable at any later time to perform actions based on the distro. If preferred, this could also easily be adapted into a function that echoes the result rather than exporting $DISTRO.
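
For instance, here's a rough sketch of that approach: the same detection logic as above, just wrapped in a hypothetical get_distro function that prints the result.

# Sketch: same detection logic wrapped in a function that prints
# the result rather than exporting it
get_distro() {
    local os distro
    os=$(uname | tr "[:upper:]" "[:lower:]")
    if [ "$os" == "linux" ]; then
        if [ -f /etc/lsb-release -o -d /etc/lsb-release.d ]; then
            distro=$(lsb_release -i | cut -d: -f2 | sed s/'^\t'//)
        else
            distro=$(ls -d /etc/[A-Za-z]*[_-][rv]e[lr]* | grep -v "lsb" | cut -d'/' -f3 | cut -d'-' -f1 | cut -d'_' -f1)
        fi
    fi
    # Fall back to the generic identifier if nothing more specific was found
    echo "${distro:-$os}"
}

# Usage: DISTRO=$(get_distro)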

I've tested this on a pretty wide range of Linux and UNIX distributions, and it works very well for me, so I figured I'd share it. Hope you find it useful.

Delete Old Files ONLY If Newer Files Exist

Submitted by jbreland on Tue, 05/04/2010 - 18:17

I discovered recently that one of my automated nightly backup processes had failed. I didn't discover this until about a week after it happened, and though I was able to fix it easily enough, I discovered another problem in the process: all of my backups for those systems had been wiped out. The cause turned out to be a nightly cron job that deletes old backups:

find /home/backup -type f -mtime +2 -exec rm -f {} +

This is pretty basic: find all files under /home/backup/ that are more than two days old and remove them. When new backups are added each night, this is no problem; even though all old backups get removed, newer backups are uploaded to replace them. However, when the backup process failed, the cron job kept happily deleting the older backups until, three days later, I had none left. Oops.

Fortunately, this didn't end up being an issue as I didn't need those specific backups, but nevertheless I wanted to fix the process so that the cleanup cron job would only delete old backups if newer backups exist. After a bit of testing, I came up with this one-liner:

for i in /home/backup/*; do [[ -n $(find "$i" -type f -mtime -3) ]] && find "$i" -type f -mtime +2 -exec rm -f {} +; done

That line will work great as a cron job, but for the purpose of discussion let's break it down a little more:

1. for i in /home/backup/*; do
2.     if [[ -n $(find "$i" -type f -mtime -3) ]]; then
3.         find "$i" -type f -mtime +2 -exec rm -f {} +
4.     fi
5. done

So, there are three key parts involved. Beginning with step 2 (ignore the for loop for now), I want to make sure "new" backups exist before deleting the older ones. I do this by checking for any files that are younger than the cutoff date; if at least one file is found, then we can proceed with step 3. The -n test verifies that the output of the find command is "not null", hence files were found.

Step 3 is pretty much exactly what I was doing previously, ie., deleting all files older than two days. However, this time it only gets executed if the previous test was true, and only operates on each subdirectory of /home/backup instead of the whole thing.

This brings us neatly back to step 1. In order for this part to make sense, you must first understand that I back up multiple systems to this directory, each under its own subdirectory. So, I have:

/home/backup/server1
/home/backup/server2
/home/backup/server3
etc.

If I just had steps 2 and 3 operate on /home/backup directly, I could still end up losing backups. Eg., let's say backups for everything except server1 began failing. New backups for server1 would continue to get added to /home/backup/server1, which means a find command on /home/backup (such as my test in step 2) would see those new files and assume everything is just dandy. Meanwhile, server2, server3, etc. have not been getting any new backups, and once we cross the three-day threshold all of their backups would be removed.

So, in step 1 I loop through each subdirectory under /home/backup, and then have the find operations run independently for each server's backups. This way, if everything but server1 stops backing up, the test in step 2 will succeed on server1/, but fail on server2/, server3/, etc., thus retaining the old backups until new backups are generated.
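
For what it's worth, the same loop also works fine as a small standalone script if you'd rather not cram it all onto one cron line. Here's a sketch; BACKUP_ROOT and KEEP_DAYS are made-up names, so adjust to taste:

#!/bin/bash
# Sketch of the same cleanup as a standalone cron script
BACKUP_ROOT=/home/backup
KEEP_DAYS=2

for i in "$BACKUP_ROOT"/*; do
    # Only clean up this server's directory if a backup newer than the
    # cutoff exists; otherwise leave the old backups alone
    if [[ -n $(find "$i" -type f -mtime -$((KEEP_DAYS + 1))) ]]; then
        find "$i" -type f -mtime +$KEEP_DAYS -exec rm -f {} +
    fi
done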

And there you go: a safer way to clean up old files and backups.

Make GTK+ Apps Look Better Under KDE (plus mini GTK+ rant)

Submitted by jbreland on Mon, 05/03/2010 - 21:01

Anyone that knows me knows that I'm not a fan of GTK+ applications, the GTK+ toolkit itself, or indeed even the entire GNOME desktop. I don't hide this. I love Linux and open source software, but I've always thought GTK+ applications look ugly and feel wrong. I can't even fully explain why I have this averse reaction to GTK+, but I've literally always felt this way for as long as I've used Linux, going back to the Red Hat 6.0 days. Granted, GTK+ has come a long way since then, but I still don't think it can hold a candle to Qt, both in terms of look and feel and attractiveness. This is one of the (admittedly geeky) reasons I've long preferred KDE over GNOME.

(Interesting aside that I just noticed: I think a comparison between the GTK+/Qt and GNOME/KDE websites also says a lot.)

Unfortunately (for me, at least), many of the "best of breed" Linux applications are built on GTK+. I consider Firefox to be the best general purpose web browser, I think Thunderbird (despite some stagnation over the last few years) is still the best e-mail client, Pidgin is the best IM client, GIMP (even though it pains me to say it) is the best image editor, etc. Despite my bias against GTK+, these are great applications that I use every single day.

Running GTK+ applications under KDE, however, can be an unpleasant experience; aside from the whole "feeling wrong" thing mentioned above, they look absolutely horrendous by default. If you use a KDE-based distribution, such as openSUSE, Kubuntu, or Mandriva, the maintainers usually apply special themes to the GTK+ applications to make them fit in better on the KDE desktop. For users of desktop neutral distributions, or even (gasp!) GNOME-based distributions, though, you'll need to do some extra work to spruce up GTK+ applications.

There are several options to do this, but the easiest method I've found is to install a high-quality GTK+ theme that links against the Oxygen icons and widgets. For a long time I've used QtCurve to do this, which is mature and works very well. More recently I've switched over to Oxygen-Molecule, which looks a bit more accurate under KDE 4.4. In either case, once you get the theme set up it'll be very difficult to distinguish GTK+ applications from Qt applications based on appearance alone.

Many distros already include packages for these themes, which makes installation extremely easy. For example, Gentoo names the packages x11-themes/oxygen-molecule and x11-themes/gtk-engines-qtcurve; Arch includes qtcurve-gtk2 in the base repositories, and oxygen-molecule-theme is available as an AUR package. If your distribution doesn't provide a package, you can install the themes manually by downloading them from the previous links; installation instructions are included in the download.
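
For reference, installation on those two distros looks roughly like this (using the package names above, and assuming the standard emerge/pacman invocations):

# Gentoo:
emerge x11-themes/gtk-engines-qtcurve x11-themes/oxygen-molecule

# Arch (qtcurve-gtk2 is in the repos; oxygen-molecule-theme is built from the AUR):
pacman -S qtcurve-gtk2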

Once the theme is installed, you need to instruct your GTK+ applications to use it. Sadly, this can be tricky if you don't already have GNOME installed, as the KDE control panel only applies themes to KDE and Qt-based applications. The easiest way I've found to do this is to install gtk-chtheme. It's a lightweight theme switcher specifically for GTK+ applications, and should be packaged by most distributions. Run gtk-chtheme after it's installed and you should see a list of available GTK+ themes. Raleigh is the default "horrendous" theme that I described earlier. Assuming Oxygen-Molecule or QtCurve have already been installed, you should also see them in the list. Select the new theme and hit OK. You'll need to restart your GTK+ applications for the change to take effect. After that... voilà! Enjoy the attractive new look of your GTK+ applications.
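
Alternatively, if you'd rather skip gtk-chtheme, GTK+ 2 applications read their theme selection from ~/.gtkrc-2.0, so something like this should also do the trick (QtCurve is just an example theme name):

# Point GTK+ 2 applications at the new theme by hand
echo 'gtk-theme-name = "QtCurve"' >> ~/.gtkrc-2.0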

A lot more helpful information can be found in the Arch Linux KDE wiki page.

New Navigation Feature: News Categories

Submitted by jbreland on Sun, 05/02/2010 - 15:55

This is something I've been meaning to add to the site for quite a long time. In the Navigation menu on the left side of the screen, you'll find a new News Categories link. Click on that and you'll see a list of all terms used to categorize posts on this site. Click on any term and you'll see a list of all posts in that category. This provides an easy way to, for example, see posts relating to all of my software projects or tips and tricks.

It's also possible to grab RSS feeds for specific categories. For example, if you're only interested in posts about software updates, browse to the Software category, then select the RSS feed icon provided through that page.

Please report any problems in the comments. Thanks.

Port Testing (and Scanning) with Bash

Submitted by jbreland on Sun, 05/02/2010 - 14:53

Posts on my site have been rather... slow, to be generous. To try to change that, I'm going to begin posting neat tips and tricks that I discover as I go about my daily activities. Normally I just mention these to whoever happens to be on IM at the time, but I figure I can post here instead to share the information with a much wider audience and breathe some life back into my site. So, it's a win-win for everyone. :-)

I should note that many of these tips will likely be rather technical, and probably heavily Linux-focused, since that's my primary computing environment. Today's tip definitely holds true on both counts.

One of the neat features supported by Bash is socket programming. Using this, you can connect to any TCP or UDP port on any remote system. Of course, this is of rather limited usefulness as Bash won't actually do anything once connected unless specific protocol instructions are sent as well. As a relatively simple example of how this works:

exec 3<>/dev/tcp/www.google.com/80
echo -e "GET / HTTP/1.1\n\n">&3
cat <&3

(Note: Example taken from Dave Smith's Blog.)

This will establish a connection to www.google.com on port 80 (the standard HTTP port), send an HTTP GET command requesting the home page, and then display the response on your terminal. The &3 stuff is necessary to create a new file descriptor used to pass the input and output back and forth. The end result is that Google's home page (or the raw HTML for it, at least), will be downloaded and displayed on your terminal.

That's pretty slick, but like I said above, it's of rather limited usefulness. Not many people would be interested in browsing the web in this manner. However, we can use these same concepts for various other tasks and troubleshooting, including port scanning.

To get started, try running this command:

(echo >/dev/tcp/www.google.com/80) && echo "open"

This will attempt to send an empty string to www.google.com on port 80, and if the connection succeeds it will display "open". Conversely, if you attempt to connect to a server/port that is not open, Bash will respond with a connection refused error.

Let's expand this a bit into a more flexible and robust function:

# Test remote host:port availability (TCP-only as UDP does not reply)
    # $1 = hostname
    # $2 = port
function port() {
    (echo >/dev/tcp/$1/$2) &>/dev/null
    if [ $? -eq 0 ]; then
        echo "$1:$2 is open"
    else
        echo "$1:$2 is closed"
    fi
}

Now, we can run port www.google.com 80 and get back "www.google.com:80 is open". Conversely, try something like port localhost 80. Unless you're running a webserver on your local computer, you should get back "localhost:80 is closed". This can provide a quick and dirty troubleshooting technique to test whether a server is listening on a given port, and ensure you can reach that port (eg., traffic is not being dropped by a firewall, etc.).

To take this another step further, we can use this function as a basic port scanner as well. For example:

for i in $(seq 1 1023); do port localhost $i; done | grep open

This will check all of the well-known ports on your local computer and report any that are open. I should note that this will be slower and more inefficient than "real" port scanners such as Nmap. However, for one-off testing situations where Nmap isn't available (or can't be installed), using Bash directly can really be quite handy.
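
One other caveat: if a firewall silently drops the traffic instead of rejecting it, each connection attempt can hang for quite a while. If that's a concern, the test can be wrapped in coreutils' timeout command. Here's a quick sketch; the tport name and the 2-second limit are arbitrary:

# Same idea as port(), but give up after 2 seconds on filtered ports
function tport() {
    timeout 2 bash -c "echo >/dev/tcp/$1/$2" &>/dev/null
    if [ $? -eq 0 ]; then
        echo "$1:$2 is open"
    else
        echo "$1:$2 is closed (or filtered)"
    fi
}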

Additional information on Bash socket programming can be found in the Advanced Bash-Scripting Guide.

I hope you find this tip useful. Future tips will likely be shorter and more to the point, but I figured some additional explanation would be useful for this one. Feel free to post and questions or feedback in the comments.

Convert to FLAC 2.1.2 Released

Submitted by jbreland on Sun, 05/02/2010 - 13:43

I've released a minor update to Convert to FLAC. This fixes a cosmetic bug that could result in inconsistent status output when converting multiple files concurrently (ie., when using the -t option). I also added a -V option to simply display the version of convtoflac and then exit; I'll be adding this to all of my scripts eventually.

Users w/ multi-core or multi-processor systems are encouraged to upgrade.

For more information:
Convert to FLAC home page and downloads
Convert to FLAC ChangeLog

Feedback and Support

Modify Path Update

Submitted by jbreland on Fri, 04/16/2010 - 23:30

I updated the Modify Path (modpath) Inno Setup script. This is a small but important (and long overdue) update that fixes support for Unicode versions of Inno Setup. If you've had trouble using Modify Path with recent versions of Inno Setup, grab this update.

The update can be downloaded from the script's home page:
Modify Path

Desktop Upgrade

Submitted by jbreland on Wed, 02/24/2010 - 23:46

Update: 03/01/10 16:53
It's bought. I ended up going with the primary choices listed below. The motherboard was a tough call, as my current ASUS works so well, but I have a couple small nitpicky complaints that pushed me over to trying the Gigabyte board. Hope it works out.

Update: 02/26/10 03:00
I think I've decided on the motherboard and RAM. I updated the links below. I also decided to scale back to 4 GB of RAM. I'm simply having a hard time justifying 8 GB, even to myself. Since I'm only going to be using two of the four available DIMM slots, though, I can always add more later if it becomes necessary.

At this point I should be ready to go, but I'm going to hold off a few more days (probably through the weekend) before making the purchase. I still want to do some more research to verify Linux compatibility with the motherboard and other stuff like that.

Update: 02/25/10 00:56
I've tentatively narrowed the motherboard down to two selections, one each from ASUS and Gigabyte. Models are listed below, but I still need to do more research on both the boards and RAM compatibility.

For the last several months I've been jonesin' for some of the new Intel Nehalem hotness (aka Core i7/9). From everything I've read, this is a major step up from the previous Core 2 generation of processors (which itself was a major step up from the Pentium line). However, I built my current desktop in March of 2007 (which, for the record, I'm very happy with), so I've had to be patient and put off upgrading for a while. Given that March will be the three-year mark for my desktop, though, I think it's about time to take the plunge. :-)

Now, as I mentioned above, I do very much like my current system, so rather than building a new computer altogether I'm just going to upgrade the guts of my existing one. For reference, here are the specs of my desktop. I plan on salvaging as much as possible, which should include the case, power supply, drives, video and sound cards, monitors, and all peripherals. The CPU, motherboard, and RAM will all need to be replaced (as well as the network card, which is integrated on the motherboard).

As I've done previously when researching components for my desktop and NAS, I'm going to post the details here both for reference and feedback. Since I'm just doing an upgrade this time, though, the list will be much shorter (and thankfully, much cheaper).

I'm just starting research at this point, but here's what I have in mind so far:

CPU
Intel Core i7-860 ($280 - Newegg)

This seems to be the real sweet spot right now in the Nehalem lineup in terms of price and performance. It's smokin' fast, includes both Turbo Boost and Hyper-Threading (which are actually done right in Nehalem), and is not outrageously expensive. The Core i7-920 ($290 - Newegg) is another viable option, but although the two processors are approximately the same cost, the i7-8xx platform as a whole is cheaper than the i7-9xx due to the cost of other components. Unless something changes drastically in the next few weeks, the i7-860 will likely be my pick.

If you're interested in what makes this chip so lust-worthy, here are some (much) more detailed reviews by AnandTech:
Nehalem architecture overview: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3448
Lynnfield processor core overview: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3634
Core i7-860 review: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3641

Motherboard
Gigabyte GA-P55A-UD4P ($185 - Newegg)
alternative: ASUS P7P55D-E Pro ($190 - Newegg)

The motherboard situation is, as usual, a tougher call. There are loads of available options, ranging from <$100 to >$350, covering four different compatible chipsets (for the i7-860) and scads of different features. Here are a few requirements that should help narrow it down, though:

  • supports >= 8 GB RAM
  • includes >= 1 PCIe 2.0 x16 slot(s)
  • includes >= 1 Gb/s NIC
  • supports Serial ATA 6 Gb/s
  • supports USB 3.0

The RAM, PCIe, and NIC requirements are all pretty standard at this point. SATA 6 Gb/s and USB 3.0 are both brand-spanking new, though, so they're only available on relatively few motherboards. I don't need either of these at this point in time, but I would like to have them available for future upgrades I'm considering (more on that below).

This excessively long Newegg link shows the current list of contenders, with prices ranging from $135 - $280. I'll narrow that list down to just two or three soon, but I need to do some more research first.

Memory
G.SKILL F3-12800CL7D-4GBECO, 4 GB (2x2GB) DDR3-1600 (PC3-12800) CAS 7-8-7-24-2N 1.35v ($120 - Newegg)
alternative: G.SKILL F3-10666CL7D-4GBRH, 4 GB (2x2GB) DDR3-1333 (PC3-10666) CAS 7-7-7-21 1.5v ($115 - Newegg)

I haven't decided on RAM yet, but the above options are the two leading contenders based on CPU specs and prior experience (I used G.SKILL in my current desktop and, once again, I'm quite happy with it). I won't be able to make a final decision until I've nailed down a motherboard, but here are my current thoughts:

  • I want 8 GB. Do I need 8 GB? No, but then again, this entire upgrade isn't based on need. :-) I currently get by with 4 GB just fine, and I expect that trend to mostly continue, but given how heavily I use my computer I'm pretty confident the extra memory available won't go to waste.
  • I have a few different options for getting up to 8 GB. 2x4GB kits are sold, and would be preferred because it'd allow me to keep two slots free for possible further upgrades, but they're prohibitively expensive at this point. 4x2GB kits are also available, but, oddly, they're both more expensive and offer fewer options than buying two 2x2GB kits. So, unless things change in the near future, I'll stick with two 2x2GB kits.
  • I'm paying attention to both CAS latency and voltage. Lower latency will essentially allow memory to be accessed faster by the CPU, and a lower voltage will allow the RAM to run cooler and consume less power. However, there are trade-offs involved, with price and overclockability probably the two most important. As with the motherboard, I still need to do some research, but I think CAS 7 and 1.5 or 1.35v are reasonable choices.
  • There are several vendors who meet these RAM requirements, but as I mentioned above I've been satisfied with G.SKILL in my current computer, and their current models are still reasonably priced and have good reviews, so I see no reason to change.

Future Upgrades
There are several other components that I'd like to upgrade in the not-too-distant future, but I think I'm going to hold off a bit longer on them. Here are some current thoughts on this topic, in no particular order:

Hard Drives
I currently run a WD Raptor 127 GB drive for my system drive, and a slower WD RE2 500 GB drive for my home/data drive. At the time I built this computer, the Raptor was the fastest consumer drive available, but it wasn't big enough to hold all my data, so I added the larger, slower drive for that. It's worked out pretty well, but since then I've added my wonderful NAS to the mix, and with 2 TB of usable storage on that, I just don't have a great need for a large amount of local storage on my desktop.

Within the next year or so, I expect to jump on the solid-state drive bandwagon. I already see extremely compelling performance from these drives, with the Intel X25-M G2 setting the current consumer standard (see the AnandTech review for details) and newer, more efficient SSD controllers recently being introduced. However, I think this market still has some room to grow before I'm ready to jump on board. I'm specifically looking forward to Intel's third-generation SSD (and the competition's response), which is due out in Q4 2010. I'm hoping that by that point the maturing technology will have the last few kinks worked out, and that the prices should start approaching "reasonable".

Back to the current upgrade process. I think I'm going to reformat the Raptor and use it for both my system and data partitions, and just lose the RE2 altogether. I simply don't need that much local storage anymore with my NAS, so consolidating to the single faster disk makes sense (not to mention the slight power and heat savings from removing the second disk). This will also act as a good transition to an SSD, as those drives are quite small.

Monitors
While my current 22" Viewsonic CRTs are gorgeous, they're also big, heavy, power-sucking beasts. They served me well for the last few years, but I'm definitely ready to move on. Unfortunately, that magic combination of size, quality, and price of flat panel monitors still has yet to meet my requirements. I can certainly find some options available today that I'd be satisfied with, but since my current monitors are still going strong I'd rather hold off a bit longer until I can (hopefully) get something I'm truly happy with. I'm going to evaluate my options again around the time I upgrade to an SSD, at which point I'm hoping there will be some better options available.

Video Card
Since I don't game very much on my computer these days (thanks primarily to the rampant anti-consumer use of DRM), my current video card is more than powerful enough for my needs. Unfortunately, despite being the top of the line model of the GeForce 8 series when I bought it just three years ago, Nvidia chooses not to support it properly for video decoding and acceleration (despite cheaper, lower-end versions of the same damn series of cards being perfectly well supported; not that I'm bitter). I'm also limited to 2xDVI output connectors on this card, which again works fine for my needs, but could potentially limit my monitor upgrade options.

For now, I don't have any definite plans to upgrade the card, but I'll reevaluate this if and when I finally get around to replacing my monitors. At that point, I can probably get a more powerful, more efficient card with more flexible output options for significantly less than what I paid for this card initially. I'm not sure it'll be worth it, but it's an option.

Sound Card
I currently use an Audigy 2, which works fine, but I almost want to upgrade the card out of spite for Creative Labs, which has become radically anti-Linux in recent years. I probably won't, of course, if only because getting surround sound working properly under Linux requires a certain black magic that I have yet to fully understand, which makes me hesitant to even touch my working configuration. Plus, honestly, I'd be no better off after spending the money than I am now. Nonetheless, good Linux support is important to me, and I have no problem supporting companies that also support Linux. If I see something particularly compelling with great Linux support (or, as great as Linux support can be given the mess that is ALSA) I'll probably go ahead and pick it up.

Everything Else
That pretty much covers it. I'm pleased with my current speakers, input peripherals, case, power supply, and optical drive, so unless something just breaks and needs replacement, I'll be sticking with what I have. Granted, these components (with the exception of my speakers) are among the least expensive components of my computer, so I'm not exactly saving oodles of money by sticking with them, but I'll take what I can get.

This post ended up being much more long-winded than I originally anticipated, but I guess that's not terribly unexpected for me. If you stuck with it through the end, I hope you found it at least marginally informative and entertaining. If you have any feedback on my product selection, or in fact any of my comments above, please leave a comment below. I'll also update this post as I finalize product selection.

Happy New Year

Submitted by jbreland on Fri, 01/01/2010 - 00:16

I just wanted to wish all of my visitors a happy new year. I haven't posted much lately, but rest assured, I'm still alive. :-) I hope all of you have a safe and happy new year.