Wednesday, September 12, 2007

Automating Automounting

Newer, friendlier Linux distributions such as Ubuntu surely make it easy to use CDs, CompactFlash and SD card readers, USB drives, external hard disks and so on. But I'm old-fashioned and like to do things the hard way, that is, edit /etc/fstab and run mount and umount manually so each device ends up where I want it in my directory tree.

The problem is that hot-pluggable devices are assigned different names in the /dev directory each time they are plugged in. So I followed the first part of this guide, which assigns the same name (one that you choose) to the same device every time. Once this is done, you can modify /etc/fstab in the usual way.

While there, you might want to use the "noatime" mount option, or at least "relatime". See this Slashdot article.
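Concretely, the persistent naming and the mount option above might combine like this. This is only a sketch: the vendor and model strings, symlink name, and mount point are made up, and you should query your own device's attributes with udevinfo before writing a rule.

```
# /etc/udev/rules.d/local.rules -- hypothetical rule; replace the
# vendor/model strings with your device's actual attributes
BUS=="usb", SYSFS{vendor}=="SanDisk", SYSFS{model}=="Cruzer", KERNEL=="sd?1", SYMLINK+="usbkey"

# /etc/fstab -- mount the persistent name wherever you like, with noatime
/dev/usbkey  /mnt/usbkey  vfat  user,noatime,noauto  0  0
```

With this in place, the stick appears as /dev/usbkey no matter which /dev/sd* name the kernel hands out, and "mount /mnt/usbkey" works as an ordinary user.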

Friday, September 7, 2007

Git Magic

As I promised in a previous post, I'd write more on tricks one can perform with Git. I had so much to say that I made a site out of it: Git Magic.

It was also a great opportunity to practice my salesmanship, as can be seen from some of the section titles: Ultimate Backups, Light-Speed Multitasking, Guerrilla Version Control, Uninterrupted Workflow. It was probably a lot more fun to write than it is to read.

I'm hoping it will be useful to those new to version control, not just those new to Git. I took pains to explain the effects of Git branching. I had trouble understanding this from the official Git documentation, though it's probably better by now.

Wednesday, September 5, 2007

Wiki Formats

For years I've dealt with raw HTML for a lot of my web pages. After some experience with various wiki sites, I've finally realized that it's a lot easier to maintain a bunch of simple text files marked up in some wiki format, and use clever scripts to convert them to HTML for publishing.

Source code should always be readable, and HTML does not quite make the cut. Reading documentation in raw HTML form is painful. Inspired by the Inform 7 language, I want my text file marked up with wiki to be as close to a real document as possible. In other words, I want the source to be a pretty text file that you can read even if you don't have the HTML version. A bit like txt2html, but I want more control. Compare my homepage with its wiki source.

And if you do need complex HTML, it's easy enough to provide a mechanism to temporarily disregard wiki tags.

I wrote a script that converts a text file, marked up in a certain way, to HTML with these goals in mind, and it's kept in a Git repository at
  • http://cs.stanford.edu/~blynn/scrap/wiki.git
[20090608: updated link]
I'll have to write instructions one day, but basically: headings are designated with equal signs around the heading, two single quotes mark italics, asterisks mark bold, and a minus sign with indentation marks a list item. Other indented text is displayed unformatted in a fixed-width font.
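For illustration, a source file in this scheme might look like the following. I'm reconstructing this purely from the description above, so treat it as a sketch rather than the script's definitive input format:

```
== A Pretty Text File ==

This paragraph is plain text with ''italic'' and *bold* words.

  - first list item
  - second list item

    this indented block is shown
    verbatim in a fixed-width font
```

The point is that the source reads sensibly as-is, even before the script turns it into HTML.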

I'm gradually converting all my HTML to this wiki format, as it's easier to edit and maintain. An added bonus is that I can easily write scripts to output a different format, such as DocBook.

Wednesday, August 29, 2007

Git Revisited

To my consternation, the cogito package disappeared from Debian a few updates ago. I soon discovered the reason. Contrary to Linus Torvalds's prognostications, raw Git is now (version 1.5) a user-friendly version control system and does not need wrappers.

Still, I had to learn some new syntax. Luckily I was interested in learning about Git's more advanced features so I didn't mind. I was only previously using Git via Cogito in the same way one would use CVS or Subversion, except that it was faster, more robust and easier to use.

Now I finally understand why Git can do so much more. Better late than never!

101 Ways To Use Git

Git is the duct tape of version control. When I first started using it, I only cared about a very specific problem and ignored Git's design and internals. I focused only on remembering the Cogito commands I needed to type to do what I wanted, oblivious to how it worked its magic. Relatively poor documentation is partly to blame, but in truth, it is extremely difficult to convey Git's extraordinary versatility.

The book "Design Patterns" showcases the diverse problems that object-oriented design can elegantly solve. I wish there were an analogous Git book that, instead of explaining Git fundamentals and leaving the reader to derive the implications, gave a set of recipes for accomplishing a variety of tasks.

Below are a few examples of Git recipes such a book might contain.

Instant Backup

Just as in a computer game, when I'm about to attempt something drastic I like to save the current state, so I can go back and try again should things go awry. Type

git-init
git-add .
git-commit -m "Initial commit"

to take a snapshot of all files in the current directory.
Then if something goes wrong type

git-reset --hard

to go back to where you were. To save the state again, type git-commit -a. You'll have to type in a description of the current state.

One benefit to doing this instead of simply copying files is that with Git's hash chaining, you can tell if a backup gets corrupted.
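Here's the whole recipe as a runnable sketch in a scratch directory. I've used the modern undashed "git" spelling, and the file name and contents are made up:

```shell
# Instant-backup recipe: snapshot, break things, roll back.
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email you@example.com   # identity needed in a fresh repo
git config user.name "You"
echo original > notes.txt
git add .
git commit -q -m "Initial commit"       # the saved game
echo botched > notes.txt                # things go awry
git reset --hard -q                     # reload the save
cat notes.txt                           # back to "original"
git fsck --full                         # the hash chain lets you detect corruption
```

The final command is the corruption check mentioned above: git fsck walks the object store and complains if any object's contents no longer match its hash.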

Undo/Redo History

More generally, you can do the above, and every so often "save the game" by typing

git-commit -a -m "description of current state"

Note that if you want Git to track newly added files, or to forget about deleted files, you'll need to first run "git-add" or "git-rm" accordingly.

Typing git-log shows you a list of recent commits, and their SHA1 hashes. Then typing

git-commit -a
git-revert SHA1_HASH

will commit any outstanding changes and then create a new commit reversing the changes introduced by the commit with the given hash. You might like to use something like the following instead:

git-revert "@{10 minutes ago}"

You can undo the undo: type git-log and you'll see that the other commits you made are still there.

To take the computer game analogy again, git-revert is like loading a game and recording this fact as another saved game, and git-reset --hard is like loading an old save and deleting all saved games newer than the one just loaded.
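The difference is easy to see in a throwaway repository. Again I'm using the modern undashed spelling, and the file contents are made up:

```shell
# revert records the undo as a new commit; reset --hard would discard history.
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email you@example.com   # identity needed in a fresh repo
git config user.name "You"
echo v1 > file; git add .; git commit -q -m "save 1"
echo v2 > file; git commit -q -a -m "save 2"
git revert --no-edit HEAD               # "load" save 1, recorded as a new save
cat file                                # shows v1 again
git log --oneline                       # all three commits survive
```

Had we run git reset --hard to the "save 1" commit instead, "save 2" would no longer appear in git-log: the newer saved game is gone.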

Sync Files Between Computers

This is the reason I first used Git. I could make tarballs or use rsync for backups. The problem was that sometimes I'd edit on my laptop, other times on my desktop, and the two might not have talked to each other in between.

Initialize a Git repository and commit your files as above on one machine. Then on the other:

git-clone git+ssh://other.computer/directory

to get a second copy. From now on,

git-commit -a
git-pull git+ssh://other.computer/directory

will pull the state of the files on the other computer into the one you're working on. If you've recently made conflicting edits to the same file, Git will let you know, and you should resolve them with another edit and commit.

If you have not committed any changes on the other computer, you may type

git-push git+ssh://other.computer/directory

to push the current state to the other computer. Next time you're there, run git-reset --hard or git-checkout HEAD . to update the files.

It's good that Git does not do the last step automatically: you might have been in the middle of some uncommitted changes.

Tune in next time...

I've barely scratched the surface, and so far haven't shown anything you can't do with older version control systems. Next time I'll show off problems that only the new generation of version control can solve.

Monday, July 23, 2007

Elective Thinkpad X31 Surgery

My trusty IBM Thinkpad X31 that Stanford bought me in 2003 has taken everything I've dished out at it, with only a couple of cracks in the case to show for it. At least, until recently. One awful afternoon, after hitting the power button I was briefly greeted with the message "Fan error" before it shut itself off. No amount of coaxing would bring it back.

My desktop is truly ancient, dating back to the last millennium, so I had grown accustomed to doing most tasks on my laptop instead. The first task was to retrieve my data, which is easily achieved with a SATA/IDE to USB adapter. Only one screw needs to be removed to retrieve the hard drive.

I was tempted to take the easiest but most expensive path: shell out for whatever passes for the best of the Thinkpad X series these days. But although working in industry means this remedy wouldn't sting as much as in my student days, it also means I don't spend as much time on my own machines, and there are other things I'd prefer to get for that kind of money. Similarly, I didn't want to pay to have it serviced.

My only recourse was to attempt a repair myself. The Lenovo webpage about servicing the Thinkpad X31 spooked me a little. What's going on in that diagram? Do I have to take it apart into a hundred pieces just to replace a fan? The table suggesting that fan replacements should not be undertaken by end users wasn't encouraging either. With a heavy heart, I ordered a 67P1443 fan anyway.

When it arrived, enthusiasm and impatience triumphed and I took out every screw I could see, but the fan remained elusive. Eventually I regained some sense, and Googled for "thinkpad x31 disassembly". The first result contained a link leading to the official Thinkpad X31 hardware maintenance manual. I should have known. What isn't online these days?

Fan replacements turn out to be almost trivial. After removing the battery and hard drive, I only had to remove the four clearly indicated keyboard screws, and then slide the keyboard out. Underneath, I discovered the reason for the fan error. Nestled in the fan blades was a tiny piece of foam, sticky on one side and rose-tinted on the other. A pink pad in my Thinkpad.



I suspect the old fan was still operational, but since I had a new one and had already removed the five screws securing it, I replaced it anyway. Also, it's easier to apply thermal paste to a new fan than to clean and reapply it to an old one.

The tougher part was figuring out where the pink pad belonged. There was another one in there somewhere that also looked like it had been slightly dislodged. I settled on pressing both of them against some metal frame in the center, where I presume they work best as shock absorbers.



Soon after I received my laptop I wrote a webpage about Linux on a Thinkpad X31. Only when I started blogging did I realize it would be much easier (for me at least!) to maintain with blog posts, and that's probably what I'll do in the future if I have more to add. Sure, it's nice to have all information in one place, but this can be approximated through appropriate use of tags. Furthermore, the dates on each post may also give the reader some idea of how relevant it is.

Sunday, July 1, 2007

ALSA and MIDI

I switched to the ALSA driver from the OSS driver for my Creative Sound Blaster Live! card long ago, but I'm still not used to how MIDI is done. I'll record the relevant commands this time, so I won't have to look them up yet again.

Lately, I've had to manually load the snd_seq kernel module.

Programs I wrote that look at /dev/midi00 should now look at /dev/snd/midiC0D0. The behaviour of this device file differs too: I had to change my code so that it could handle a read() call that returns fewer than the requested number of bytes.

To play MIDI files on my digital piano, I run pmidi -p 16:0 foo.mid.

To send MIDI events to fluidsynth from my digital piano, I run fluidsynth 2> /dev/null (to get rid of spurious warnings), load a soundfont, then run aconnect 16:0 128:0.
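The port numbers 16:0 and 128:0 are just what my sound card and fluidsynth happened to be assigned; on another machine, list the ports first and substitute your own. These commands only make sense on a system with ALSA MIDI devices present:

```
aconnect -i          # list MIDI input ports (my piano shows up as 16:0)
aconnect -o          # list MIDI output ports (fluidsynth registers as 128:0)
pmidi -l             # pmidi's view of the playable ports
aconnect 16:0 128:0  # route the piano's output into fluidsynth
```

aconnect only wires ALSA sequencer ports together, so the connection persists until one of the programs exits or you break it with aconnect -d.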

Tuesday, June 12, 2007

Nvidia GeForce 3 Cards on Debian

For a long time I used the Debian nvidia driver packages, but GeForce 3 Ti 200 cards are now considered legacy cards, and not supported by the 97xx and newer series. After a recent upgrade, I had to remove my Debian nvidia packages and
  1. Download the 96xx series Nvidia Linux drivers.
  2. Run the install script as root.
  3. Since I use the xorg X server, I had to move files around:
    mv /usr/X11R6/lib/lib* /usr/lib/xorg/modules/
    mv /usr/X11R6/lib/modules/drivers/* /usr/lib/xorg/modules/drivers/
    mv /usr/X11R6/lib/modules/extensions/* /usr/lib/xorg/modules/extensions/