Saturday, September 18, 2010

Moore's Treadmill

Early in my PhD, I told my advisor I had implemented a particular cryptographic algorithm with a tolerable running time. It wasn’t that fast, but I figured within 2 years, microchips would be twice as powerful, so my code might be practical even on handheld devices.

He replied: "No. Then they’ll halve the power consumption to double the battery life." Or they’ll want to run it on yet smaller devices. Laptops, phones, smart cards. I still had plenty of work to do.

For me, user interfaces are trapped on the same treadmill.

Less is more

Not so long ago, my desktop had a 14-inch CRT display, a lonely and lowly processor clocked at 200MHz, a few gigs of hard disk space, and a 14.4k net connection. The dearth of screen real estate favoured spartan user interfaces, the CPU struggled unless tools were quick and nimble, and the cramped conditions of the hard disk and net connection favoured good things in small packages.

For example, when the time came to take sides in the text editor holy war, I chose vi because it would only take several minutes to grab a few hundred kilobytes. Emacs was over 6 megabytes.

Now I often work at a PC sporting multiple cores, each over 20 times faster than my old workhorse. I have multiple monitors, each so large that I turn my head to look at different parts of a screen. Network latency is low, and bandwidth is high.

Yet I remain minimalist. Indeed, I keep finding new ways to trim the fat, despite having ample space. For instance, my first web browser had big buttons and a plethora of bars: title bar, menu bar, bookmarks bar, status bar, location bar, etc. Today, even though I can afford extra fluff, my primary browser shows little apart from the webpage: a bar for tabs (itself a space-saving measure over multiple windows), a single textbox, and a handful of buttons.

I chose my style partly because I’m a programmer to the bone: my drive to automate, simplify, and optimize carries over from coding to real life. However, there is a more compelling reason.

Technological advances mean my laptop can handle tasks that once required a desktop. But the laptop screen is even smaller than my old monitor. The net connection can be spotty, especially via tethering or a smartphone WiFi hotspot. The CPU often underclocks to save battery, and moreover, I fritter away cycles on fripperies such as visual effects: I’m a Compiz plugin junkie. I therefore encounter the same problems I faced over a decade ago.

A Voyage to Lilliput

Beyond laptops, we have netbooks, tablets, and smartphones. The world is getting smaller. Being a geek means I’ll attempt text editing, gaming, programming, etc. even as devices keep shrinking, bringing new user interface challenges. My current extreme case is my Nexus One, with its 3.7-inch display and no keyboard.

Rather than work on the phone directly, I ssh to a more powerful computer via ConnectBot.

In the past I praised Bash one-liners. Shelling in from a smartphone drove me to appreciate Bash one-letter aliases. I’ve grown accustomed to using them on all computers. Some favourites:

alias c=cd
alias ..="cd .."
alias v=vim
alias g=git
alias l="ls -CF --color=auto"
alias s=sudo
alias k=colormake
alias rm='echo mv to /tmp instead'

(The "rm" alias trains me to avoid this command. See "Accidents Will Happen" in The UNIX-HATERS Handbook.)
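In the same spirit, the echo could be replaced with a helper that actually does the move. A sketch (the function name `del` and the /tmp layout are my choices, not a standard tool): each call makes a fresh directory under /tmp and moves its arguments there, so a slip of the fingers can be undone until the next reboot.

```shell
# Soft delete: move files into a fresh /tmp directory instead of unlinking.
del() {
  local dest
  dest=$(mktemp -d /tmp/trash.XXXXXX) || return 1   # per-call trash directory
  mv -- "$@" "$dest"/ && echo "moved to $dest"
}
```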

Editing is unpleasant but bearable. In these circumstances, I feel Vim’s single-letter commands work in its favour, as does its modal nature. Also, double-tapping the terminal brings up an "Esc" button, which suits Vim beautifully.

I don’t intend to code from my phone often as it’s like eating with tweezers. Though for fun, I followed up on a suggestion from my last post and developed a J one-liner to sum the primes less than 100 using only a ConnectBot session on my phone. Soon I had:

+/i.&.(p:^:_1)100 NB. Cooler version.
v=:[:+/i.&.(p:^:_1) NB. Tacit verb definition.
v 100
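For readers without a J system handy, the same computation can be sketched in Bash by trial division (the function name is mine). It is a long-winded counterpart to the one-liner, which is rather the point:

```shell
# Sum the primes below a limit by trial division.
sum_primes_below() {
  local limit=$1 sum=0 n d prime
  for ((n = 2; n < limit; n++)); do
    prime=1
    for ((d = 2; d * d <= n; d++)); do
      ((n % d == 0)) && { prime=0; break; }
    done
    ((prime)) && ((sum += n))
  done
  echo "$sum"
}
sum_primes_below 100   # prints 1060
```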

Thursday, September 9, 2010

J and I

For years, I’ve been meaning to investigate the APL programming language family. Months ago, I finally leafed through a few introductions to the new and improved APL known as J.

I liked what I saw. I enjoy paring down C code, but J makes my best efforts look as verbose as a legal contract. Introductory J examples are already cryptic enough to induce watering in the untrained eye. In the right hands, a J program can be compressed so densely that it threatens to collapse into a black hole of inscrutability.

I’ve heard about an obfuscated C tattoo, as well as a Lisp tattoo. They should have gotten J tattoos! Instead of Hello World or the Fibonacci sequence, why not sport a terse program that solves a Sudoku, or finds a QR decomposition of a matrix?

More generally, J appears to be ideal when source size matters most. For example, programming via a smartphone with a tiny screen and keyboard.

A day at play with J

J systems are freely available, but I reasoned I’d get a better feel for the language by writing my own interpreter. Thanks to J’s elegant design, I soon got the celebrated "mean=:+/%#" example working. That was enough to gain an appreciation (but also a little contempt) for J’s simple grammar, array handling, and above all, compact notation. A J symbol is worth a thousand machine words. See my J notes.
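The mean example works because J parses three verbs in a row as a fork: (f g h) y means (f y) g (h y), so +/ % # reads as "sum divided by count". A rough Bash rendering of that fork (awk does the arithmetic; the function name is mine):

```shell
# mean: sum divided by count, mirroring the J fork  +/ % #
mean() {
  awk '{ s = 0; for (i = 1; i <= NF; i++) s += $i; print s / NF }' <<< "$*"
}
mean 1 2 3 4   # prints 2.5
```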

I kept picking at it for a while, but I think I’ll stop soon. It’d take substantial effort to implement types apart from doubles, not to mention error handling, memory management, arrays with axes of zero length, and fills.