Opacity in KDE 3.4 and NVidia is SWEEEEEEET.

I should say up front that I am *not* a slave to eye candy. I’m a system/network admin, so if there’s any eye candy around, it’s likely to be various monitors that alert me, visually, to trouble of one sort or another. However, I’ve found a really useful piece of eye candy in KDE 3.4, enhanced by the NVidia driver — opacity.

Opacity is a setting in KDE 3.4 that controls how translucent a window is. Just about anything can now be made transparent, to whatever degree the user wants. To those saying “this isn’t new, transparency has been around for 3 years now”: you’re mistaken. What was around before was what I call “faux-and-slow” transparency. It was useful *only* as eye candy, because all you could see behind the transparent window was the desktop wallpaper. That’s not transparency; that’s a picture of that region of your desktop pasted into the background of your application. The “slow” part comes from the fact that when you moved the application around, it would take a second to update its background. That isn’t useful, and it isn’t what I’m talking about.

I’m talking about having a terminal open with its opacity adjusted so that you can see through it without it disturbing your ability to work. That way you can see, for example, when new mail hits your inbox (or any other folder in KMail, or any other mail application) without hitting alt-tab, without setting up some other form of alert, and without keeping your mail client on top or on a separate monitor in a dual-headed setup.
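
If you want to try this, it takes a little setup outside of KDE itself. Below is a rough sketch of what it looked like on my machine; I’m assuming a recent X.org with Composite support and the NVidia binary driver, and the exact option names (and whether you need them at all) can vary with your driver version, so treat it as a starting point rather than gospel.

```
# xorg.conf sketch -- adjust identifiers and options for your own setup

# The Composite extension is what makes real translucency possible.
Section "Extensions"
    Option "Composite" "Enable"
EndSection

Section "Device"
    Identifier "Videocard0"      # whatever identifier your config already uses
    Driver     "nvidia"
    # Accelerated rendering; makes composited windows far less sluggish.
    Option     "RenderAccel" "true"
    # Some NVidia driver versions want this so GL apps and Composite coexist.
    Option     "AllowGLXWithComposite" "true"
EndSection
```

After restarting X, the translucency knobs themselves live in the KDE Control Center under the window behavior settings (and, if memory serves, per-window opacity is also reachable from the window menu).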

Without a doubt, this feature will change the way I do my work, as it allows me to even further optimize how I use the real estate granted to me by my monitors.

Excuses for my poor pool game

Once I had a pool table in my house, it didn’t take me long to realize that my pool game is not what it once was. Lucky for me, I have one huge glaring issue I can blame it on: lighting.

Most people don’t realize when they see pool on TV that a lot of time and care has been put into setting up the lights in just the right way. On a 9-foot tournament table, a standard four-lamp lighting fixture is perfectly centered over the table, at a predetermined height above the playing surface. The outer lamps of the fixture get higher-wattage bulbs than the inner ones, and things are checked and rechecked to make sure that there is just about *no* detectable shadow being thrown onto the cloth. The effort expended on this is quite amazing – and completely justified…

Shadows kill pool games. You just can’t play any kind of predictable game with shadows lurking about. Shadows can hurt you in two places that I know of (er, that I’ve experienced myself): on the cloth, and on the balls themselves.

On the cloth, a ball that stands in the way of a beam of light can cast a long shadow. The problem is that it’s almost impossible to line up and visualize your cue ball’s or object ball’s path with these shadows in the way. When you’re trying to visualize a shot, a shadow is a whole lot like 15,000 screaming fans waving white towels behind the basketball backboard during a free throw. You can still make the shot, but you’re using far more intuition and “feel” than you’d like in order to do so.

When there’s a shadow on the balls, bad things also happen, especially when the shadow is on the cue ball. Relatively dim incandescent light can cause half of the cue ball to sort of fade into the background when you line up your shot. If you’re not consciously seeing the whole cue ball, you’re not really measuring how thick or thin your hit is going to be with any accuracy. You have to spend a lot of extra time figuring your shot to make sure you’ve accounted for the shadows.

Shadows on object balls are bad for this reason and more. Object balls that are frozen (touching another ball or a rail) can, with shadows on them, appear not to be frozen, or vice versa.

Look – shadows and billiards don’t mix. It doesn’t matter what your favorite game is, shadows will make it your worst game. Shots you’ve made dozens of times with your eyes closed *yesterday* will seem unmakeable today. It’s like nothing I’ve ever experienced, and I urge everyone never to play for money on a table that lacks proper lighting.

Laughing in the face of digital danger

A recent BBC article reports that network security researchers have discovered that over 1,000,000 home PCs have been hijacked by viruses, worms, and trojans and forced to collaborate on bringing down websites, spreading spam, and propagating viruses. These machines are formed into networks of collaborating “zombie” computers called “botnets”, the largest of which was 50,000 computers working in perfect harmony. I laugh at these reports…

Microsoft regularly issues patches to help mitigate the security risk involved in running pretty much any one of their products. That’s in spite of constant lip service indicating that security is the “top priority”. I smile at the quaint, old-fashioned updates, viruses, trojans, botnets, and all that stuff. I haven’t dealt with any of it in years…

Don’t ask me to fix your Windows PC, folks. I don’t run it. I haven’t run Windows in about 6 years now. I have just about everything else, though: a Mac, a few Linux boxes, a Sun Ultra 5, and some other pretty random stuff. It hasn’t held me back — au contraire. I do all of my audio and video work on the Mac, and I do just about everything else under Linux.

Think about it. Does it really make sense to spend however much Windows costs now, then another couple hundred bucks on an office suite (free with any Linux distribution), then several hundred more on graphics editing software (free with any Linux distribution), and then some ungodly amount of time and money trying to keep your pictures, your email, your files, and your machine safe from malicious attacks? I say no.

And have you been to CompUSA lately? There is an ENTIRE aisle DEVOTED ENTIRELY to tools that you have to BUY to work around flaws in Windows… which YOU PAID FOR! This is absolutely mind-blowing. I walked down the aisle and looked at all of the packages, and I have to say that, while some of them were pretty ho-hum, a lot of them were things I had never heard of in my life! There’s something called a “windows cleaner”. It’s a “disk cleaner”. WTF is that?

By the way, did you know that defragging a disk is pretty much a Windows thing? Yeah, I don’t do that. Popups? I haven’t seen one of those in three years! Viruses? The dopey kids who write those don’t know about cross-platform development, so they only work on… you guessed it… Windows. Trojans and worms? They tend to exploit services, which are basically programs that run in the background on your *Windows* box. I’ve never had one under either Linux or the Mac.

To be fair, Linux *can* be made to look just as nasty from a security standpoint as Windows. Wait. Actually… I can’t think of how to do that, really. Um — disregard that. I don’t think that’s possible. Maybe the Mac, though. ;-)

I understand that many, many people *have* to use Windows at work, but at home, I hardly see a reason to use it. I’m not saying “use Linux”, I’m just saying “get the hell away from windoze”. If you have reasons to stick with Windows (besides Dreamweaver and games), list them as a comment here — maybe it’ll be enlightening to others who come here and see that they’re not alone.

AMD64? Why Bother?

First, for the non-technical: AMD is a company. Explaining the “64” opens a whole bucket of technical jargon, and it’s a source of confusion even (er, especially) for those who think they’re technically savvy (I don’t totally exclude myself from that group). Anyway, AMD makes CPUs (just like Intel makes CPUs), and the ones I’ve used are fine, but…

I don’t understand why otherwise intelligent people seem to be spending good, unrecoverable time fiddling with the AMD64. More specifically, I can’t imagine why anyone would buy an AMD64 machine, new, specifically for the purpose of running a Linux desktop system! If you just wanna show off that Linux can run on different hardware, put it on your (by now aging) iPaq or Zaurus. Otherwise, just wait the five years it’ll take for application developers to get around to tuning/fixing their applications to run on that architecture.

This isn’t anything new, by the way, and it’s not specific to AMD. When the first Itaniums came out, even the applications that you’d really *want* to run on a 64-bit platform didn’t work. The Linux distributions that claimed to “work” on IA64 at the time didn’t, really. They were laden with 32-bit compatibility libraries, they were deathly slow, and half the stuff didn’t work right anyway.

It’s now three years or so later, and from what I’m hearing, things haven’t really improved much. Multimedia applications under 64-bit Linux just plain don’t seem to work, judging by the flood of forum and mailing list postings all over the internet, which means the one application of 64-bit computing likely to benefit the end user *today* won’t work either: games.

Games are programs. They are rather large programs which have to process a whole truckload of data on a continual basis during gameplay. Every time you move during a game, the entire world changes in the game. Your position changes, the way everything around you looks changes in relation to your movements, things come in and out of range of your weapons, things that were far away become visible through the fog (or not), things leave and enter the scenery. This is just the tip of the iceberg. It’s a lot of data to crunch. The fastest way for an application to crunch numbers is to keep them in RAM. If the architecture is 64-bit, the amount of RAM that the application can address isn’t just double the 32-bit amount — it’s 2^32 times the 32-bit amount, over four billion times as much! I’ll spare you the details, but if you know binary, it’s not hard to figure out that doubling 2^32 only gets you 2^33, not 2^64 ;-)
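
If you want to see that gap in plain numbers, here’s a quick sanity check of the arithmetic above (nothing fancy, just the two limits side by side):

```python
# Address-space arithmetic: 32-bit vs. 64-bit limits, in bytes.
limit_32 = 2**32            # what a 32-bit program can address: 4 GiB
limit_64 = 2**64            # what a 64-bit program can address: 16 EiB

print(limit_32 // 2**30)        # 4          -> 4 GiB
print(2 * limit_32 == 2**33)    # True       -> doubling 2**32 only gets you 2**33
print(limit_64 // limit_32)     # 4294967296 -> the 64-bit space is 2**32 times bigger
```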

So putting all of that extra data in RAM is far better than putting a tiny fraction of it in RAM and getting the rest from disk or (gasp!) a CD. However, if the underlying multimedia layers don’t work, who cares?

Note that this is all hearsay, but I’m hearing that NVidia and ATI have yet to produce usable video drivers for linux64 (that is, Linux on a 64-bit platform), and that the audio stack (JACK, ALSA, etc.) doesn’t work reliably either.

We’ve come so far with desktop Linux. And I’m glad some people have the patience to do this and help it along and all, but the reasoning escapes me.

I wish them luck, nonetheless.

Why Gmail is kicking butt

I have a beef with free internet mail accounts in general. I have one at mail.com and another at Yahoo.com. Both say they do all they can to make my experience wonderful and eliminate spam, but meanwhile, the only parts of those services that work flawlessly are the ads. Enter Gmail.

Yesterday and today, I couldn’t even log into my mail.com account. However, I was able to log into the beta version of mail.com. Mail.com is the most horrendous email service I’ve ever seen. Back when nettaxi.com offered email, in 1998, it was better — a lot better — than the email that mail.com is providing after some 7 years of technological evolution. That’s poor. Oh – but the ads work wonderfully.

What’s more, when I finally *did* get into the account, there were 37 new messages waiting for me. Every single one was spam, which I guess is OK, since it’s the address I use to sign up for stuff online anyway. But of those 37, 24 had completely unreadable subject lines (foreign character sets), and another 3 or 4 identified themselves as spam and/or “sexually explicit” mail using the standards for doing so.

As for Yahoo, they suffer from the same problem as mail.com. They say they’re trying to protect me from spam, but they’re not.

And why should these companies protect me from spam or *not* deluge me with ads to the point of unusability? The whole point of their existence is to make money, and what you get when you buy an “upgraded” email account is…. better spam protection and NO ADS!!

….or you can just go over to gmail.com and sign up for an account.

I’ve been using it since the beta was launched, which I guess was last summer or something. It’s actually worked pretty much flawlessly. Early on there were a couple of glitches where I couldn’t sign on at all, but I was always able to get in 5 minutes later, which is more than I can say for mail.com, whose service availability record is disgusting.

In addition to being a very solidly available service, if there *are* any ads, they’re unobtrusive enough that I can’t remember them right now. I recommend Gmail. Barring any major unforeseen changes to the service, its unique message-handling features and flexibility, along with a great interface, should keep it the best service around for some time.

Newsflash: Other People Exist

I have to say that I think the level of self-absorption in people has reached new highs, and I can’t help but be somewhat shocked. Yesterday, in the waiting room at the doctor’s office, a woman spent a good 5 solid minutes (a long time to put up with it) endlessly pressing buttons on her cell phone. The phone was apparently configured to beep to confirm each button press. This lady was, I guess, used to it. It drove me absolutely nuts. But was it really the noise that drove me insane, or the fact that this woman was completely oblivious to the fact that the noise might bother others around her? Wake up!

She’s not the only one. A few weeks ago, Natasha and I were grocery shopping, and came across a woman who had positioned herself precisely the right way to stop anyone else from using the aisle at all. It was truly amazing. Her cart was lengthwise across the aisle, and she stood at the end reading a box. I moved the cart… loudly… and we were on our way — only to find her again, in the same exact position, a few aisles later! Wake up!

People need to wake up and realize that, indeed, other people *do* exist.

As everyone knows, I work in computing. I support research computing, and one of the things I do is maintain a Linux Beowulf cluster, which basically lets people do whatever they do on a normal computer, only a lot faster, partly by running a lot more programs at once. One program we run on the cluster has only 150 licenses available for use by some number of thousands of people. I came in to work yesterday morning to find that one person was using… 66 of them! Not only that, he was also using every single resource the cluster had to offer! Inconsiderate. Other people exist!

I decided I want to start my own, grassroots, “jackass” style show where I just keep a digital video camera around to capture people doing these things, and their reactions when I say “excuse me, do you realize that you’re not the only person in the world?”. Stay tuned for that. If you want to help, let me know.

Google lost my website

Well, they didn’t really *lose* my site. What happened is this: I run a site called LinuxLaboratory. Recently, it went through a major overhaul. So major that I didn’t even bother trying to ensure that old links would work. It’s been about 6 weeks now, and I’m noticing that I’m not getting very much traffic. I figured the dropoff was due to broken links on other sites that used to link to me. Wrong.

What really happened is that Google hasn’t indexed my site since the redesign. I’m no expert on this stuff, but I figured that if I put some static HTML links to the site’s content right on the index page, that would help spur things along. Before I did that, all that was there was an expandable JavaScript tree menu, which I don’t think Google will crawl. On the flip side of all this, the search functionality on linuxlaboratory works wonderfully ;-)

NJ Drivers: Kindly hang up your phone

Dear NJ Driver,

It would appear that you are in complete and utter denial of the fact that, while talking on a cell phone, you are unable to pay adequate attention to the task that deserves the higher priority: driving. I will tell you that whether you are a mother, an executive, someone with an overexuberant social life, or otherwise, there is *nothing* so important that it needs to be discussed while driving. What did you do before the advent of the cell phone? What makes you think you’re so damned important and so much better than everyone else that you and you alone should be allowed to talk while driving? If you’re so damned important, then whoever is on the other end of the line will happily wait for you.

By the way, I’m not overlooking the fact that the real fault here lies with the self-important bastards who decided not to make this activity punishable on sight, so they can keep doing it themselves. In NJ, you have to first do something else wrong, like drive over the yellow line, in order to be pulled over. Only then can you be written a ticket for talking on a cell phone. This makes the law pretty much unenforceable 99.99999% of the time, and it’s really unacceptable.

There is not a day that goes by that I don’t see someone doing something completely mindless, only to discover that they were talking on a cell phone when they did it. It used to be that seeing people swerve all over the road incoherently was a sign of a drunk driver. Now it’s a sign that either a teenager is driving or the driver is on the phone — many times it’s both.

Please hang up. You aren’t super-human. No. No you’re not. You can’t do both. It’s not that important. Just stop it. Half of you don’t drive safely when you’re not on the phone. Adding a phone to your driving is just going to make it worse, and make our roads even more unsafe than they already are.

School Daze

So I never thought I’d say this, but school has been kinda fun so far. It’s also been kinda surreal in a way. At the end of the day, I think the best lesson I’ve learned is that, in order to conquer some of my anxieties toward school, I really need to just forget all about high school…

I was a poor student in high school. I’d start out every year doing sort of OK, and then slowly slide into boredom. By the time mid-semester came around, I’d be doing not-so-great, due mostly to missing homework (boring), and then I’d fail the exams. Well, fail enough exams and you’ll get to thinking that maybe you’re not so smart. Poor self-image, poor grades, and all kinds of other generally bad stuff are likely to follow.

By my junior year of high school, I had pretty much convinced myself that I was incapable of doing algebra. I was shuffled into a “business math” course put together to enable those students who were not looking to go to a four-year school to still get enough math credits to graduate. I got an A in that course, but it was little more than basic math applied to business-related word problems.

Now, I’m being absolutely blown away by how well I’m doing in math. I’m working hard, but if that’s all I have to do to do this well, that’s fine with me. I got 100% on my first exam, and when I asked my professor if he thought this class was adequate preparation for a shortened version of the *next* class, he said I should really just skip the next class altogether. Hm. This, he said, was based on the questions I ask and answers I give in class.

It’s unclear to me whether or not moving from college algebra to calculus without first taking pre-calculus is a wise move. I’m undecided. I’ll have to find some method of evaluating myself for this move. I’ll let you know. It’d be nice, because math courses in college don’t really count until you get to calculus.

later.