Bob on Development

February 24, 2007

We’ve Moved!

Filed under: Uncategorized — Bob Grommes @ 6:31 pm

Our traffic has gone through the roof in recent days, so it’s time to get our own domain and move to a hosted solution. This blog is now at bobondevelopment.com. Everything, including historical posts, has moved, so come on over:

New URL: http://bobondevelopment.com

New RSS feed: http://bobondevelopment.com/feed

Thanks for reading!

February 23, 2007

More on “Cheap” Software

Filed under: Management,Projects — Bob Grommes @ 2:53 pm

Last month’s post on this topic has proven very popular.

Recently, Web Worker Daily posted about ways to make money online, and one of the ways listed was to troll the freelance programming marketplaces.

Naturally I couldn’t resist commenting that I could not figure out how anyone can make honest money from those sites!

Warren Seen of Tasmania, Australia mentioned his excellent freelance programmer’s manifesto that I highly recommend you take a look at. I only wish he had called it the freelance software developer’s manifesto. It’s a positioning thing; “programmer” evokes the image of some kind of rote technician or assembly line worker; “software developer” is in my view a much more accurate way to describe what we do. We do not snap together things so much as organically evolve them.

In response to someone’s remark that “people with an eye for quality will always pay a premium for the right kind of service”, Warren also makes the point that “quality as a unique selling point is difficult in software”. How right he is. In a way, your clients have nothing but your word that your $90 an hour efforts have any more value than someone else’s $7 an hour efforts. If this perception were not tripping clients up, there would be no offshore labor market to contend with.

That is why developers must realize they are in a marketing game as much as a technical game. That is one reason I avoid using the term “programmer”. There are clients with “an eye for quality” and you want to attract those kinds of clients. As you get acquainted with the right kind of clients you learn what they like to see. Confidence, honesty, and excellent communication skills (both oral and written) are important.

Another thing I’ve found invaluable is making clients feel in control by educating them in layman’s terms about what I’m doing and why, and what to expect next. This really differentiates you from someone who says “Sure, I can do that, it’s very simple and inexpensive” and then disappears, only to return (usually late) with something that’s not what the client actually wanted, is full of bugs, or turns out to be impossible to maintain.

All of these practices, coupled with best practices, lead to the truly “affordable” software … the stuff that’s on time, works right, is stable and flexible, and delights the customer. There are no shortcuts.

Lastly, to turn this discussion around, let’s look at the wrong kinds of clients and the whole issue of when to say “no” to a project. You can’t afford to waste your time on abusive clients with unrealistic expectations.

Some clients are dead giveaways — like the fellow who wanted to hire me to clone the Google search engine in six months, and then when speaking of compensation asked me “how much lunch money do you need?”

Others are more subtle. Constant pressure to do things before they’re done is a major clue, though. In his excellent post, “Programmers are Brain Surgeons”, Sau Sheong Chang asks the rhetorical question, “if your brain surgeon said some procedure would take five hours, would you pressure him to do it in three?” (He gives credit for this quote to Scott Berkun in The Art of Project Management).

Now I grant you I am not as highly educated and trained as a brain surgeon and the software I write seldom influences life-or-death scenarios. But the point here is not that I’m trying to suggest what we do is as awesome as brain surgery; I’m suggesting that the results of rushed software development and rushed brain surgery will be equally unsatisfactory.

Another similarity between the two professions is that if a brain surgeon encounters unexpected problems he does not stop at five hours and walk away; he spends as much time as necessary to complete the job and overcome any difficulties. Clients who have a cow because you estimated five hours and spent ten on a task just don’t understand what this game is all about. (Naturally, if you always take twice as long as you estimate, you need to quit being so optimistic, but that’s a topic for another day.)

Clients who always push you to do things yesterday either don’t trust you and assume you are always highballing your estimates; or they are trying to take advantage of your desire to please them; or they are just plain clueless and cheap. In any case, they are toxic clients.

“Quality” trumps “cheap” any day … for those clients who can appreciate the difference and those developers who can help them appreciate the difference.

February 17, 2007

Seeing What Your Users See: User Interfaces and Limited Choice

Filed under: Techniques,UI Design — Bob Grommes @ 11:18 am

Raymond Chen’s The Old New Thing is a must-read for anyone creating software that interacts with users. Chen was one of the developers of Windows 95, and the surface reason his book is provocative and interesting is that it exposes a lot of the thinking behind the Windows UI and answers questions that bug people, such as “why do you press the Start button to stop the computer?”

The more compelling value of the book, however, is Chen’s deep experience with how users respond to the user interface. I recently remarked on how a dialog box on the United Parcel Service web site is probably ignored by most users. Chen explains why. In the first chapter of his book, he has the following priceless mock dialog that shows what a user perceives when a dialog interrupts their work:

What a user sees when they read your dialogs.

Chen does not point this out to make fun of users. As he puts it, the problem is that “all the user wants to do is surf the web and send emails to their grandchildren.” The dialog is in the way of that, so it gets swatted away like an annoying fly.

This is a blow to the pride of many developers, who easily forget that their software is not the center of its user’s universe. It is simply a tool, a means to an end.

Developers throw up dialogs because it’s generally easier to let users make decisions than to figure out what needs to be done at a given moment. It is also easier than designing software that is forgiving, resourceful and resilient.

In my original story, I explained that if a UPS shipper specifies a shipment value over some threshold amount, a dialog appears explaining in some detail that because of the shipment’s high value they must obtain an acceptance signature for the shipment from an actual UPS employee. If the shipper doesn’t do this, any loss claims will be invalid. I opined that most users will never register this info, will swat the dialog aside, and buy worthless insurance.

How might this have been better implemented? I would have put it right in the workflow of the wizard that was managing the shipment, rather than interrupt with a notification and an OK button. Then I would give them the choice of reducing the valuation, requesting a pickup, or locating a company store to drop off at — all without losing the context of what they are working on, mysteriously completing the shipment, or otherwise doing anything unexpected. In this case, I’d require more decisions from the user, because it’s the only way to ensure they are not just throwing money down a hole.
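Here’s a rough sketch of the shape I have in mind. It’s only an illustration in VB.NET: the GetHighValueOptions name, the option list, and the $1000 threshold are all made up, not anything from UPS’s actual code. But it shows the high-value rule becoming an ordinary wizard step that returns a set of choices to render inline, rather than a modal alert:

    ' A rough sketch only; the names and threshold here are hypothetical.
    Module HighValueShipmentSketch

        Const HighValueThreshold As Decimal = 1000D

        Enum HighValueOption
            ReduceDeclaredValue
            RequestPickup
            FindDropOffLocation
        End Enum

        ' Called as an ordinary wizard step: returns the options the page should
        ' render inline (as radio buttons, say) instead of interrupting the user
        ' with an OK-only alert.
        Function GetHighValueOptions(ByVal declaredValue As Decimal) As HighValueOption()
            If declaredValue <= HighValueThreshold Then
                Return New HighValueOption() {}  ' nothing special; the wizard just continues
            End If
            Return New HighValueOption() { _
                HighValueOption.ReduceDeclaredValue, _
                HighValueOption.RequestPickup, _
                HighValueOption.FindDropOffLocation _
            }
        End Function

        Sub Main()
            For Each opt As HighValueOption In GetHighValueOptions(1650D)
                Console.WriteLine(opt)
            Next
        End Sub

    End Module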

In most cases, though, it’s better to take some reasonable default action and say nothing at all. For example, usually if a user closes an open document without saving, they meant to save the changes up to that point, so at the very least, the default choice, if you present one at all, should be to save the changes. Instead what we see in most applications is a dangerous default such as not saving changes, or backwards questions that ask if you want to discard changes rather than keep them. Remember, the user isn’t really reading this, so if they just hit the Enter key you probably should err on the side of caution, and not risk throwing away the last forty minutes of work.
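To put that in concrete (if simplified) WinForms terms, here’s a little sketch where “save” is the button the Enter key lands on. The dirty flag and the SaveDocument() call are stand-ins for whatever your application really does:

    ' A minimal WinForms sketch: if the document is dirty when the form closes,
    ' the safe default (the Enter key) is "save", not "discard".
    ' _isDirty and SaveDocument() are hypothetical stand-ins.
    Imports System.Windows.Forms

    Public Class DocumentForm
        Inherits Form

        Private _isDirty As Boolean = True   ' pretend there are unsaved changes

        Private Sub DocumentForm_FormClosing(ByVal sender As Object, _
                                             ByVal e As FormClosingEventArgs) _
                                             Handles Me.FormClosing
            If Not _isDirty Then Return

            ' "Yes" (save) is the default button, so a user who just hits Enter
            ' without reading the dialog keeps their work.
            Dim answer As DialogResult = MessageBox.Show( _
                "Save changes to this document?", "Closing", _
                MessageBoxButtons.YesNoCancel, MessageBoxIcon.Question, _
                MessageBoxDefaultButton.Button1)

            Select Case answer
                Case DialogResult.Yes
                    SaveDocument()
                Case DialogResult.Cancel
                    e.Cancel = True          ' stay in the document
                Case DialogResult.No
                    ' the user explicitly chose to discard; let the close proceed
            End Select
        End Sub

        Private Sub SaveDocument()
            _isDirty = False                 ' real persistence would go here
        End Sub
    End Class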

Come to think of it, this is why auto-save features were probably born; I have long been in the habit of saving my work every minute or two, but not all users have that healthy paranoia; a configurable auto-save feature, turned on by default, does that for them.
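Something along these lines is all it takes; the two-minute interval and the SaveDocument() call are placeholders, of course:

    ' A tiny sketch of a configurable auto-save that is on by default.
    Public Class AutoSaver
        Private WithEvents _timer As New System.Windows.Forms.Timer()

        Public Sub New(Optional ByVal intervalMinutes As Integer = 2)
            _timer.Interval = intervalMinutes * 60 * 1000
            _timer.Enabled = True            ' on by default; expose a setting to turn it off
        End Sub

        Private Sub OnTick(ByVal sender As Object, ByVal e As System.EventArgs) Handles _timer.Tick
            SaveDocument()                   ' save quietly; the user never has to think about it
        End Sub

        Private Sub SaveDocument()
            ' the application's real persistence call goes here
        End Sub
    End Class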

Users usually don’t want all the nifty choices we think to offer them. Often, these choices cause more harm than good. They should be absent, or hidden behind some kind of “expert” mode.

Think through the features in your user interaction and make them as free of “friction” and fluff and endless choices as you can. Let the user get the critical path of their work done with ease. Save your “interruption capital” for those times when there is something the user truly needs to stop and make a decision about … a decision that matters to the user.

February 14, 2007

What Kind of Software Will You Be Developing in Ten Years?

Filed under: Products — Bob Grommes @ 7:13 am

I try to read the tea leaves and determine where software development is headed over the medium-term, so I have some idea how to build my skills. Once in a while I try to stretch outside the current box to see where things are heading, because every few years there is some kind of paradigm shift. For example, in 1990 it would have been difficult to predict that by 2000 my software development efforts would transition from 100% desktop applications to about 90% web-based applications.

Although not a hardware alpha geek, I’ve long been interested in the possibilities of practical robotics. Our household, for example, was an early adopter of the Roomba vacuum cleaning robot. Alas, we found it just a bit too glitchy and not that durable, even though our all-tile home is perfect for the device. For whole-house cleaning, by the time you lay down all the infra-red “fencing” and occasionally rescue the thing from getting wedged under couches or help it out of close quarters that confuse its collision sensors, plus empty, clean, and recharge it, it’s just as much bending and stooping and vigilance as doing the job the old-fashioned way.

But today I stumbled onto the LEGO Mindstorms, a truly programmable kit robot that is pushing the state of the art of what can be built in the way of interesting and flexible, if not yet useful, robots. For a few hundred dollars (for the base model and a few optional add-ons) you get a highly modular, Bluetooth-enabled robot that understands compass directions, has crude vision (just light and color, no pattern matching), basic touch and auditory sensors, and “walks” on two “legs”. Client software for Mac or PC allows simplified graphical programming. It is still only a toy, but it is likely a harbinger of things to come, and I would not be at all surprised to find myself, ten or twenty years hence, customizing the behaviors of people’s servant robots.

If LEGO can produce this much functionality starting at about $250 US, what is already available for a few thousand? And how soon will voice-controlled quasi-humanoid robot companions be doing useful work for and with us? I suspect robots will eventually be a whole new platform for software development. For the truly interested, LEGO has even released the robot’s low-level operating system as open source software. If you want a little taste of the future without a lot of cash outlay, this may be one way to get it, although personally I plan to wait for something just a little more compelling to come along.

What I really want is a robot who will fetch me a glass of milk and feed the parrots. That’s a little beyond LEGO’s current product vision, but I’m starting to think it’s not exactly science fiction anymore. And it has not escaped the notice of Microsoft, either. How many of you out there knew that Microsoft is already heavily into this game? I give you Microsoft Robotics Studio.

Emerging standards … Korea stating they want a “robot in every home by 2013” … Bill Gates crowing about this being a nascent new industry … hmm ….

February 12, 2007

Getting Your Feet Wet With VMWare

Filed under: Products — Bob Grommes @ 7:53 pm

If I’ve learned anything about Microsoft operating systems, it’s that your biggest enemy is what I call “system lint” — the accumulation of detritus that gets stuck in the plumbing when you do too many installs / uninstalls / upgrades. My last development box, a Dell Dimension that’s now pushing four years old, was beginning to suffer this fate, and nothing dismays me more than spending many tedious hours installing dozens of packages and updates and add-ons to get to a usable system, moving all my settings and data over, etc.

Because I’m not a sysadmin type of geek, I tend to buy well-endowed, bleeding-edge machines to stave off obsolescence, and then keep them for a long time. I also try to keep the installed suite of software as simple and stable as I can, including Visual Studio and named instances of SQL Server 2000 and 2005 to support all my current projects. The new hardware is a Falcon Northwest 2.66 GHz Core Duo (conservatively overclocked to 2.87 GHz) with 4G of RAM and 700G of storage in a RAID5 configuration. By the time my next acquisition cycle comes around in about 2010 or 2011, this computer will be looking lame compared to the 16-core boxes that will likely be typical by then, but it should serve my purposes just the same.

Alas … one of my new clients writes software that integrates with TimberLine Accounting Software, which means I am faced with loading up my machine with icky things like Pervasive Database and Timberline itself (likely in multiple versions each for different clients), the ponderous COM-based API for talking to TimberLine from external applications, and (so far) three third-party libraries, with a fourth on the way.

I decided it was high time to get my feet wet with virtual machines (VMs). Yeah, I know, I’m a bit behind the times; I’m perfectly aware that virtualization has been hot, hot, hot during the past year or so. I have a couple of colleagues using VMs already. But I find that waiting until there is a compelling need usually means that (1) I have enough motivation to work through the pain of the learning curve and (2) the technology has had time to mature a bit.

Being a small business, I use a fairly unscientific product evaluation and testing process: the product under consideration must be affordable, and must Just Work. If none of the available options can Just Work then I’d better have a really compelling need to put up with the least of the available evils.

First out of the gate was Parallels, the newer kid on the block. In the Mac world, Parallels is the only virtualization technology for running Windows on top of Mac OS X and it works, by all accounts, quite well. Parallels Desktop is the version for hosting guest operating systems on the PC, and it’s a free download.

I found the Parallels install simple and intuitive, but the first thing I did with it was to create a VM with 1G of RAM and try a fresh install of Windows 2003 on it. It hung during the part of the install where drivers are loaded. I emailed tech support about this issue and never heard back from them. So much for Parallels. Brutal, I know, but it’s the way I do these things.

Next up was VMWare, on the theory that it’s a more mature product, and because I just heard about their ability to copy physical machines to virtual machines. One of the things I wanted to do was to take the basic XP SP2 install I already had on my dev machine and clone it to a VM — the idea is to have a VM called “BaseXP” and then copy that whenever I needed an XP box to handle something funky, such as the particular combination of Pervasive, Timberline and API for a particular end-user installation. And this new VMWare utility promised to be able to do that. If it works, I should be able to fire up a VM for the client I’m currently working on, and have a test environment for that particular combination of esoteric products. If something goes wrong with the setup in one of those VMs, I simply go back to a copy of BaseXP and start over … no need to reinstall Windows, Office, Visual Studio, or any of my other basics.

VMWare is Balkanized into several products and it took a few minutes of studying to figure out what I needed for my purposes (all of which, thankfully, was a free download, so long as I can content myself with self-support via the knowledge base). The components I needed were VMWare Server and VMWare Converter.

VMWare Server is an enormous 150 meg download, but the installation is very straightforward (don’t forget to grab as many free serial #s as you need before starting the download — the link is right on the download page). Next I ran the much smaller and equally simple VMWare Converter install, and fired up the Converter.

The free version of Converter is more than enough for the small shop. Its paid version is mostly for those in large enterprises who want to generate multiple VMs at a time.

My physical machine already occupied 50G of space and had quite a few apps installed, so I was frankly skeptical this was going to work. But I fired it up, answered a few questions, told it what I needed my VM to look like (1G of RAM, 100G of hard drive space since I work with large files, bridged networking) … and sent it on its way.

What happened took nearly seven hours, but was otherwise nothing short of amazing. Without disturbing my ability to run other software while it churned away, Converter built my VM, including its virtual hard disk file, and the sucker JUST WORKED.

This morning when I booted my VM I saw the familiar Windows boot logo. The number of unexpected things was non-zero, but manageable: Windows and Office both needed to be activated online because of “extensive hardware changes” — both operations took seconds, no questions asked, and both the new and old (virtual and physical) installations still work and seem properly licensed. (If there had been licensing problems I was equipped with extra XP and Office licenses courtesy of my Microsoft Action Pack subscription.)

Also, although I’d specified 1G of RAM for this VM, it ended up getting set up with 2.5G — I had to manually change it back to what I wanted. I also had to restate my desire for a static IP for the VM. Once I did that, Internet access worked and I could browse my local network fine. In fact I mounted an ISO file as a CD across the network to install the VMWare Tools into the VM — this is a suite of drivers and utilities that optimizes performance for the guest OS you’re running. For example, once those were installed I was able to increase the virtual display size up to and including full screen on my 2560 x 1600 monitor.

The first time you boot a Windows XP VM it’s a little confusing because various programs are run automatically to complete the configuration and there is a short delay before that happens. There are a couple of reboots of the VM, and then you’re up and running.

Think of it: Converter performed the miracle of copying a running machine including its installed applications onto another “machine” such that it’s runnable. For free. That’s truly useful and remarkable.

I pared down my BaseXP VM, removing a few apps I don’t want in there and copies of data that I don’t care to replicate. It should be ready for action now.

VMWare lets you start out with a small virtual disk file that grows dynamically, but you have to assign a maximum drive size and you are never allowed to increase it. If you run out of space you must attach another virtual drive or use Converter to move over to a VM with more space. And of course you need all this space available on your host’s hard drive for the virtual disk drives you create.

This is looking very promising. I’ll report in this space on any other challenges as I replicate my “BaseXP” VM, and over time as I create VMs for Windows Server and probably a Linux distro or two … and yes, even a Vista installation one day. With Converter the way it is, I may well eventually convert my physical XP to a VM, install some Linux distro as my host OS so I can see all of my 4G of RAM instead of just 3G of it, and run all Windows instances as guests within Linux. But I’ll have to build some confidence in VMWare first, and get a sense of how adequate the VM performance will be.

February 9, 2007

What PayPal, eBay and UPS Can Teach Us About How NOT to Treat Online Customers

Filed under: Management,Products — Bob Grommes @ 4:11 pm

Two starkly contrasting things happened to me today. One of them makes me want to ram my head repeatedly into the wall, and the other one gives me faint but distinct hope that mankind is not about to be drawn into a bottomless suck-hole of mediocrity and indifference. In these experiences are some lessons for software developers designing customer-facing web sites, as well as some insight into how the devil such abominations can possibly be so common.

First, the head-ramming experience.

It started out simply enough. I had an LCD monitor to sell. I will spare you the details, but with my wife’s help (she’s been through all this before) we got the listing up on eBay and after a mysterious delay of a few hours, it actually showed up in eBay’s search engine, and was purchased by someone using “Buy It Now”.

So … item sold, the money is in our account, the item is packaged … story over, right? Ah, it was just beginning.

First, there is a feature integrated into PayPal where you can buy a UPS shipping label for this item. Great. So you enter the weight, the dimensions, and the value. The value I entered was $1650. The resulting error message: “The insurance valuation you’ve entered exceeds the maximum allowed.” No indication what the maximum would be, or how this could possibly be a problem since UPS will let you buy up to $50K of insurance if you’re so inclined.

I played around with different values (with and without the cents places) and figured out that anything over $1000 is apparently too much, but $1000 or less results in the message that “You have entered an invalid value”. Then it all came back to me that I’d sold something on eBay maybe 8 or 10 months ago and ran into this VERY SAME BUG which, incredibly, is STILL not fixed. Even though my customer had paid for insurance, I couldn’t actually insure the package.

So I did now what I did back then … said the heck with it, and went directly to ups.com. There, I was presented with a verbose JavaScript alert() (the kind that most end users just cancel and don’t even read) that said, in effect, that because I had such a high valuation on the shipment it had to be handed personally to a UPS employee and signed for — which gave me a choice between a 15-mile trip to the nearest UPS store or requesting a pickup. No problem … I’ll request a pickup. Sorry, no same-day service — they can’t pick up until Monday. And now I can’t back out to un-request the pickup.

In the end I voided the shipment and started all over, but now, even though one of my two attempts shows in my shipment history as voided, they’ve still charged my credit card twice. Ah, I remember this happening before, too … it will get credited, eventually, in a couple of weeks. Still, it’s on my calendar to check up on this in a while.

As a humorous coda to this whole thing, I phoned UPS to ask a human being whether I could get same-day pickup, as I’m sure I’ve done in the past. The voice mail system took my name, address and tracking number, which, of course, the human had to ask me for anyway (why do 95% of voice mail systems ask you for information they don’t give to the person you end up talking to anyway??!?!). But the recording did say I could pay extra for same-day pickup. Then the human told me, no, the earliest they could come was Monday, and would that be okay? I asked if there wasn’t a fee I could pay for same-day pickup. “Oh,” she said — “if you want same day pick up, you have to call the day before!”

At this priceless line I dissolved into manic laughter. I doubt the operator knew what I was laughing about. She seemed peeved. But I felt better, in a perverse sort of way.

So here we have three soulless corporations — er, multinational enterprises — all of which are doing their part to make what should be the simplest part of my whole customer experience a major pain in the touche. How can this possibly happen?

Let’s take the three-paragraph JavaScript alert() about the need for a UPS employee signature. Now … I’ve paid for insurance … what they are telling me is that the insurance is worthless if I don’t personally hand the shipment to one of their employees. Although I suspect they’d be glad to charge me for it anyway. Somehow I doubt there’s a business rule that says to refund the insurance fee to me if the value is over $X and they didn’t get a proper acceptance signature.

Is this corporate malfeasance, or just something that was bolted onto their web app in a rush one day and no one ever got around to fixing it? You be the judge. I suspect it’s somewhere in between: there is no financial motivation to do a better job, and some motivation to leave it the way it is, whether or not it was initially intended to increase net insurance revenue.

Okay … I promised you a contrast to this depressing incompetence.

I do some work for a business-to-consumer site. This site has a partner that processes online car finance applications for them. Today this partner called and said he’d noticed our application volume was down, had taken a look at our application form, and it was asking for more info than they really needed. Perhaps, he suggested, if we streamlined the form and reduced the “friction” for our visitors, more people would complete the form? He pointed me to an example form on his own site.

I looked into it and, sure enough, our form had been created for a bank we no longer partner with, and all our remaining loan partners have less stringent requirements. We can in fact make it simpler for our users.

Granted … the call was motivated at least in part by the self-interest of the finance partner. They want more loan applications, so they can approve more loans, and make more money. But this guy seems to understand what eBay and its partners — PayPal and UPS — do not: that I will remember the pain of today’s experience with them long after I’ve forgotten the particulars. This will get filed in my brain under “don’t go there” and I will try very hard to avoid using eBay in the future, especially since I only have an occasional need and no motivation to learn all the little warts and hiccups in their cranky little system.

All of this falls under the general heading of “usability”. Nothing about eBay, PayPal, UPS or my customer’s own loan application pages is exactly broken, in the sense that it works for most people most of the time, such that these companies are making money even while they are making enemies. But the user experience still sucks. In the case of my client, we just haven’t re-evaluated old code in awhile. In the case of eBay and Friends, it’s probable that there is a whole comedy of errors behind the glitches and frustrations I encountered. But in all these cases no one is regularly asking whether the user experience is smooth and as simple as it can be.

February 7, 2007

Do You Use Language-Specific Features That Replicate Provided .NET Functionality?

Filed under: VB.NET — Bob Grommes @ 11:00 am

I have just embarked on a VB.NET project … not normally my first choice, but not a huge deal either. Some day when I’m feeling contentious I may dive into my take on the whole perennial C# vs VB.NET thing, but for now, I want to focus on just one limited facet of that debate.

C# was a new language in 2002, introduced with .NET. At that point it had zero backward-compatibility baggage … an advantage that has, by definition, been gradually oozing away ever since, but one that still basically holds true.

On the other hand VB.NET, while it is a substantial departure from VB6 and its predecessors, still does hand-stands to provide a comfort zone to legacy VB developers. So you have, for instance, the Left(), Right() and Mid() functions in the language. C# lacks these because it doesn’t need them; String.Substring() handles all that just fine. Nothing prevents you from using String.Substring() in VB.NET, but nothing encourages you to do so, either.
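To make the comparison concrete, here’s a quick side-by-side in VB.NET, with the legacy functions next to the framework calls that do the same work:

    ' The legacy VB functions (from the Microsoft.VisualBasic namespace) next to
    ' the equivalent BCL calls.  Note the legacy functions are 1-based and
    ' Substring() is 0-based.
    Module StringFunctionComparison
        Sub Main()
            Dim s As String = "Bob on Development"

            ' VB6-style functions, still available in VB.NET:
            Console.WriteLine(Left(s, 3))                 ' "Bob"
            Console.WriteLine(Right(s, 11))               ' "Development"
            Console.WriteLine(Mid(s, 5, 2))               ' "on"  (1-based start)

            ' The same results using String.Substring(), readable from any .NET language:
            Console.WriteLine(s.Substring(0, 3))          ' "Bob"
            Console.WriteLine(s.Substring(s.Length - 11)) ' "Development"
            Console.WriteLine(s.Substring(4, 2))          ' "on"  (0-based start)
        End Sub
    End Module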

I’m still trying to decide whether this is a Good Thing or not. Maybe in the Great Scheme of Things it isn’t that big of a deal. IronPython, for instance, or Eiffel.NET, or COBOL.NET will certainly have to preserve legacy structures that duplicate .NET functionality, and they should, or else getting legacy code ported to .NET would be more of a pain than it’s worth.

The thing is, something offends my sense of order when these features are used in brand new code. To me, the key to working in .NET is to know the .NET class libraries (or API, if you will). If you know them, you can write code in any .NET language and anyone else who knows the libraries should be able to understand it.

In addition, legacy language constructs tend to have their own little odd behaviors and ways of handling errors and boundary conditions differently. Drill down into the details of, for instance, CType() vs DirectCast() and you’ll find that DirectCast() is the exact equivalent of the C# cast operator, whereas CType() is not exactly the same thing … it has certain implicit behaviors; that’s why DirectCast() got the name that it has.
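Here’s a small, contrived illustration of that difference:

    ' DirectCast() only succeeds when the runtime type already matches (like a C#
    ' cast on reference types or unboxing); CType() will also invoke conversions.
    Module CastComparison
        Sub Main()
            Dim boxed As Object = 42                       ' an Integer boxed as Object

            Dim a As Integer = DirectCast(boxed, Integer)  ' OK: unboxing the exact type
            Dim b As Integer = CType(boxed, Integer)       ' OK: same result here

            Dim d As Object = 42.7                         ' a Double boxed as Object
            ' Dim bad As Integer = DirectCast(d, Integer)  ' throws InvalidCastException:
            '                                                the object is a Double, not an Integer
            Dim converted As Integer = CType(d, Integer)   ' OK: CType converts, yielding 43

            Console.WriteLine("{0} {1} {2}", a, b, converted)   ' prints: 42 42 43
        End Sub
    End Module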

My thinking is that if you see DirectCast() used in VB.NET code, you know exactly what it does, whether or not you have been a VB jockey all your life. If on the other hand you’re an old pro at VB, you may or may not be fully aware of the nuances of how CType() works. Legacy VB was so forgiving, and that was one of its virtues. If you run with certain compile-time options, it still is. Which brings up another point: I haven’t verified it, but I would be surprised if your use of Option Strict or Option Explicit influences the behavior of calls into non-legacy BCL methods. Just one less variable to deal with.

I suppose it all boils down to how much of a religion you want to make out of VB vs. .NET. Clearly, it’s up to you, the developer, how broad or parochial a view you want to take of these things. My personal tendency is to avoid legacy language constructs except when they aid porting. It generally makes perfect sense to disturb legacy code as little as possible. Other than that circumstance, though, why not speak a more universally-understood dialect and be a more congenial member of the .NET technology world?

On the other hand if you work with a VB-centric shop and you introduce a bunch of constructs that are unfamiliar to them, you may encounter Political Problems and then you may have to pick your battles.

It will be interesting to see what kind of reaction I get to my .NET-centric maintenance work on this non-legacy project which nonetheless is heavy on legacy VB constructs. I figure that it’s easier to ask forgiveness than to get permission, but if it ruffles too many feathers, I’m prepared to dust off my old VB skills and backtrack a bit.

February 6, 2007

Project Estimation Made Plain

Filed under: Management — Bob Grommes @ 5:38 am

Okay, I’ve tried to get this across in previous posts, but as usual, Bob Lewis says it much better than I could ever hope to. Read his post, and get wisdom!

For those in a hurry, the key paragraph is reproduced below:

  • Projects of any size and scope should always be about helping one or more parts of the business operate differently, not about delivering software that meets specifications. That being the case, the whole process of collecting software requirements from various stakeholders and reconciling it is bogus. Start the conversation by asking how the business should operate and everything about what follows changes. In particular, software requirements stop being a matter of finding compromises among various statements of “I want” and start being a simple account of the role software plays in new or changed business processes.

February 2, 2007

Looking for Vista Drivers?

Filed under: Products — Bob Grommes @ 3:47 pm

RadarSync has created a web site aggregating the latest Vista drivers from a number of vendors. If you’re hunting for Vista device drivers, try this site first.

February 1, 2007

Vista: Resistance is Futile?

Filed under: Products — Bob Grommes @ 11:22 am

Despite my earlier rant about Vista, we are about to acquire the first Vista box in our household. My wife’s old Dell is so full of XP lint that we are faced with a clean re-install of XP, which hardly seems worth the effort on a four-year-old box. So when her shiny new 2.66 GHz dual-core machine with 2G of RAM and the latest nVidia card with DirectX 10 shows up in a couple of weeks, it will come with 32-bit Vista Ultimate factory-installed.

This is how most people will get Vista; six months from now I doubt very much that anyone is going to offer factory-installed XP. Personally I’m waiting for SP1 and stable 64-bit drivers before even thinking about upgrading my working dev machine. But this seemed like a good opportunity to let the manufacturer worry about getting the basic system working.

One important thing to do before migrating is to check that all your critical apps and drivers are known to work under Vista. In our case, the only hold-out is the driver for our Epson scanners, and Epson expects to release updated drivers for our specific models in the next couple of weeks. Since we can move her to the new box incrementally, that’s fine.

Another advantage of doing this is I’ll get some Vista experience and have a limited Vista test-bed, should I end up needing it for some reason.

Still not sure what to do? Web Worker Daily just posted a really balanced and nuanced pros and cons discussion — have a look.

