Bob on Development

February 14, 2007

What Kind of Software Will You Be Developing in Ten Years?

Filed under: Products — Bob Grommes @ 7:13 am

I try to read the tea leaves and determine where software development is headed over the medium term, so I have some idea how to build my skills. Once in a while I try to stretch outside the current box to see where things are heading, because every few years there is some kind of paradigm shift. For example, in 1990 it would have been difficult to predict that by 2000 my software development efforts would transition from 100% desktop applications to about 90% web-based applications.

Although I'm not a hardware alpha geek, I've long been interested in the possibilities of practical robotics. Our household, for example, was an early adopter of the Roomba vacuum-cleaning robot. Alas, we found it just a bit too glitchy and not that durable, even though our all-tile home is perfect for the device. For whole-house cleaning, by the time you lay down all the infra-red "fencing", occasionally rescue the thing from getting wedged under couches or help it out of close quarters that confuse its collision sensors, and then empty, clean, and recharge it, you've done just as much bending, stooping and supervising as the old-fashioned way would require.

But today I stumbled onto the LEGO Mindstorms, a truly programmable kit robot that is pushing the state of the art of what can be built in the way of interesting and flexible, if not yet useful, robots. For a few hundred dollars (for the base model and a few optional add-ons) you get a highly modular, Bluetooth-enabled robot that understands compass directions, has crude vision (just light and color, no pattern matching), basic touch and auditory sensors, and "walks" on two "legs". Client software for Mac or PC allows simplified graphical programming. It is still only a toy, but it is likely a harbinger of things to come, and I would not be at all surprised to find myself, ten or twenty years hence, customizing the behaviors of people's servant robots.

If LEGO can produce this much functionality starting at about $250 US, what is already available for a few thousand? And how soon will voice-controlled quasi-humanoid robot companions be doing useful work for and with us? I suspect robots will eventually be a whole new platform for software development. For the truly interested, LEGO has even released the robot’s low-level operating system as open source software. If you want a little taste of the future without a lot of cash outlay, this may be one way to get it, although personally I plan to wait for something just a little more compelling to come along.

What I really want is a robot who will fetch me a glass of milk and feed the parrots. That’s a little beyond LEGO’s current product vision, but I’m starting to think it’s not exactly science fiction anymore. And it has not escaped the notice of Microsoft, either. How many of you out there knew that Microsoft is already heavily into this game? I give you Microsoft Robotics Studio.

Emerging standards … Korea stating they want a "robot in every home by 2013" … Bill Gates crowing about this being a nascent new industry … hmm …


February 12, 2007

Getting Your Feet Wet With VMWare

Filed under: Products — Bob Grommes @ 7:53 pm

If I've learned anything about Microsoft operating systems, it's that your biggest enemy is what I call "system lint" — the accumulation of detritus that gets stuck in the plumbing when you do too many installs / uninstalls / upgrades. My last development box, a Dell Dimension that's now pushing four years old, was beginning to suffer this fate, and nothing dismays me more than spending many tedious hours installing dozens of packages and updates and add-ons to get to a usable system, moving all my settings and data over, etc.

Because I'm not a sysadmin type of geek, I tend to buy well-endowed, bleeding-edge machines to stave off obsolescence, and then keep them for a long time. I also try to keep the installed suite of software as simple and stable as I can, including Visual Studio and named instances of SQL Server 2000 and 2005 to support all my current projects. The new hardware is a Falcon Northwest 2.66 GHz Core Duo (conservatively overclocked to 2.87 GHz) with 4G of RAM and 700G of storage in a RAID 5 configuration. By the time my next acquisition cycle comes around in about 2010 or 2011, this computer will be looking lame compared to the 16-core boxes that will likely be typical by then, but it should serve my purposes just the same.

Alas … one of my new clients writes software that integrates with TimberLine Accounting Software, which means I am faced with loading up my machine with icky things like Pervasive Database and Timberline itself (likely in multiple versions each for different clients), the ponderous COM-based API for talking to TimberLine from external applications, and (so far) three third-party libraries, with a fourth on the way.

I decided it was high time to get my feet wet with virtual machines (VMs). Yeah, I know, I’m a bit behind the times; I’m perfectly aware that virtualization has been hot, hot, hot during the past year or so. I have a couple of colleagues using VMs already. But I find that waiting until there is a compelling need usually means that (1) I have enough motivation to work through the pain of the learning curve and (2) the technology has had time to mature a bit.

Being a small business, I use a fairly unscientific product evaluation and testing process: the product under consideration must be affordable, and must Just Work. If none of the available options can Just Work then I’d better have a really compelling need to put up with the least of the available evils.

First out of the gate was Parallels, the newer kid on the block. In the Mac world, Parallels is the only virtualization technology for running Windows on top of Mac OS X, and it works, by all accounts, quite well. Parallels Desktop is the version for hosting guest operating systems on the PC, and it's a free download.

I found Parallels' install simple and intuitive, but the first thing I did with it was create a VM with 1G of RAM and try a fresh install of Windows 2003 on it. It hung during the part of the install where drivers are loaded. I emailed tech support about the issue and never heard back from them. So much for Parallels. Brutal, I know, but it's the way I do these things.

Next up was VMWare, on the theory that it's a more mature product, and because I had just heard about its ability to copy physical machines to virtual machines. One of the things I wanted to do was take the basic XP SP2 install I already had on my dev machine and clone it to a VM — the idea being to have a VM called "BaseXP" and then copy that whenever I needed an XP box to handle something funky, such as the particular combination of Pervasive, Timberline and API for a particular end-user installation. And this new VMWare utility promised to be able to do just that. If it works, I should be able to fire up a VM for the client I'm currently working on and have a test environment for that particular combination of esoteric products. If something goes wrong with the setup in one of those VMs, I simply go back to a copy of BaseXP and start over … no need to reinstall Windows, Office, Visual Studio, or any of my other basics.

VMWare is Balkanized into several products and it took a few minutes of studying to figure out what I needed for my purposes (all of which, thankfully, was a free download, so long as I can content myself with self-support via the knowledge base). The two pieces I ended up needing were VMWare Server and VMWare Converter.

VMWare Server is an enormous 150 meg download, but the installation is very straightforward (don’t forget to grab as many free serial #s as you need before starting the download — the link is right on the download page). Next I ran the much smaller and equally simple VMWare Converter install, and fired up the Converter.

The free version of Converter is more than enough for the small shop. Its paid version is mostly for those in large enterprises who want to generate multiple VMs at a time.

My physical machine already occupied 50G of space and had quite a few apps installed, so I was frankly skeptical this was going to work. But I fired it up, answered a few questions, told it what I needed my VM to look like (1G of RAM, 100G of hard drive space since I work with large files, bridged networking) … and sent it on its way.

What happened took nearly seven hours, but was otherwise nothing short of amazing. Without disturbing my ability to run other software while it churned away, Converter built my VM, including its virtual hard disk file, and the sucker JUST WORKED.

This morning when I booted my VM I saw the familiar Windows boot logo. The number of unexpected things was non-zero, but manageable: Windows and Office both needed to be activated online because of "extensive hardware changes" — both operations took seconds, no questions asked, and both the new and old (virtual and physical) installations still work and seem properly licensed. (If there had been licensing problems I was equipped with extra XP and Office licenses courtesy of my Microsoft Action Pack subscription.)

Also, although I'd specified 1G of RAM for this VM, it ended up getting set up with 2.5G — I had to manually change it back to what I wanted. I also had to restate my desire for a static IP for the VM. Once I did that, Internet access worked and I could browse my local network fine. In fact, I mounted an ISO file as a CD across the network to install the VMWare Tools into the VM — a suite of drivers and utilities that optimizes performance for the guest OS you're running. For example, once those were installed I was able to increase the virtual display size up to and including full screen on my 2560 x 1600 monitor.

The first time you boot a Windows XP VM it’s a little confusing because various programs are run automatically to complete the configuration and there is a short delay before that happens. There are a couple of reboots of the VM, and then you’re up and running.

Think of it: Converter performed the miracle of copying a running machine including its installed applications onto another “machine” such that it’s runnable. For free. That’s truly useful and remarkable.

I pared down my BaseXP VM, removing a few apps I don’t want in there and copies of data that I don’t care to replicate. It should be ready for action now.

VMWare lets you start out with a small virtual disk file that grows dynamically, but you have to assign a maximum drive size and you are never allowed to increase it. If you run out of space you must attach another virtual drive or use Converter to move over to a VM with more space. And of course you need all this space available on your host’s hard drive for the virtual disk drives you create.

This is looking very promising. I'll report in this space on any other challenges as I replicate my "BaseXP" VM, and over time as I create VMs for Windows Server and probably a Linux distro or two … and yes, even a Vista installation one day. With Converter working the way it does, I may well eventually convert my physical XP to a VM, install some Linux distro as my host OS so I can see all of my 4G of RAM instead of just 3G of it, and run all Windows instances as guests within Linux. But I'll have to build some confidence in VMWare first, and get a sense of how adequate the VM performance will be.

February 9, 2007

What PayPal, eBay and UPS Can Teach Us About How NOT to Treat Online Customers

Filed under: Management,Products — Bob Grommes @ 4:11 pm

Two starkly contrasting things happened to me today. One of them makes me want to ram my head repeatedly into the wall, and the other one gives me faint but distinct hope that mankind is not about to be drawn into a bottomless suck-hole of mediocrity and indifference. In these experiences are some lessons for software developers designing customer-facing web sites, as well as some insight into how the devil such abominations can possibly be so common.

First, the head-ramming experience.

It started out simply enough. I had an LCD monitor to sell. I will spare you the details, but with my wife’s help (she’s been through all this before) we got the listing up on eBay and after a mysterious delay of a few hours, it actually showed up in eBay’s search engine, and was purchased by someone using “Buy It Now”.

So … item sold, the money is in our account, the item is packaged … story over, right? Ah, it was just beginning.

First, there is a feature integrated into PayPal where you can buy a UPS shipping label for this item. Great. So you enter the weight, the dimensions, and the value. The value I entered was $1650. The resulting error message: “The insurance valuation you’ve entered exceeds the maximum allowed.” No indication what the maximum would be, or how this could possibly be a problem since UPS will let you buy up to $50K of insurance if you’re so inclined.

I played around with different values (with and without the cents places) and figured out that anything over $1000 is apparently too much, but $1000 or less results in the message that “You have entered an invalid value”. Then it all came back to me that I’d sold something on eBay maybe 8 or 10 months ago and ran into this VERY SAME BUG which, incredibly, is STILL not fixed. Even though my customer had paid for insurance, I couldn’t actually insure the package.
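
What galls me is how little code it would take to do this right. Here is a purely illustrative C# sketch of the kind of validation message I'd expect; the $1,000 cap and every name in it are my own guesses from the experiment above, not anything PayPal documents:

```csharp
using System;

static class InsuranceValidation
{
    // Assumed cap, inferred from my experimentation above -- not a documented PayPal rule.
    const decimal MaxInsurableValue = 1000.00m;

    // Returns null when the value is acceptable; otherwise a message that tells
    // the user what the limit actually is and what to do about it.
    static string Validate(decimal declaredValue)
    {
        if (declaredValue <= 0)
            return "Please enter a declared value greater than zero.";

        if (declaredValue > MaxInsurableValue)
            return string.Format(
                "The declared value of {0:C} exceeds the {1:C} maximum that can be insured online. " +
                "For higher valuations, contact the carrier directly.",
                declaredValue, MaxInsurableValue);

        return null;
    }

    static void Main()
    {
        // My actual case: a $1,650 monitor.
        Console.WriteLine(Validate(1650m) ?? "Value accepted.");
    }
}
```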

So I did now what I did back then … said the heck with it, and went directly to ups.com. There, I was presented with a verbose JavaScript alert() (the kind that most end users just cancel and don't even read) saying, in effect, that because I had such a high valuation on the shipment it had to be handed personally to a UPS employee and signed for, which gave me a choice between a 15-mile trip to the nearest UPS store or requesting a pickup. No problem … I'll request a pickup. Sorry, no same-day service — they can't pick up until Monday. And now I can't back out to un-request the pickup.

In the end I voided the shipment and started all over, but now, even though one of my two attempts shows in my shipment history as voided, they've still charged my credit card twice. Ah, I remember this happening before, too … it will get credited, eventually, in a couple of weeks. Still, it's on my calendar to check up on this in a while.

As a humorous coda to this whole thing, I phoned UPS to ask a human being whether I could get same-day pickup, as I'm sure I've done in the past. The voice mail system took my name, address and tracking number, which, of course, the human had to ask me for anyway (why do 95% of voice mail systems ask you for information they don't pass along to the person you end up talking to?). But the recording did say I could pay extra for same-day pickup. Then the human told me, no, the earliest they could come was Monday, and would that be okay? I asked if there wasn't a fee I could pay for same-day pickup. "Oh," she said — "if you want same-day pickup, you have to call the day before!"

At this priceless line I dissolved into manic laughter. I doubt the operator knew what I was laughing about. She seemed peeved. But I felt better, in a perverse sort of way.

So here we have three soulless corporations — er, multinational enterprises — all of which are doing their part to make what should be the simplest part of my whole customer experience a major pain in the touche. How can this possibly happen?

Let's take the three-paragraph JavaScript alert() about the need for a UPS employee signature. Now … I've paid for insurance … what they are telling me is that the insurance is worthless unless I personally hand the shipment to one of their employees — though I suspect they'd be glad to charge me for it anyway. Somehow I doubt there's a business rule that refunds the insurance fee when the value is over $X and they didn't get a proper acceptance signature.

Is this corporate malfeasance, or just something that was bolted onto their web app in a rush one day and no one ever got around to fixing it? You be the judge. I suspect it’s somewhere in between: there is no financial motivation to do a better job, and some motivation to leave it the way it is, whether or not it was initially intended to increase net insurance revenue.

Okay … I promised you a contrast to this depressing incompetence.

I do some work for a business-to-consumer site. This site has a partner that processes online car finance applications for them. Today this partner called and said he'd noticed our application volume was down, had taken a look at our application form, and found it was asking for more info than they really needed. Perhaps, he suggested, if we streamlined the form and reduced the "friction" for our visitors, more people would complete it? He pointed me to an example form on his own site.

I looked into it and, sure enough, our form had been created for a bank we no longer partner with, and all our remaining loan partners have less stringent requirements. We can in fact make it simpler for our users.

Granted … the call was motivated at least in part by the self-interest of the finance partner. They want more loan applications, so they can approve more loans, and make more money. But this guy seems to understand what eBay and its partners — PayPal and UPS — do not: that I will remember the pain of today's experience with them long after I've forgotten the particulars. This will get filed in my brain under "don't go there" and I will try very hard to avoid using eBay in the future, especially since I only have an occasional need and no motivation to learn all the little warts and hiccups in their cranky little system.

All of this falls under the general heading of “usability”. Nothing about eBay, PayPal, UPS or my customer’s own loan application pages is exactly broken, in the sense that it works for most people most of the time, such that these companies are making money even while they are making enemies. But the user experience still sucks. In the case of my client, we just haven’t re-evaluated old code in awhile. In the case of eBay and Friends, it’s probable that there is a whole comedy of errors behind the glitches and frustrations I encountered. But in all these cases no one is regularly asking whether the user experience is smooth and as simple as it can be.

February 2, 2007

Looking for Vista Drivers?

Filed under: Products — Bob Grommes @ 3:47 pm

RadarSync has created a web site aggregating the latest Vista drivers from a number of vendors. If you’re hunting for Vista device drivers, try this site first.

February 1, 2007

Vista: Resistance is Futile?

Filed under: Products — Bob Grommes @ 11:22 am

Despite my earlier rant about Vista, we are about to acquire the first Vista box in our household. My wife's old Dell is so full of XP lint that we are faced with a clean re-install of XP, which hardly seems worth the effort on a four-year-old box. So when her shiny new 2.66 GHz dual-core machine with 2G of RAM and the latest nVidia card with DirectX 10 shows up in a couple of weeks, it will come with 32-bit Vista Ultimate factory-installed.

This is how most people will get Vista; six months from now I doubt very much that anyone is going to offer factory-installed XP. Personally I’m waiting for SP1 and stable 64-bit drivers before even thinking about upgrading my working dev machine. But this seemed like a good opportunity to let the manufacturer worry about getting the basic system working.

One important thing to do before migrating is to check that all your critical apps and drivers are known to work under Vista. In our case, the only hold-out is the driver for our Epson scanners, and Epson expects to release updated drivers for our specific models in the next couple of weeks. Since we can move her to the new box incrementally, that's fine.

Another advantage of doing this is that I'll get some Vista experience and have a limited Vista test-bed, should I end up needing it for some reason.

Still not sure what to do? Web Worker Daily just posted a really balanced and nuanced pros and cons discussion — have a look.

January 26, 2007

How to Use Team Foundation Server Source Control with Visual Studio 2003

Filed under: Products,Techniques — Bob Grommes @ 9:33 pm

I’ve just begun work on extending a product that was authored in VS 2003 and is now three years old. The developers wanted to port it to VS 2005 but ran into a brick wall when it was discovered that the client’s old version of Citrix could not cope with .NET 2.0 being installed on the same machine; apparently it caused a complete meltdown requiring a from-scratch reinstall.

In the meantime the developers had standardized on Team Foundation Server (TFS). Rather than face losing the benefits of TFS source control while being stuck (hopefully temporarily) in the VS 2003 world, they came up with a pretty interesting workaround.

1. Put the VS 2003 project into a TFS workspace.

2. In the VS 2005 client, open the Source Control Explorer (View | Other Windows | Source Control Explorer). Right click on the root of the VS 2003 project and do a Get Latest. This downloads all the code to the client system without attempting to open the solution or any of its projects, which would trigger the VS 2003 to VS 2005 conversion wizard, which, if run, would render the projects unusable in VS 2003.

3. From here you work on the project in an instance of VS 2003, and use the separate VS 2005 instance to do your check-outs and check-ins.

This is not a bad solution, but I wondered … hasn't someone solved this more elegantly? A quick Google search led me to a Microsoft download that allows you to access TFS not only from VS 2003, but from VS 6, or even from FoxPro, Sybase or Toad!

This is heartening in a world where Microsoft doesn't even support running many of these older but still very much in-use tools on the latest release of its own operating system. I was astounded that they'd even consider telling developers that a product as recent as VS 2003 isn't and won't be supported under Vista (although, oddly, VB6 is supported). I was even more surprised that full VS 2005 support will not really arrive for months yet, when SP2 is released. Yet somehow they have managed to support a number of old and even third-party tools in TFS. Could it be that at least some people at Microsoft have managed to overcome the Pointy-Haired Ones?

Then it struck me that supporting these old environments will help sell significantly more TFS licenses, whereas supporting them in Vista will not sell significantly more copies of Vista. Think about it: development teams are 100% of the market for TFS, but probably just a small percentage of the total market for Vista licenses. And Microsoft's thinking is that developers know how to run VMs within Vista anyway, to support old products using old operating systems. Penny-wise and pound-foolish, in my view — but no one is asking me.

January 3, 2007

DotNetNuke in the Trenches

Filed under: Products — Bob Grommes @ 2:37 pm

I have not yet figured out whether I'm blessed or cursed to be involved in maintaining an extensive DotNetNuke site. I've had a year to develop an opinion of DNN, a framework that I really wanted to like.

During that year I’ve been sidetracked with other responsibilities for the same client, such as developing an automated billing system and doing some sysadmin tasks, but I’ve finally had some significant time to get cozy with the DNN architecture.

The site began as a generic ASP.NET 1.1 site, then the developer discovered DNN 2, and bolted that onto the legacy parts of the app. Before long this was upgraded to 3.0.13, and currently, I’m in the throes of moving it to DNN 3.3.7 — with the plan being to get that stable and then switch to ASP.NET 2.0 and finally whatever flavor of DNN 4.x is then current.

You may notice that this is an awful lot of re-tooling in the course of just a couple of years. Today I was looking for answers to some upgrade questions (little is published about upgrading, and what is published is mostly about major DNN version upgrades). I stumbled across the web site of a DNN plug-in module vendor. The info I'm about to quote is in the Google cache — the current live site tones it down considerably, so I won't provide a link or identify the vendor. However, I have to say, it's a pretty revealing rant and it validates some of the growing suspicions I've developed about DNN.

The vendor was addressing a FAQ regarding why they don’t participate in the DNN certification program for module vendors:

… very simply, DNN changes too frequently. Core API changes have occurred in the last 3 to 4 years to versions 2.0, 2.1, 3.0, 3.3 – that is essentially 4 significant API layer changes in less than 4 years time – with the prospect of another major API change forthcoming [he’s referring to DNN 4.x, which is now out and itself has had some significant minor updates since]

These types of certifications are useful with regards to long-term solid API foundations, such as Windows 32 Bit API, [the] .NET platform, Java and other technologies where the core API does not change on a whim. This is not the case with DotNetNuke. In essence, a DotNetNuke certification does not guarantee that the module that you purchase will work past the next release of DNN – or that the module developer will maintain versions for your current version of DNN if you choose not to upgrade.

I don’t know how to define “too frequently” or whether it’s even the real problem here … the real problem is far too many breaking changes. The core developers are not afraid to change interfaces or namespaces. Sometimes the results are annoying (you get eight zillion “deprecated method call” warnings, but the code still works because the old signature is mapped to the new one). In other cases, things just break because they’re in the wrong place. Matters are exacerbated if you have custom code that calls into the framework.
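
That signature mapping, by the way, is just the standard .NET deprecation pattern — something like the hypothetical sketch below, where the class and method names are mine, not DNN's:

```csharp
using System;

public class NavigationHelper
{
    // Old entry point, kept so existing modules still compile and run.
    // Every call site now emits a compiler warning -- hence the eight zillion messages.
    [Obsolete("Use BuildTabUrl(int tabId, string controlKey) instead.")]
    public string FriendlyUrl(int tabId)
    {
        return BuildTabUrl(tabId, null);
    }

    // New entry point that the deprecated signature forwards to.
    public string BuildTabUrl(int tabId, string controlKey)
    {
        string url = "/Default.aspx?tabid=" + tabId;
        return controlKey == null ? url : url + "&ctl=" + controlKey;
    }
}
```

When the core team does this, old code keeps working and merely nags you; the real trouble starts when they skip the forwarding step and simply move or remove the member.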

Maybe this is unavoidable, but a portal engine that supports third-party plugins and encourages users to create their own modules probably needs to be more committed to stability than this. I know it is making me and the other person working on this system kind of crazy. It's also costing the client too many buckolas too early in the game, I think. A good solid person-month or more of labor to move from a 3.0.x to a 3.3.x release seems a bit much. Granted, some of it may be learning curve — of the original developers, and of me doing the upgrade. But that's pretty normal turnover on any project these days.

DNN has other warts too — its documentation is lacking in many important ways, for example. It's infinitely easier to find answers about the .NET Framework, even if you confine yourself to Microsoft resources, than it is to Google up answers about DNN. One of the problems, aside from thin docs, is … there it is again … that if you do get an answer it's probably not for the version of DNN you're currently struggling with. It will be a three-year-old post about DNN 2, or a brand-new one about DNN 4, or it will not tell you whether version issues are relevant at all. And that's just the core technology, not the many add-ons out there.

I have yet to decide whether this is "Good Enough" or "The Best We Can Do" or "More Trouble Than It's Worth". One of these days I'll settle it in my mind, and post again about it.

December 31, 2006

Windows PowerShell

Filed under: Products,Tools — Bob Grommes @ 3:40 pm

If you need to do non-trivial scripting under Windows 2003 Server or Windows XP, you should probably take the time to install and learn Windows PowerShell 1.0. In addition, if you need to learn WMI, this is a great way to get acquainted with it. Indeed, PowerShell is often a better place to interface with WMI than a standard .NET application.

The one-sentence skinny: PowerShell is a .NET 2.0 command line application where all input and output takes the form of objects. Think about the implications of this for about two minutes and some light bulbs should go on. This is insanely great stuff, and it’s reassuring in light of some of the recent train wrecks to come out of Microsoft (e.g., Vista) that they can still produce great, innovative and useful tools like this.
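
To see why that matters, contrast it with what even a trivial WMI query looks like from a standard .NET application. Here's a minimal sketch using System.Management; in PowerShell, the same query is essentially a one-line Get-WmiObject call whose object output can be piped straight into other cmdlets:

```csharp
using System;
using System.Management;   // requires a reference to System.Management.dll

class FreeDiskSpace
{
    static void Main()
    {
        // Ask WMI for the local fixed disks and report free space on each.
        ManagementObjectSearcher searcher = new ManagementObjectSearcher(
            "SELECT DeviceID, FreeSpace FROM Win32_LogicalDisk WHERE DriveType = 3");

        foreach (ManagementObject disk in searcher.Get())
        {
            ulong freeBytes = (ulong)disk["FreeSpace"];
            Console.WriteLine("{0} {1:N0} MB free",
                disk["DeviceID"], freeBytes / (1024 * 1024));
        }
    }
}
```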

December 20, 2006

On Technology “Churn”

Filed under: Management,Products — Bob Grommes @ 1:12 pm

Back in the days before dirt was invented — sometime in 2002 — Microsoft released version 1.0 of the .NET platform along with Visual Studio 2002. Among the technologies provided was an RPC stack called .NET Remoting, or Remoting for short.

Remoting was, like everything else in .NET, promoted like the Second Coming. It was the greatest thing since sliced bread.

I had a project that involved orchestrating a lot of data acquisition, conversion and scrubbing tasks on about a dozen PCs, and Remoting was absolutely the perfect solution. It worked absolutely great. The project was a complete success. With a little work, I had a complete workflow system with fabulous fault tolerance, even though one of the third-party applications called by each worker machine was notoriously temperamental.

Now that the project is “in the wild” it should serve my client well for the next dozen years or so.

Except for one problem: It’s obsolete already.

On a timeline that can only mean Remoting was nothing but a stop-gap measure all along, Microsoft released a completely new and incompatible API for RPC work, the Windows Communication Foundation, or WCF. Remoting is still in the .NET framework, and probably will be for the foreseeable future. But it has been sidelined and is now officially a Second Class Citizen. And all this happened in considerably less than four short years — two, if you count technology previews.
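
For anyone who never touched it, the heart of a Remoting setup really is only a few lines. Here's a minimal sketch from memory — the type name, port, and URI are invented for illustration, and a real worker would obviously expose more than a Ping:

```csharp
using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;   // reference System.Runtime.Remoting.dll

// Shared type, deployed to both the worker host and the orchestrating client.
public class WorkerControl : MarshalByRefObject
{
    public string Ping()
    {
        return "alive on " + Environment.MachineName;
    }
}

class WorkerHost
{
    static void Main()
    {
        // Host side: listen on TCP port 9000 and publish the object at a well-known URI.
        ChannelServices.RegisterChannel(new TcpChannel(9000), false);
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(WorkerControl), "WorkerControl.rem", WellKnownObjectMode.Singleton);

        Console.WriteLine("Worker host running; press Enter to exit.");
        Console.ReadLine();
    }
}

// Client side, in the orchestrating application:
//   WorkerControl worker = (WorkerControl)Activator.GetObject(
//       typeof(WorkerControl), "tcp://someworker:9000/WorkerControl.rem");
//   Console.WriteLine(worker.Ping());
```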

I’m not saying WCF is a Bad Thing. In fact, it looks like pretty good stuff. But consider what has happened here:

1. A small business invested scarce funds to produce a solution and expects (reasonably, I think) to get a return on that investment for many years with minimal additional costs.

2. Technically, they will get just that, but .NET developers who haven't happened to work with Remoting in the past (including every new developer minted after about 2006) will know nothing about it. And likely even those, like myself, who have experience with it will be rusty. In addition, probably no new work will be done on Remoting by Microsoft, so if my client needs to integrate with future technology they may be forced to do a rewrite much sooner than they otherwise would have. If any bugs get introduced into Remoting, they are less likely to get caught and fixed quickly. And so on.

Is this a big deal? Arguably, no. But it illustrates a fact of life that bullet-point-mongers in marketing seem to keep forgetting: no matter how great a given new technology is, the marketplace can only absorb it so fast.

There are limits to how many times per year a developer can retrain; actual work does occasionally have to get done.

There are also limits to how many platforms any developer can simultaneously be proficient at. It’s one thing if core functionality available in 2000 still works in 2006 and whatever has been bolted on since can be learned and leveraged as needed; it’s another thing if (as is the case with RPC) there’s been both a platform change and an API change in the same time frame.

Would I have handled this particular instance differently? Yes. In 2002 I would have given customers guidance about Remoting — that it was a stopgap technology and that a more comprehensive and flexible framework was under development for release three or four years hence. Then customers could make decisions and plans in the full light of reality rather than under the burden of marketing claims.

Of course, that's never going to happen, but I can dream. It'd sure be nice if vendors were more respectful of the concept that when you provide a platform, it needs to provide a reasonable application life cycle for the things it supports. The lower-level the platform, the longer that cycle needs to be. The only thing with a longer life cycle than an API would be the OS infrastructure itself.

This doesn’t mean the vendor can’t innovate within that framework, but they should consider it a personal failure if they have to needlessly pull the entire rug out from under customers. You might be forced to do that if, say, a competitor came up with something better and you had to match or surpass it to get the mindshare and momentum back in your court. But that’s making your problems your customer’s problems, which ultimately isn’t good business.

Sometimes when it comes to keeping up with technology churn like this, I feel like the Dutch boy with his fingers in the dike, trying to plug leak after leak. This isn’t because I’m old and set in my ways, or trying to Keep Everything the Same. I have, after all, willingly reinvented myself at least four times in my career, in the name of staying informed and competent. The transition from procedural to object-oriented development comes to mind, for example.

But there’s a balance to be had. There seems to be an ethos today that developers are somehow failing to Keep Up if they don’t attend, each week, at least one webcast about some vendor’s Latest and Greatest New Thing. Don’t believe it. Vendors just know we are suckers for new things, that’s all.

Instead of chasing every New Thing that comes along, prioritize relentlessly based on what you can get done for your clients effectively over time. Hint: this generally doesn’t have much to do with being on the Bleeding Edge 110% of the time.

December 17, 2006

Windows Vista: Wait for SP1 (at least)

Filed under: Products — Bob Grommes @ 4:10 pm

I’ve never been one to trip over my own feet in an attempt to replace the Old and Lousy version of any OS or application with the New and Improved version. I keep pretty current, but as I generally lack a lot of time to play with technology just for the sake of playing with it, I prefer to let others be the pioneers with the arrows in their backs.

On the other hand, I got an offer from my local Microsoft office a couple of months ago that I felt I couldn't refuse: "bring your computer to our Vista install fest and upgrade your XP machine to Vista RC2 with our help. In exchange for the data from the upgrade and your feedback, you get a free copy of Windows Vista Ultimate." I had a brand new Falcon Mach V and a 30″ Apple Cinema monitor that I needed to get set up as my main dev machine, so I figured … what have I got to lose?

Plenty, as it turned out.

One would expect, particularly for a product so stunningly late and much reduced from its original ambitious goals, that a second release candidate would be reasonably stable. After all, it’s not much different from the release to manufacturing (RTM) code.

What ended up happening can be summarized as follows:


  • The upgrade process turned out to have a stunningly rudimentary UI, almost entirely text-based. It appeared to freeze for about 40 minutes at 21% complete “unpacking compressed files”. I was told not to worry, it was normal. I can only imagine the tech support calls this is going to generate for Microsoft.

  • When the upgrade finally finished some 3 and a half hours after it started, it went into an endless loop trying unsuccessfully to reboot. I called a tech over, they shrugged and said I’d have to do a clean install. WHOA! Just a minute! Don’t you want to run a diagnostic on this to see what happened? Aren’t you going to help me recover? You know, “technicians are standing by”, right? No, they weren’t going to help, and weren’t even curious about what happened.

  • Thirty minutes later I had a clean install of Vista, handily wiping out two or three evenings of work partially setting the box up as an XP workstation. The wireless service was down at that location, so I couldn't connect to the Internet for the latest updates, and I went home with an uneasy feeling about the whole thing.

  • That night I plugged in my USB external hard drive that I use for backups and turned it on, and got an instant Blue Screen of Death (BSOD). WTF?! It’s a standard USB device and Just Works under XP. There aren’t even any drivers to install! And I thought Vista was supposed to slay those BSODs!

  • After further muttering and troubleshooting, with results ranging from the device not being recognized to more BSODs, I spoke to Microsoft. They said it must be a driver problem. I told them there is no driver involved — just the built-in random access device driver for USB. They said okay, a tech will call you tomorrow. The call never came.

  • In the end, after a brief flirtation with Windows XP 64 bit Edition, which had problems with my supposedly-compatible video drivers, I went back to the world of 32-bit Windows XP, where everything Just Works.


Maybe sometime in the next 12 months or so, Vista will Just Work. But what I saw and experienced has all the hallmarks of something half-baked and Not Ready for Prime Time. Add to this the announcement that Visual Studio 2005 will not be fully supported under Vista until a future Service Pack 2, and that Visual Studio 2003 — still in widespread real-world use — will not be supported at all.

I make my living specializing in Microsoft technology, so I’m not a Microsoft hater, and not normally a Microsoft basher. But if they don’t quit filling middle management over there with Pointy-Haired Ones, I worry about the future. They are losing their Mojo and need to get a grip before the trickle becomes a flood.

This business of something Just Working has a lot to say for it. XP may not see 25% of my 4G of installed RAM, it may not have transparent / translucent / flying windows, but — it Just Works. There's tremendous value in that!

This whole experience also proves that the following conventional wisdom still holds: NEVER, EVER install a new version of Windows as an upgrade to an existing version. ALWAYS do a CLEAN install. The XP-to-Vista upgrade had the overall feel of something demanded by marketing and tacked on at the last minute. I suspect the Vista team knew better, but had no choice but to cobble something together. A little birdie within Microsoft that I spoke to just before release even told me there was talk internally of pulling this feature as infeasible. I doubt that actually happened, however.

At least I have my Vista Ultimate license, which I can run (along with some service packs) in a VM or as a secondary boot option down the road when things settle down a bit.

Update: In addition to shipping a new OS late, with significant rough edges, and incompatible with its own development platform for months to come, Microsoft's current version of SQL Server will not work correctly on Vista either, and requires a service pack available at some future date. SQL Server Express SP1 can be used on Vista today, but has issues that require workarounds.

A common thread in all of this train wreck seems to be the new security features in Vista; disabling them or running as administrator may allow Visual Studio and SQL Server to function as-is, but you’re on your own. In fairness, maybe the pain of the new security features will ultimately be worth it, but it is not promising that Microsoft’s own products are struggling to adapt, and that Microsoft’s own product managers seem to have been caught by surprise with significant new work in order to fix the problems.

Another update: In a lighter vein, I think this fellow captured the essence of my original feelings about this whole debacle. However, by March I will nevertheless have a copy of Vista running, just not on my precious dev machine.

