Bob on Development

December 31, 2006

Windows PowerShell

Filed under: Products,Tools — Bob Grommes @ 3:40 pm

If you need to do non-trivial scripting under Windows Server 2003 or Windows XP, you should probably take the time to install and learn Windows PowerShell 1.0. If you need to learn WMI, this is also a great way to get acquainted with it. Indeed, PowerShell may often be a better place to interface with WMI than from within standard .NET applications.

The one-sentence skinny: PowerShell is a .NET 2.0 command-line shell in which all input and output takes the form of objects. Think about the implications of this for about two minutes and some light bulbs should go on. This is insanely great stuff, and it's reassuring in light of some of the recent train wrecks to come out of Microsoft (e.g., Vista) that they can still produce great, innovative and useful tools like this.
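
To make the WMI point concrete, here's roughly what a simple WMI query looks like from a standard .NET 2.0 application using System.Management — a minimal sketch, not production code. In PowerShell, the Get-WmiObject cmdlet collapses all of this plumbing into a single pipeline step whose output is live objects you can filter and sort.

using System;
using System.Management; // add a reference to System.Management.dll

// List fixed logical disks and their free space via WMI.
class WmiDemo {
  static void Main() {
    ManagementObjectSearcher searcher = new ManagementObjectSearcher(
      "SELECT Name, FreeSpace FROM Win32_LogicalDisk WHERE DriveType = 3");
    foreach (ManagementObject disk in searcher.Get()) {
      Console.WriteLine("{0} has {1} bytes free", disk["Name"], disk["FreeSpace"]);
    }
  }
}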


December 27, 2006

Proper Casing a String in .NET

Filed under: C#,Techniques — Bob Grommes @ 2:22 pm

Rick Strahl saved me a minor headache today by pointing out a somewhat hidden and arguably misplaced method in the BCL for proper-casing strings. Combining that with a small fix provided by one of his respondents (the method doesn't work if the input string is all uppercase), you get:

private string ProperCase(string s) {
  // ToTitleCase leaves all-uppercase words alone, so normalize to lowercase first.
  return System.Threading.Thread.CurrentThread.CurrentCulture.TextInfo.ToTitleCase(s.ToLower());
}
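
Thus ProperCase("JOHN SMITH") returns "John Smith", whereas calling ToTitleCase directly on the all-caps input would hand it back unchanged — ToTitleCase treats fully uppercase words as acronyms and leaves them alone.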

As he aptly points out, it’s all there somewhere in the dogpile, if only you can find it.

Is Development a Science or an Art?

Filed under: Communication,Management,Projects — Bob Grommes @ 2:10 pm

Let me say right up front that we’re not going to clear up the age-old argument between developers and bean-counters in this space. It’s been tried, over and over, since the beginning of the computer age.

Instead, I'll just say that I think that, in the same way light has properties of both waves and particles, software development is both art and science. Indeed, in my view it's a little more art than science. And I think that, just as there are very few true atheists in the world, there are very few people who would truly maintain that there is no art to development, and even fewer who would maintain there is no science to it.

Now that we have that out of the way, I want to address the folly of going to the extreme of removing the craftsmanship from development and treating it purely as a science.

Leaving aside my deeply-held suspicion that science isn’t going to save us anyway … that the image of dispassionate, relentlessly objective Men in White Lab Coats is a myth … let’s just say that there are powerful forces that want to make development into something that can be quantified, bottled, automated, commoditized, offshored, and in general made Predictable. These forces are well understood.

Historically every New Thing starts out as an arcane craft and is made one at a time, painstakingly. At some point, when its worth is proven, that product becomes standardized and commoditized so that it can be mass-produced, mass-affordable and more reliable (or at least Good Enough).

It is perfectly understandable that the world of business would hope that this paradigm could be applied to development. Especially since, tantalizingly, it appears that in some ways, it can be applied to development. For example, we can codify “best practices” and “naming standards”, and perform “static analysis”. We can use team development environments that enforce Corporate Standards — which can be either good or Stoopid, but let’s just assume they’re all good.

Even while conceding those points, though, there is a significant part of any non-trivial software development project that requires not cold, hard science but passionate craftsmanship. And this is not isolated to, say, the analysis phase of a project. It's a thread that runs through every part of the development life cycle.

Software development requires precious intellectual resources to be done right. Every day on a project, dozens of decisions must be made as to how to spend and apply those scarce resources. One must navigate not only the technical decisions, but also the often conflicting and irrational political, regulatory, and marketing considerations that the project must effectively engage in order to be useful to anyone. And all of these factors are in constant and often chaotic change.

It’s tempting for non-developers and new developers to focus on the relatively easy stuff … the bare, obvious technical considerations, and assume that standard reductionist scientific thinking will eventually bring about predictable, reproducible results. But the truth is that organic, integrative, holistic thinking is needed, too. And that is the work of the mature craftsman.

This is happening in other fields of scientific endeavor as well, but it's just easier to overlook. Historically, science's hope was that by providing increasingly detailed understanding of natural processes we would come to fully understand them. But the fatal mistake was assuming that the universe is not infinitely intricate and complex, and that the same rules apply to the tiny picture (the sub-atomic level) as to the big picture (the cosmological level) and to everyday life at human scale. Science is increasingly running into trouble here. We have not found a hard limit to the size of the universe in either direction; we keep discovering annoying new dimensions we didn't know existed; we keep encountering new rules that give the lie to previously accepted scientific fact.

It's no different in the world of software development. You'll never remove the need for the true craftsman. It takes someone who can deal with the slippery, subjective, gut-level human factors; it's not just technicians twiddling dials. And even the technicians should use Common Sense ™, which is not something that grows on trees.

The core skill of the software developer is correctly identifying and solving problems. Everything else is just execution. When software development projects are so often misbegotten and misdirected, we should not redouble our efforts to execute better. We should redouble our efforts to correctly understand the problem and the best points of leverage for addressing the problem.

Alas, accurate perception, and accurate judgments about what to do with those perceptions, are not terribly amenable to Standard Processes. Knowledge and wisdom are two different things.

December 22, 2006

Playing a Game of Twenty Questions

Filed under: Management — Bob Grommes @ 1:56 pm

In recent years it’s become fashionable to attempt to quantify the technical competence of developers using systems like exams (certification exams, basic competence exams, and so forth), gang interviewing techniques, and other formulaic methods. I ran across yet another list of interview questions today, and while it’s not a bad list by any means (and the author’s disclaimer that it’s more food for thought than an ironclad litmus test is also excellent), it does make me wonder if there isn’t a better system than this.

Most interviewers seem either unaware of, or unwilling to acknowledge, the fact that development has become a vast ocean of ever-churning platforms, components, libraries, add-ons, fashions and fancies. As such, it is virtually impossible for anyone to have in-depth knowledge of every nook and cranny of even a single language, much less a single platform.

So when someone wants to know what ports need to be open on a firewall for DCOM, or wants me to regurgitate, say, a variable locking pattern for thread-safe access, my reaction is, "I don't know, and you shouldn't care". In the work I happen to do from day to day, I very seldom need those things, so I don't clutter my brain with them. If I need them, I can Google them up in about 20 seconds and woof out a proof of concept in a few minutes.
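
For the record, the sort of boilerplate in question is trivial to look up — something like this minimal C# sketch of guarded access to shared state, and nothing worth memorizing:

// Serialize access to shared state with a private lock object
// rather than locking "this" or the type.
public class Counter {
  private readonly object _sync = new object();
  private int _count;

  public void Increment() {
    lock (_sync) { _count++; }
  }

  public int Value {
    get { lock (_sync) { return _count; } }
  }
}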

What’s core to a good developer’s ability is not how many rote facts he can parrot on demand, but his analysis and problem-solving skills. Any developer thus equipped can suss out most any unfamiliar territory in very short order.

Yes, if your shop does a lot of work with, say, DCOM, then it's a nice plus if a prospective new developer has deep experience with the technology. At the very least you ought to establish that the developer doesn't fear it and has done work of similar difficulty and character before. But I would much rather have a developer with analytical common sense and seasoned general experience who doesn't know much about DCOM beyond what the acronym stands for, than some wet-behind-the-ears kid who happens to have crammed for the "exam" and knows some disconnected facts about it.

December 20, 2006

On Technology “Churn”

Filed under: Management,Products — Bob Grommes @ 1:12 pm

Back in the days before dirt was invented — sometime in 2002 — Microsoft released version 1.0 of the .NET platform along with Visual Studio 2002. Among the technologies provided was an RPC stack called .NET Remoting, or Remoting for short.

Remoting was, like everything else in .NET, promoted like the Second Coming. It was the greatest thing since sliced bread.

I had a project that involved orchestrating a lot of data acquisition, conversion and scrubbing tasks on about a dozen PCs, and Remoting was the perfect solution. It worked great, and the project was a complete success. With a little work, I had a complete workflow system with fabulous fault tolerance, even though one of the third-party applications called by each worker machine was notoriously temperamental.
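
For the uninitiated, the server side of a Remoting setup boils down to a few lines. Here's a minimal sketch with hypothetical names — not the actual project code:

using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp; // reference System.Runtime.Remoting.dll

// A remotable type must derive from MarshalByRefObject.
public class Worker : MarshalByRefObject {
  public string RunTask(string taskName) {
    return "Completed: " + taskName;
  }
}

class WorkerHost {
  static void Main() {
    // Listen on TCP port 8085 and publish Worker as a well-known singleton.
    ChannelServices.RegisterChannel(new TcpChannel(8085), false);
    RemotingConfiguration.RegisterWellKnownServiceType(
      typeof(Worker), "Worker", WellKnownObjectMode.Singleton);
    Console.WriteLine("Worker listening; press Enter to stop.");
    Console.ReadLine();
  }
}

A client then grabs a proxy with Activator.GetObject(typeof(Worker), "tcp://machine01:8085/Worker") and calls it as though it were a local object — which is precisely what made Remoting so pleasant for farming tasks out to a dozen PCs.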

Now that the project is “in the wild” it should serve my client well for the next dozen years or so.

Except for one problem: It’s obsolete already.

On a timeline that can only mean Remoting was nothing but a stop-gap measure all along, Microsoft released a completely new and incompatible API for RPC work: Windows Communication Foundation, or WCF. Remoting is still in the .NET Framework, and probably will be for the foreseeable future. But it has been sidelined and is now officially a Second Class Citizen. And all this happened in considerably less than four short years — two, if you count technology previews.

I’m not saying WCF is a Bad Thing. In fact, it looks like pretty good stuff. But consider what has happened here:

1. A small business invested scarce funds to produce a solution and expects (reasonably, I think) to get a return on that investment for many years with minimal additional costs.

2. Technically, they will get just that, but .NET developers who haven't happened to work with Remoting in the past (including every new developer minted after about 2006) will know nothing about it. And likely even those of us who have experience with it will be rusty. In addition, Microsoft will probably do no new work on Remoting, so if my client needs to integrate with future technology they may be forced to do a rewrite much sooner than they otherwise would have. If any bugs get introduced into Remoting, they are less likely to get caught and fixed quickly. And so on.

Is this a big deal? Arguably, no. But it illustrates a fact of life that bullet-point-mongers in marketing seem to keep forgetting: no matter how great a given new technology is, the marketplace can only absorb it so fast.

There are limits to how many times per year a developer can retrain; actual work does occasionally have to get done.

There are also limits to how many platforms any developer can simultaneously be proficient at. It’s one thing if core functionality available in 2000 still works in 2006 and whatever has been bolted on since can be learned and leveraged as needed; it’s another thing if (as is the case with RPC) there’s been both a platform change and an API change in the same time frame.

Would I have handled this particular instance differently? Yes. In 2002 I would have given customers guidance about Remoting — that it was a stopgap technology and that a more comprehensive and flexible framework was under development for release three or four years hence. Then customers could make decisions and plans in the full light of reality rather than under the burden of marketing claims.

Of course, that's never going to happen, but I can dream. It'd sure be nice if vendors were more respectful of the concept that when you provide a platform, it needs to provide a reasonable application life cycle for the things it supports. The lower-level the platform, the longer that cycle needs to be. The only thing with a longer life cycle than an API would be the OS infrastructure itself.

This doesn’t mean the vendor can’t innovate within that framework, but they should consider it a personal failure if they have to needlessly pull the entire rug out from under customers. You might be forced to do that if, say, a competitor came up with something better and you had to match or surpass it to get the mindshare and momentum back in your court. But that’s making your problems your customer’s problems, which ultimately isn’t good business.

Sometimes when it comes to keeping up with technology churn like this, I feel like the Dutch boy with his fingers in the dike, trying to plug leak after leak. This isn’t because I’m old and set in my ways, or trying to Keep Everything the Same. I have, after all, willingly reinvented myself at least four times in my career, in the name of staying informed and competent. The transition from procedural to object-oriented development comes to mind, for example.

But there’s a balance to be had. There seems to be an ethos today that developers are somehow failing to Keep Up if they don’t attend, each week, at least one webcast about some vendor’s Latest and Greatest New Thing. Don’t believe it. Vendors just know we are suckers for new things, that’s all.

Instead of chasing every New Thing that comes along, prioritize relentlessly based on what you can get done for your clients effectively over time. Hint: this generally doesn’t have much to do with being on the Bleeding Edge 110% of the time.

December 18, 2006

English: The “other” programming language

Filed under: Communication — Bob Grommes @ 9:00 pm

I just finished reading J. Timothy King's provocative essay, "Does Bad Writing Reflect Poor Programming Skills?". King's answer is an unqualified "yes", and I have to mostly agree with him.

The difference, of course, between communicating with human beings and communicating with a computer is that computers have infinite patience. You can probably be a boring, unimaginative writer and still have good programming skills (although arguably, if you lack imagination you may fail to think outside the box enough to come up with the best solutions consistently). What you definitely cannot be, though, is an unclear writer.

King’s point is well-taken, too, that a program must be comprehensible to other developers and maintainers, and to your future self months or years from now — not just to the computer.

The scary thing is that an astounding number of programmers (I won’t dignify them with the term “developer”) not only can’t write, they can’t even read effectively.

In my experience if you assign a task to a good developer, they will annoy you with pertinent questions until they understand every detail of what you’re asking them to do. If you assign a task to a lousy developer they will read the first two sentences and might manage to misinterpret even that much.

For example, I recently gave instructions to a programmer working on a web form that included home and work phone number fields. I stated, “In both the home and work phone numbers, the last seven digits may not be the same.” In other words, 480-111-1111 is not a valid phone number, nor is 480-777-7777.

Instead of what I asked for, the code the programmer wrote checked that the last seven digits of the home and work phone numbers were different from each other. So 480-111-1111 would be okay for a home phone so long as a different number was used for the work phone. Not even close to what I was looking for.
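
What I was actually asking for amounts to a few lines of logic. A minimal sketch, in C# for clarity (the real code was classic ASP):

// The rule as stated: the last seven digits of a phone number
// must not all be the same digit (e.g., 480-111-1111 is invalid).
static bool LastSevenDigitsVary(string phone) {
  string digits = "";
  foreach (char c in phone) {
    if (char.IsDigit(c)) digits += c;
  }
  if (digits.Length < 7) return false; // too short to be a valid number at all
  string lastSeven = digits.Substring(digits.Length - 7);
  foreach (char c in lastSeven) {
    if (c != lastSeven[0]) return true; // at least one digit differs
  }
  return false; // all seven digits identical
}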

Predictably, other aspects of the code were "off". One of the objectives of the code rewrite was to establish redundant validation on both client and server, so that people with JavaScript disabled — and spam-bots — could not enter invalid data. The classic ASP code being worked on submitted the form to a separate ASP page for processing. Rather than combine the two, or make sure the server-side validation was on the processing page, the programmer did the server-side validation on the form ASP and then submitted the validated data to the processing page via an in-the-open querystring. Everything was thus still readily hackable by simply submitting a bogus form directly to the processing page.

Coincidence? I think not. And no, I didn't hire this guy. Thankfully he took 30 hours to do 8 hours' worth of work, which got the attention of the non-technical hiring manager, and this particular programmer was un-hired in short order — proving once again that money talks. It saved me a half hour of patiently explaining exactly how this fellow was dropping the ball.

If you’re not comfortable quickly and accurately reading and clearly writing in plain old English, tackle English as your next “programming language”. The job you save may be your own.

December 17, 2006

Finding Meaning In Doomed Projects

Filed under: Management,Projects — Bob Grommes @ 9:36 pm

Back in the late 90’s I was the primary, and often the only, developer of a fairly large project that I had the pleasure of being associated with from beginning to end. The software was the core product of a commercial credit bureau, and that product was desperately needed by grateful paying customers. In addition, no one else had ever done it before, and perhaps more gratifying, some have tried — and failed — since. The owners of that startup successfully cashed out, and I have that warm, fuzzy feeling deep down of having worked hard to produce something I’m very proud of.

Alas, such projects are extremely rare.

I’ve known developers who go through entire careers without having anything resembling the above experience. I was fortunate to be in the right place at the right time, and I know it.

Most projects, by contrast, suck in one or more significant ways.

A surprising number of them are actually, in some way, doomed.

In the "alternate reality field" that surrounds many businesses, software development projects are, more often than not, created for the wrong reasons, to solve the wrong problems, in the wrong way. It's one of the dirty little secrets that not only are most projects over budget, late, and wrong … but quite frankly, off target in the first place. You don't read much about this problem, for some reason. But I have a notion that a lot of those screwed-up projects would manage to "gel" and produce something grand if everyone involved knew in their deepest soul that what they were doing was actually meaningful. Think about the projects you have the fondest memories of. I'll bet not many of them were exercises in futility.

And so I come to the point of this essay: what do you do when you find yourself (as you inevitably and often will) in the midst of The Project That Will Never See the Light of Day, or The Project that Solves the Wrong Problem, or one of the other horrors of project management?

Well, there are some prerequisites:

  • Never, ever give a client bad advice. It's true that their eyes will often glaze over and they will not follow your good advice. But that is their choice. Never tell them merely what they want to hear. You can learn to do this without being needlessly confrontational. Sometimes both you and the client know their request for your input is only pro forma; they've already made up their minds and don't really want to be confused with facts. Go along with it … but make sure that no one can say, down the road, that you never pointed out a better course. You're a consultant, after all. Give good counsel. It's the only way to build respect. Shoot straight!
  • Never accept abusive or exploitive situations, such as death march environments, a constant crisis atmosphere, lousy working conditions, insufficient tools, broken payment promises, etc.
  • Always confirm all agreements in writing. In practice this doesn't generally need to be a contract, just a series of emails along the lines of "To confirm our conversation today about Project X …" You'll be astounded at how seeing what they've just told you to do, in writing, and having it documented as their idea, can suddenly bring an insane idea down to earth. But if it doesn't — you don't hang for it later.

Assuming the above … you have a paying gig, a forewarned client, reasonable working conditions, and clearly agreed-to responsibilities. Now maybe the misbegotten project will not serve the client or employer; maybe it will ill serve them. But you have a learning experience before you — a chance to make the architecture, the implementation, the realization the best they can possibly be. If it turns out that you created a pencil sharpener where a screwdriver was what was needed — so long as that's what the client insisted on, despite your counsel — just make sure it's a damn good pencil sharpener. A job well done is its own reward.

The sad fact of our craft is that most of our best work will be largely invisible to the world, even when the finished product is widely and successfully used. Most often no one will know how you jumped through rings of fire and ate little pieces of glass so that the transaction layer works correctly every time and is fault-tolerant. But you know … and for most of us, most of the time, that has to be enough. If it's not, you're in the wrong line of work.

Windows Vista: Wait for SP1 (at least)

Filed under: Products — Bob Grommes @ 4:10 pm

I’ve never been one to trip over my own feet in an attempt to replace the Old and Lousy version of any OS or application with the New and Improved version. I keep pretty current, but as I generally lack a lot of time to play with technology just for the sake of playing with it, I prefer to let others be the pioneers with the arrows in their backs.

On the other hand, I got an offer from my local Microsoft office a couple of months ago that I felt I couldn't refuse: "bring your computer to our Vista install fest and upgrade your XP machine to Vista RC2 with our help. In exchange for the data from the upgrade and your feedback, you get a free copy of Windows Vista Ultimate." I had a brand new Falcon Mach V and a 30″ Apple Cinema monitor that I needed to get set up as my main dev machine, so I figured … what have I got to lose?

Plenty, as it turned out.

One would expect, particularly for a product so stunningly late and much reduced from its original ambitious goals, that a second release candidate would be reasonably stable. After all, it’s not much different from the release to manufacturing (RTM) code.

What ended up happening can be summarized as follows:


  • The upgrade process turned out to have a stunningly rudimentary UI, almost entirely text-based. It appeared to freeze for about 40 minutes at 21% complete while "unpacking compressed files". I was told not to worry, that this was normal. I can only imagine the tech support calls this is going to generate for Microsoft.

  • When the upgrade finally finished, some three and a half hours after it started, it went into an endless loop trying unsuccessfully to reboot. I called a tech over; they shrugged and said I'd have to do a clean install. WHOA! Just a minute! Don't you want to run a diagnostic on this to see what happened? Aren't you going to help me recover? You know, "technicians are standing by", right? No, they weren't going to help, and weren't even curious about what happened.

  • Thirty minutes later I had a clean install of Vista, handily wiping out two or three evenings of work partially setting the box up as an XP workstation. The wireless service was down at that location, so I couldn't connect to the Internet for the latest updates; I went home with an uneasy feeling about the whole thing.

  • That night I plugged in my USB external hard drive that I use for backups and turned it on, and got an instant Blue Screen of Death (BSOD). WTF?! It’s a standard USB device and Just Works under XP. There aren’t even any drivers to install! And I thought Vista was supposed to slay those BSODs!

  • After further muttering and troubleshooting, with results ranging from the device not being recognized to more BSODs, I spoke to Microsoft. They said it must be a driver problem. I told them there is no driver involved — just the built-in random access device driver for USB. They said okay, a tech will call you tomorrow. The call never came.

  • In the end, after a brief flirtation with Windows XP 64 bit Edition, which had problems with my supposedly-compatible video drivers, I went back to the world of 32-bit Windows XP, where everything Just Works.


Maybe sometime in the next 12 months or so, Vista will Just Work. But what I saw and experienced has all the hallmarks of something half-baked and Not Ready for Prime Time. Add to this the announcement that Visual Studio 2005 will not be fully supported under Vista until a future service pack, and that Visual Studio 2003 — still in widespread real-world use — will not be supported at all.

I make my living specializing in Microsoft technology, so I’m not a Microsoft hater, and not normally a Microsoft basher. But if they don’t quit filling middle management over there with Pointy-Haired Ones, I worry about the future. They are losing their Mojo and need to get a grip before the trickle becomes a flood.

This business of something Just Working has a lot to say for it. XP may not see 25% of my 4 GB of installed RAM, it may not have transparent / translucent / flying windows, but — it Just Works. There's tremendous value in that!

This whole experience also proves that the following conventional wisdom still holds: NEVER, EVER install a new version of Windows as an upgrade to an existing version. ALWAYS do a CLEAN install. The XP-to-Vista upgrade had the overall feel of something demanded by marketing and tacked on at the last minute. I suspect the Vista team knew better, but had no choice but to cobble something together. A little birdie within Microsoft that I spoke to just before release even told me there was talk internally of pulling this feature as infeasible. I doubt that actually happened, however.

At least I have my Vista Ultimate license, which I can run (along with some service packs) in a VM or as a secondary boot option down the road when things settle down a bit.

Update: In addition to releasing a new OS late, with significant rough edges, and incompatible with its own development platform for months to come, it’s also significant to note that the current version of Microsoft SQL Server will not work correctly on Vista either, and requires a service pack available at some future date. SQL Server Express SP1 can be used on Vista today, but has issues that require workarounds.

A common thread running through this whole train wreck seems to be the new security features in Vista; disabling them or running as administrator may allow Visual Studio and SQL Server to function as-is, but you're on your own. In fairness, maybe the pain of the new security features will ultimately be worth it, but it is not promising that Microsoft's own products are struggling to adapt, and that Microsoft's own product managers seem to have been caught by surprise with significant new work to do in order to fix the problems.

Another update: In a lighter vein, I think this fellow captured the essence of my original feelings about this whole debacle. However, by March I will nevertheless have a copy of Vista running, just not on my precious dev machine.

December 16, 2006

Your Brain on EverNote

Filed under: Products,Tools — Bob Grommes @ 2:15 pm

It is nearly impossible to maintain control of your busy stable of projects without some kind of organizational tool.

Here’s a quick recommend for a note-taking tool: EverNote, which comes in both free and paid versions. If you don’t use a tablet PC for taking handwritten notes, you likely don’t need the features in the paid version.

I've tried Microsoft OneNote, and have looked at some open source options, but for me at least EverNote strikes the sweet spot. It is flexible, fast, easy to use, and has excellent clipping tools so that you can save web pages, or sections thereof, as notes. My only complaint is that the scrolling thumbs seem to be nonstandard custom controls that can behave a little strangely when you use the program remotely (at least via GoToMyPC).

How many times do you come across bits of info that save your life, but you know you'll never find them again 17 months from now when the need arises once again? With EverNote and just a little diligence, that won't happen.

If you’d like to compare EverNote with some other options, here’s a decent overview.

December 15, 2006

Ad Hoc Version Control

Filed under: Methodologies — Bob Grommes @ 1:55 pm

Most developers today seem to understand the benefits of a source code version control system, and its ancillary uses in tracking changes to project notes, documentation, and other non-code files. Version control generally makes sense even if you're the only person working on the project; once set up it's fairly trivial to use, especially if it integrates with your IDE, and it's often very useful to do diffs on old versions of code, to be able to split off release and dev branches, and things of that nature.

There aren't really any good excuses for not having a version control system in place, especially when you can get something like Subversion (SVN) for free. However, in the real world there are environments in which there is resistance to version control, or at least to extending it to documentation. Yet if you're not the only person maintaining the documents, there needs to be a system in place to prevent overwrites and conflicts. I've had two projects recently where this was an issue.

Sometimes you just have to pick your battles — maybe the other people with edit rights don’t understand source control, and don’t want to take the time to understand it. People usually fear what they don’t understand.

If you are dealing with a small number of people who are reasonably computer-literate and somewhat disciplined in their approach to work, here’s a simple ad-hoc system I’ve worked out that has been successful for me.

Let's say that you have a folder on a share someplace that you call Documents. It contains several Word documents describing the system you're working on and the business rules it implements. You might create the initial draft, but parts of the documentation are subject to edit by others — domain experts, etc.

Simply create a CheckOut subfolder within Documents, and then within CheckOut create a folder named after each person with edit rights. Then get buy-in to the following procedure:

  • In order to check out a document, first see if it exists in the Documents folder. If it does, move it to your personal checkout folder, then copy it to your local hard drive to work on. If the document doesn't exist, you know someone else has it checked out, and you can find out who that is by seeing which checkout folder the document has been moved to. Documents in the checkout folders are read-only; they can't be modified until the "owner" puts them back in the Documents folder.
  • To check a document in, copy your edits back to the Documents folder and delete the original in your checkout folder.

It's far from foolproof, but it's simple, non-threatening, and cost-free. It doesn't require that anyone learn anything new, including figuring out Word's proofing features, which are a kind of crude version control in their own right.
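
If the ritual ever wears thin, the convention is even scriptable. A throwaway C# sketch — share path and folder names hypothetical:

using System;
using System.IO;

// Claim a document by moving it into the caller's personal checkout
// folder, then copy it to a local working folder for editing.
class CheckOutDoc {
  static void Main(string[] args) {
    string share = @"\\server\Projects\Documents"; // assumed share location
    string doc = args[0];                          // e.g. "BusinessRules.doc"
    string source = Path.Combine(share, doc);
    string mine = Path.Combine(Path.Combine(share, @"CheckOut\" + Environment.UserName), doc);

    if (!File.Exists(source)) {
      Console.WriteLine("Already checked out -- see the CheckOut subfolders.");
      return;
    }
    File.Move(source, mine);                              // claim it
    File.Copy(mine, Path.Combine(@"C:\Work", doc), true); // local working copy
  }
}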

The main understanding you need to have with all the participants is that if they forget or “cheat” then their work is subject to being overwritten by someone else, and they have no right to whine about it. When this happens to someone once or twice, they sharpen up in a hurry. In practice, even that is not usually a big deal because often, individual changes are fairly minor.

A downside is the "PITA factor" — this approach creates a certain amount of inertia against making a quick change on the spur of the moment. Of course, if that begins to wear on people you can always mention that they could leverage the document proofing / review features or place the document under version control. 😉

Of course you can vary the above … files can simply be renamed with a prepended "(Checked Out by XX)" to check them out. This has the advantage that you can see all the documents at a glance, including which ones are checked out and who has them. The details don't matter; you just need an agreed convention.

