Monday, April 18, 2011

We don't know

Glen Campbell recently wrote a post entitled "Software contingency planning, Donald Rumsfeld, and the epistemology of ignorance." The basic point that he wanted to make was that if you are developing software according to an "agile" development process such as Scrum, your focus should be on delivering working software.

Unexpected events happen: for example, you might start on a feature, only to discover that it’s much more difficult than you originally anticipated....[Y]ou need to ask yourself what you can do to deliver some working software. If you can reduce the scope and only deliver a small part of the larger feature, then do that. If you can bring in someone else to help, then do that. There’s no right answer.

If you're a developer, or even if you're not, read the rest of Campbell's post here.

Campbell cited Donald Rumsfeld in his discussion of uncertainty, but he is not the only person to do so. Rajeev Edmonds also referred to our former Secretary of Defense in his discussion of web analytics:

I'll use a famous quote by former defense secretary Donald Rumsfeld that highlights the crux of reporting and data analysis.

"Reports that say that something hasn't happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns -- the ones we don't know we don't know." - (Former defense secretary Donald Rumsfeld)

Secretary Rumsfeld is pointing to three types of data sets that can be mined by any reporting system. This holds regardless of the type of data and the method involved in extracting the relevant information. The first type is known knowns - the information whose existence we know beforehand. In this context (web analytics), it can be compared to a pageview. The second type is known unknowns - the information we know exists, but that we are somehow not able to quantify in measurable terms. Any triggering condition that may inflate your pageviews, or any causal factor that increases or decreases your conversions, comes under this category. These conditions do exist, but we cannot predict, quantify, measure or detect them effectively. And the third type is unknown unknowns - the information that is present in our web traffic data set, but whose existence we don't even begin to suspect. Mining this information is the most challenging problem for any web analytics system.


So now that we've moved from development to web analytics, let's move further to my neck of the woods, which is roughly the sales/marketing area.

In both product management and proposals, I have had to analyze customer requirements. Whoever writes software requirements, whether it's the customer or someone hired by the customer, has to describe something that needs to be implemented. But even if the requirement is written by the customer, and even if the customer understands the needs of all stakeholders (in my case, IT people, forensic professionals, managers, and a slew of others), it is still very difficult to put into words the vision that you see in your head.

I'll give you an example. One common thing that I see in RFPs and other requirements documents is a desire from the customer to have the new system incorporate all of the functionality of the old system. Inasmuch as my industry has a lot of customers who tend to stay with the same vendor, this sounds like something wonderful if the RFP comes from one of your own customers. But this can be problematic in certain cases.

When I became a product manager, the workstations that were used by my product ran on the Windows 2000 operating system. Most of the customers who were upgrading to my product, however, did not have Windows workstations; instead, their workstations were running Digital UNIX/Tru64 UNIX with the X Window System. There were clear differences between the X Window System look and feel and the Windows look and feel. Did our customers want us to violate the Microsoft Windows look and feel and do things the old way? Or did the customers think that we would of course do things in the Microsoft Windows way? And what exactly was the Microsoft Windows way?

Here's another one. The old product was client/server, with significant processing taking place at the client. An AFIS (Automated Fingerprint Identification System) requires extraction, comparison, and (usually) verification. The extraction would be performed at the client; the comparison would be performed automatically at the server; and the verification would be performed at the client. To perform verification, a whole bunch of text and binary data (including image data) would need to be sent from the server to the client; we called this a "verification packet," and it would include information on a bunch of possible candidates.
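
To make that data flow concrete, here is a minimal sketch, in Python and purely for illustration. The field names (candidate_id, match_score, probe_image, and so on) are my own assumptions, not the actual product's schema; the point is simply how much had to be bundled up and pushed to one chosen client before verification could start.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Candidate:
        """One possible match returned by the server's automatic comparison."""
        candidate_id: str   # hypothetical identifier
        match_score: float  # similarity score from the matcher
        record_text: str    # text data about the candidate
        print_image: bytes  # binary image data for side-by-side review

    @dataclass
    class VerificationPacket:
        """Everything one client needed in order to verify a search."""
        search_id: str
        probe_image: bytes  # the print that was searched
        candidates: List[Candidate] = field(default_factory=list)

    # In the client/server design, the server decided up front which workstation
    # would do the verification and sent it the whole packet: text plus every
    # candidate image, whether or not the operator ever looked at all of them.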

My new product was NOT client/server. While extraction would still take place at the client, the remaining steps - including verification - would be managed at the server level. Now, instead of having to send a bunch of stuff to a client (and pre-determining which client would get the stuff), the verification operator would simply open a web browser and retrieve stuff from the server as needed. This had the benefit of allowing work to be performed at ANY authorized workstation. The disadvantage, however, was that the work was slower - rather than retrieving images from your C drive, you'd be getting them from the network. How would our customers react to these changes, especially the latter change? Would customers complain that our system, despite its architectural advantages, was a step back from the previous generation because of the perceived slowness?
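
By contrast, here is a rough sketch of the pull-on-demand pattern that the browser-based approach made possible. Again, this is hypothetical: the server address and endpoint paths are invented for illustration, and the real product did not expose this exact API.

    import json
    from urllib.request import urlopen

    # Hypothetical endpoints, invented for illustration only.
    BASE_URL = "http://afis-server.example.com/api"

    def list_candidates(search_id: str) -> list:
        """Fetch only the lightweight candidate list for a search."""
        with urlopen(f"{BASE_URL}/searches/{search_id}/candidates") as resp:
            return json.load(resp)

    def fetch_candidate_image(search_id: str, candidate_id: str) -> bytes:
        """Pull one candidate's image only when the operator asks to see it."""
        url = f"{BASE_URL}/searches/{search_id}/candidates/{candidate_id}/image"
        with urlopen(url) as resp:
            return resp.read()

    # Any authorized workstation can do this; nothing is pre-pushed to a single
    # client, but every image now travels over the network instead of coming
    # off the local C: drive, which is where the perceived slowness came from.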

Some of this was "known unknowns," but we were probably missing a whole lot of "unknown unknowns" as we considered these questions.

So how do you plan for the future, whether to lay out a product roadmap or to propose something to a customer?

You do it as best you can.

Incidentally, Rumsfeld discussed knowledge on at least one other occasion, in a 2001 Wall Street Journal article:

Learn to say “I don't know.” If used when appropriate, it will be often.

Whatever you think of Rumsfeld or his bosses, there are some fascinating items in the article. I may revisit this list on occasion.