Estimation and the Inestimable

I’ve just spent the day training some of my colleagues in ‘non-technical skills for consultants’, those skills that aren’t specific to any particular profession but which you need irrespective of whether you’re a lawyer, an engineer, a project manager, an IT consultant, an architect or any other kind of expert peddling his or her expertise.

What’s common to all professions is the requirement that you listen, ask good questions, structure facts and opinions well (thereby demonstrating your understanding of your client’s predicament), document them well, explain them and present them well, design solutions, make good judgements, persuade your clients of the merits of your judgements, implement your recommendations well, and so on.

There are a lot of skills on top of the technical ones.

(See The Art of Consulting – Some Golden Rules)

They’re not really skills that can be taught. My aim is to show how important they are and to provide examples of how they can be exercised well or badly. How do you teach good writing other than by providing examples of clarity, simplicity and brevity? How can you teach good design other than by pointing at something that’s well designed or badly designed? What I’m trying to instil through this training is an awareness that these skills matter and that complexity, redundancy, cliché, obscurity, and so on, should be avoided.

Judgement is the hardest topic. Consultants must assess what they’ve found out (facts, opinions, capabilities, priorities, and assumptions), and recommend a course of action in the (well-documented) circumstances. That is what they are paid well to do. It carries risk, and consultants are often wrong, something that consultants and clients must always be willing to admit and understand. But clients don’t engage consultants to tell them what they already know. They engage them to tell them what to do.

One kind of judgement in particular haunts businesses of my kind – estimation of effort required. We (LLP Group and systems@work) configure and implement business software packages. We’ve had more than two decades’ experience of understanding what our clients need and we’re good at designing pragmatic, value-for-money solutions. We’re even quite good at estimating how long it will take to configure and implement such solutions, as long as they don’t involve extensive software development.

But when it comes to software development itself our judgements are too often unreliable. Of course, we usually pay the penalty for that ourselves, because software development is very often contracted on a fixed price basis. Even after nearly forty years in the business of software development I still can’t understand why estimating how long things will take to develop is uniquely difficult. But it is, and not only for my company.

I remember the scorn I expressed nearly thirty years ago when I first came to ‘Eastern Europe’ and was told, ‘Well, how can I know how long it will take until I’ve finished it?’ when I asked for a software development estimate. I was used to asking the question and getting a number in reply.

But of course it was always the wrong number, and almost always a number lower than the number of hours, days, weeks or years a task would actually take.

The fact is that it is hard to predict the implications of change deep inside a software system without fully working it out in advance. And fully working it out in reality means actually doing the task. When it comes to the development of our own very sophisticated business software packages – time@work, expense@work and forms@work – it is the same. I have received estimates for years for enhancement work, and they’ve always been wrong. And I’ve long since given up the blame game. Things just have a habit of taking as long as they take. So now I ask whether it’s a big task, an intermediate task, or a small task and we fit as many of them into the available time as we can, assigning appropriate priorities so that we do the most important ones first. And we accept that our release dates must recede into the future.
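The sizing approach described above — rough buckets instead of precise estimates, fitted into available time by priority — can be sketched as a simple greedy scheduler. This is only an illustration: the task names, bucket costs in days, and capacity figure are all hypothetical, not taken from any real release plan.

```python
# A toy sketch of rough-bucket planning: each task gets a size bucket and a
# priority, and we fit as many tasks as possible into the available time,
# most important first. The bucket "costs" in days are assumed for illustration.

SIZE_DAYS = {"small": 2, "intermediate": 5, "big": 15}  # assumed nominal costs

def plan_release(tasks, capacity_days):
    """Greedily select tasks by priority (lower number = more important)
    until the available capacity is used up. Returns selected task names."""
    selected = []
    remaining = capacity_days
    for name, size, priority in sorted(tasks, key=lambda t: t[2]):
        cost = SIZE_DAYS[size]
        if cost <= remaining:
            selected.append(name)
            remaining -= cost
    return selected

tasks = [
    ("new approval screen", "big", 1),       # hypothetical backlog items
    ("export fix", "small", 2),
    ("report redesign", "intermediate", 3),
    ("colour themes", "intermediate", 4),
]
print(plan_release(tasks, capacity_days=20))
# → ['new approval screen', 'export fix']
```

Anything that doesn't fit simply slips to the next release — which is exactly the acceptance that release dates must recede.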

What is the alternative? Does anyone have an algorithm for determining the impact of change on a system (all the places where one change implies another) or for measuring the complexity of change (which implies how likely it is that the team will do it correctly)?

Consultants must make judgements all the time, but sometimes they must accept that their judgements will never be as accurate as they would wish them to be, especially when it comes to estimates of the human effort involved. Some tasks are simply inestimable.

One of my colleagues recommended the Window Method. Look out of the window and think of a number.

Another recommended taking the average of multiple estimates. But in this case you would still have to double the number, at least.
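One common formalisation of averaging multiple estimates is the three-point (PERT) weighted mean of optimistic, most-likely and pessimistic figures. A minimal sketch, with hypothetical numbers — the final doubling reflects the rule of thumb above, and is emphatically not part of standard PERT:

```python
# Three-point (PERT) estimation: a weighted average that leans towards the
# most-likely figure but lets the pessimistic tail pull the result upwards.
# The sample figures (in days) are hypothetical.

def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic PERT weighted average: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

base = pert_estimate(4, 10, 28)  # (4 + 40 + 28) / 6 = 12.0 days
padded = 2 * base                # ...and then double it, at least
print(base, padded)              # → 12.0 24.0
```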

Another recommended obtaining estimates, measuring actuals, discussing the variance and applying sanctions. But I know this doesn’t work.

Any other ideas?

Consulting is More Than Technical Knowledge

I’m having a row with MarketUP, a consultancy in online marketing we’ve employed this year at LLP Group and systems@work to sharpen up our websites, our Google AdWords campaigns and our use of other marketing channels. In fact, I fired them yesterday.

As so often, the ostensible reasons for our row are one thing, and the resentments and disappointments that fuel our row are another. From my point of view it isn’t so much about what they know or didn’t know, or did or didn’t do, as about their style as consultants.

Online marketing advice is in short supply. Everyone wants to improve the way their business is presented, and there are ever more sophisticated tools to use – websites (where fashions change with depressing rapidity), Google AdWords campaigns, LinkedIn pages and ads, blogs (such as this one) and specific campaign-oriented micro websites. These must be consistent in terms of content and style. Using these tools well takes specialist knowledge that you're unlikely to have in-house.

True, MarketUP did a good job of improving the graphical styles of our websites and microsites, but I was concerned at the start that most of their work had been in the area of B2C (business to consumer) where traffic is high, messages are relatively simple, and sales, in terms of units, are numerous.

In our world, of business software and consulting, messages are complex, target markets are very specific, traffic is low, and sales, in terms of units, may be fewer than twenty a year.

I’d actually had some success with Google AdWords campaigns. Over the last five years and more I’ve tracked the costs of our systems@work AdWords campaigns and the revenue derived from them. The results look like this (revenue blue, cost orange):

[Chart: systems@work AdWords campaign revenue and cost over five years]

It’s a good story, but it doesn’t reflect much recent success. Once we’ve ‘captured’ a client, revenue rolls in for as many as ten years. In fact, over the last two years we’ve added very few new clients directly through our Google AdWords campaigns.

It was to address this issue that I turned to MarketUP. They presented themselves well, seemed to know as much as anyone in this area, and their price was reasonable.

But consultants need more than knowledge. I have written extensively on this (see The Art of Consulting). Consultants need to ask good questions, they need to listen, they need to understand the underlying needs of their client, they need to take responsibility and do much more than deliver mere technical expertise. Telling what they know is just a small part of the job.

Things began to go wrong when it became clear that they didn’t really understand our products and couldn’t come up with the ads that would speak to our potential clients. In the end I had to write these myself. I didn’t mind and I wasn’t surprised, but MarketUP were curiously reluctant to admit any kind of fallibility.

And in the end the AdWords campaigns they put together were no more effective than those that came before, and over six months I’ve had not a single good lead from the site. I’d hoped that a website redesign (and I accept wholeheartedly that the new site for systems@work is immensely more attractive than before) would attract more visitors and contacts, but it didn’t.

Of course, consultants cannot guarantee success, and there is no online marketing agency in the world with the specialist knowledge to predict how online marketing can best be used for our particular products and market. But that was not my issue with MarketUP.

What has really annoyed me is their response to challenge and criticism. They kept insisting that ‘measurements’ showed that the website and campaigns were performing better, as if any measurement mattered to me other than obtaining good leads, of which there were none.

When criticised for their failure to put together ads that made sense, they were simply defensive.

And when, finally, there was a vast misunderstanding about a project to improve our natural listings, they made no attempt to see things from my point of view, or to understand that I only had one simple objective, to obtain more good leads through paid campaigns or natural listings.

It was their reaction to criticism that made me see red. I cannot remember a single occasion over the last year when they have admitted error.

And as far as I can recall, they provided no scoping documents, no memoranda of understanding, at any time to document their understanding of our needs.

It’s a lesson in consulting. Technical skills are essential, but there’s far more to consulting than knowing things. You must ask well, listen well, document well, understand well, and manage well, and you must respond well to criticism.