Business Management

Business Measurement Fails, Part I

Why is getting the numbers right so hard when managing a business?
Feb. 14, 2023
11 min read

When I started my MBA some decades ago, I was excited to learn how important numbers were for managing an organization. Whether in finance, accounting, statistics, business analysis, marketing, economics, or operations management, the development and application of quantitative analysis are paramount.

Math was always a strong subject for me, and I enjoyed learning the newest quantitative methods, even if some professors didn’t teach them with aplomb. But while the business adage, “If you can't measure it, you can't manage it” is a good general principle, it’s not a universal truth, as we shall see.

In business school, quantitative measures are presented as straightforward: something everyone can trust. Find the data, apply the technique, do the analysis, make your decision, done. Some techniques are simple, such as a basic data table or an X/Y graph. Others grow in complexity, from “regression analysis” to “analysis of variance” to calculating “confidence intervals.”

In my first job out of college, I quickly discovered things were rarely that simple. Many of the data sources and methods to calculate them were suspect but seldom challenged. Financial ratios such as IRR (internal rate of return) or ROI (return on investment) were simple enough on the surface, yet how the numerators and denominators were calculated varied significantly from company to company. In ratios—whether financial or in some other arena—so often the numerator, the denominator, or both are no better than estimates. That led to one of my personal mantras: “An estimate divided by an estimate … is bull*%#@.”
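If you want to see why that mantra holds, here is a minimal sketch in Python, with made-up ranges for both the numerator and the denominator of a hypothetical ROI calculation:

```python
import random

# Hypothetical illustration: an ROI figure built from two uncertain inputs.
# The ranges below are invented for this sketch, not real company data.
random.seed(42)

ratios = []
for _ in range(100_000):
    gain = random.uniform(80_000, 120_000)    # estimated net gain ($)
    cost = random.uniform(400_000, 600_000)   # estimated investment ($)
    ratios.append(gain / cost)

ratios.sort()
reported = 100_000 / 500_000                  # the single number on the slide
low, high = ratios[2_500], ratios[97_500]     # middle 95% of simulated outcomes
print(f"Reported ROI: {reported:.1%}")
print(f"Plausible range: {low:.1%} to {high:.1%}")
```

Even modest uncertainty in each input can produce a spread wide enough to flip a go/no-go decision, yet only the single 20% figure ever makes it onto the slide.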

Financial Abuse: Creative Manipulation of Quantitative Measures

One of the most appalling abuses in countless financial offices today is the proverbial “double Y-axis” graph. The horizontal X axis is usually time. On the left, a Y axis is labeled with a quantitative factor, such as “number of homes.” Then, on the right side of the graph, we see a second Y axis for another factor, such as “price of homes.”

Inferences, and even decisions, are based on where the two lines cross, as well as their slopes. Trouble is, you can change both the intersection(s) and the slopes significantly by altering the scales on your two Y axes. One trick is to compress or expand either scale. Another is to alter either the value where the scale begins, or where it tops out.

If you are inclined to “influence” the message, you now have four opportunities to manipulate both the intersection and the slopes. Often, though, this is done innocently, with scales arbitrarily chosen without much, if any, thought. The authors of the graphs don’t understand how the analysis is affected by their arbitrary choices.
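To see the trick for yourself, here is a minimal sketch in Python with matplotlib, using made-up housing numbers: the same two series are plotted twice, and the only thing that changes is the right-hand axis limits.

```python
import matplotlib.pyplot as plt

# Made-up data for illustration only.
years = [2018, 2019, 2020, 2021, 2022]
homes_sold = [120, 135, 150, 140, 130]      # left Y axis
median_price = [310, 330, 355, 390, 420]    # right Y axis, $ thousands

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharex=True)
for ax, ylim in zip(axes, [(300, 430), (0, 1000)]):
    ax.plot(years, homes_sold, color="tab:blue")
    ax.set_ylabel("Homes sold")
    right = ax.twinx()                      # the second Y axis
    right.plot(years, median_price, color="tab:red")
    right.set_ylim(*ylim)                   # the only thing that changes
    right.set_ylabel("Median price ($K)")
    ax.set_title(f"Right axis: {ylim[0]} to {ylim[1]}")

plt.tight_layout()
plt.show()
```

Run it and the apparent relationship between the two lines shifts between panels, even though not a single data point has changed.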

At least in the example above, looking at homes sold versus home price over time, the two factors appear to have a potential, if uncertain, relationship. However, a hard look into many of these double-Y graphs reveals the two factors often have little to do with each other.

When you see a graph like this, or even a simpler X/Y graph, how often do you ask critical questions, such as: What is the source of the data for each axis? How was it calculated? In our housing example, did we use average price or median? Was it all homes or just single-family homes? Are the factors truly related? Do the scales make sense—were they chosen at random without much thought or have they been manipulated to push an agenda?

Ah yes, the simple days of business school. I can’t recall any class where we discussed how these quantitative measures could be used to bias, hide, mislead, or manipulate. There should be a required course on that subject in every MBA program.

Company Data: Managing the Unmeasurable

Then there are those things that are extremely difficult, even impossible, to measure, yet we must still manage them. Measuring differences in people and culture for practical applications in human resource management is a good example.

Most serious attempts to objectively quantify people end up going down the proverbial rabbit hole. Ask any numbers person in a company to calculate the savings to be had from firing a person and you’ll quickly get a dollar value. Now ask the true total cost of recruiting, replacing, training, and bringing a new person up to speed—and what may be lost during the interim—and you’ll make their head spin.

Similarly, what is the cost of one lost customer? Do they go down the road to the next builder and keep to themselves? Or do they tell everyone they know, from the soccer moms to the church committee and their coworkers, how your firm couldn’t provide the value, or didn’t treat them right in some way … and then post a string of comments on social media? Quantify that!

Something as apparently simple as counting the number of punch list items on a house at final walk-through is fraught with complications, so much so that comparing numbers from builder to builder is a hopeless exercise. By the time you get agreement from everyone to count items the same way, then train people to do it consistently, you’ll be running for the hills, hair on fire, babbling incoherently. (Trust me on this one; I tried it with a group of 25 builders and it didn't end well.)

If you really want to mess up your data, integrate financial incentives into these very qualitative measures. The best you can do is establish your own internal tracking system with a goal of continual improvement, respond to genuine trends, avoid knee-jerk reactions to one-off events, keep communicating, and learn from the experience. But forget about comparisons with other divisions within your company, let alone comparisons with other builders.

Perhaps the most troubling measurement conundrum today is gauging the impact of social media on just about anything, be it a new ad campaign, new product, or a bad experience with your firm, or even a scandal. What catches fire on the web and what doesn’t—and why?

OK, first we count “views” or “opens.” Then we count “forwards” or “shares” or “retweets.” Then how about secondary and tertiary shares across multiple media platforms, including Facebook, Instagram, Twitter, WhatsApp, Google ratings, and Yelp, among many others. Can we truly count those accurately?

And still, we haven’t even asked the real question: Did the recipients or those who shared and re-shared even read the content? We have no precise, reliable way of knowing. Thus, we derive a number, an estimate divided by an estimate (and we’ve already discussed what that equals). Assuming recipients did read the content, did they do anything tangible, let alone measurable, as a result? Good luck ascertaining that!

How many of you are like me, visually tuning out virtually every ad on every website I visit? Seriously, if there were a genuine ad tailored just for me, offering something free I truly need, I’d miss it.

Management According to Dr. Deming

Dr. W. Edwards Deming, the world’s foremost guru on all things quality, included “Management by use only of visible figures, with little or no consideration of figures that are unknown or unknowable” in his list of seven deadly diseases of management.

It is indeed easier to analyze, understand, and manage with good measurements, but we acknowledge they are not always available in the depth and accuracy we seek. If we keep our eyes wide open for the myriad measurement obstacles and don’t kid ourselves, that problem is manageable.

The real issue is our failure to understand what we are measuring, and the inaccuracies, assumptions, presumptions, and even “pretending” that go on in the measurement business. In home building, these hazards apply whether you're measuring the impact of cycle time, traffic through the sales center, employee turnover, house cost as a percentage of sales, return on the selection center ... I could go on (and on, and on).

I still study math and statistics. Family members even razz me about reading—and enjoying—books on calculus, statistics, and physics, along with my diet of mystery novels, history books, biographies, and books on cooking, among others.

As a student of numbers and their use, I’m regularly dismayed by errors, inaccuracies, and projections based on fallacious assumptions. This example from Mark Twain’s book Life on the Mississippi (1883) is as painful as it is humorous because we make similar mistakes all the time:

“In the space of one hundred and seventy-six years the Lower Mississippi has shortened itself two hundred and forty-two miles.* That is an average of a trifle over one mile and a third per year. Therefore, any calm person, who is not blind or idiotic … can see that seven hundred and forty-two years from now the Lower Mississippi will be only a mile and three-quarters long, and Cairo [Illinois] and New Orleans will have joined their streets together, and be plodding comfortably along under a single mayor and a mutual board of aldermen. There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.”

*Editor’s note: This is a factual statement, the result of engineering projects and floods that changed the course of the Mississippi River.
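Twain’s arithmetic itself is sound; only the straight-line projection is nonsense. Here is his calculation replayed in a few lines of Python:

```python
# Twain's deliberately absurd straight-line extrapolation, redone.
shortened_miles = 242
years_observed = 176
rate = shortened_miles / years_observed     # "a trifle over one mile and a third"

years_ahead = 742
miles_lost = rate * years_ahead             # about 1,020 more miles gone
print(f"Observed rate: {rate:.3f} miles/year")
print(f"Projected loss over {years_ahead} years: {miles_lost:,.0f} miles")
# That is roughly the river's entire length in Twain's day, hence his
# "mile and three-quarters" punch line. The shortcuts were one-time events
# (cutoffs from engineering and floods), not a repeatable annual rate.
```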

Stein’s Law and Measurement Obstacles

And you thought Mark Twain was just an incredibly gifted author? At age 23 he became a licensed steamboat pilot on the Mississippi and clearly knew a thing or two about math and how future projections based on historical data can lead one far astray. Twain’s treatise leads to another favorite of mine, known as Stein’s Law, a rule in economics that says: “If something cannot go on forever, it will stop.”

My version of the law goes like this: “Any trend that cannot be sustained, won’t be.” A college physics professor of mine once demonstrated that if the rate of power plant expansion from 1940 to 1970 continued for the next 30 years, 100% of the land in the United States would be covered by power plants by the year 2000. The lesson is that we should be extremely wary of forward projections of numbers, even more so if any component of the equation is exponential.
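My professor’s point is easy to reproduce on the back of an envelope. Here is a hedged sketch in Python with invented starting numbers (only the exponential shape matters): anything compounding at a fixed rate eventually exceeds any physical limit.

```python
# Made-up numbers for illustration; only the exponential shape matters.
US_LAND_SQ_MI = 3_800_000   # approximate total U.S. land area
start_sq_mi = 1_000         # invented starting footprint of power plants
growth = 0.07               # 7%/year compounds to a doubling roughly every decade

footprint, years = start_sq_mi, 0
while footprint < US_LAND_SQ_MI:
    footprint *= 1 + growth
    years += 1

print(f"At {growth:.0%} annual growth, {start_sq_mi:,} sq mi of power plants")
print(f"would cover all U.S. land in about {years} years.")  # ~122 years
```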

Not long after reading the Twain quote and recalling the words of my physics professor, I ran across an article by a well-respected home building economist who was projecting growth in a certain type of housing market. He based his forecast on a historical rate that simple math would show could not continue more than three years out. We’ve all made this mistake, among countless others.

As I pondered this, other measurement obstacles came to mind, many of which I have written about over the years. So many, in fact, that this column will be a two-parter. The next installment, “Measurement Fails, Part II – The numbers don’t lie?” will run in the February issue and explore some of the most challenging measures that require us to do a better job, including cycle time, variance/VPO, profit vs. ROI, bid price vs. true total cost, comparing off-site to on-site methods, and the problems with so-called “objective” survey data.

If that sounds like fun, or you simply feel the issue is as important as I do, then while you wait for Part II, I highly recommend a remarkable book, How Not to Be Wrong: The Power of Mathematical Thinking, by math professor and prodigy Jordan Ellenberg.

I typically listen to Audible books at 1.3 or 1.4 times normal speed, but this book was so pithy and prescient that I never exceeded 1.2, frequently slowed to 1.0, and often backed up five minutes to repeat a section for deeper understanding. That’s how deep it is, and anyone seeking a better understanding of the numbers by which they run a business will benefit.

Yes, that’s all of you. I guarantee Ellenberg’s lessons will improve your decision-making.

If you would like a PDF of Scott’s columns on “Measurement Fails” or the Excel template mentioned in this column, please email [email protected] with your request.

Read Part II of the Measurement Fails series

About the Author

Scott Sedam

Scott Sedam is president of TrueNorth Development, a consulting and training firm that works with builders to improve products, process, and profits. A senior contributing editor to Pro Builder, Scott writes about all aspects of the home building business and won the 2015 Jesse H. Neal Award, business journalism's most prestigious prize, for his commentary in Pro Builder. Scott invites you to join TrueNorth's Lean Building Group on LinkedIn and welcomes your feedback at [email protected] or 248.446.1275.
