The New CFO – 04 Masters of Time

In the “The New CFO” series we have already talked about capital allocation and operating models. That was the entry point into the discussion of how the financial and market view of the company drives valuations, which ultimately creates access to financing to fuel growth. Something we will look into later on.

The next article will feature a discussion on KPIs and how to use metrics to drive the operating model performance of the company and guide capital allocation decisions. So this article is a prerequisite to the KPI article and sets the foundations. And it is all about time.

To understand time, we first must understand what a “well-directed” company is. This is a topic that could fill books, but we are going to stick to two concepts. (1) That of what Elon Musk called Vector theory (please read it up -> Google Elon Musk Vector Theory), which essentially looks at individuals as vectors, each driven to head in a certain direction, and says it is the role of management to make sure these directions point the same way for everyone. And (2) that of synchronicity in time – something that I picked up in my Stanford classes and that also relates to the process synchronicity problems I solved at a deeply process- and IT-driven company I worked for earlier in my life.

Synchronicity in time means that the clock cycles of the organization – speaking in CPU language here, measured in Hertz – are in sync.

This is such an important concept that it deserves its own article. So we are going to look at it from different angles.

The slowest hertz frequency in the company

Let’s consider you close your books once a year. That is a very slow frequency of about 31 nanoHertz. Yes, that is 31 divided by ten to the power of nine Hertz. Consider a weekly sprint that reacts to daily collected customer feedback. That is operating at 52 times the frequency of the financial cycle. Consider hourly generated online traffic analytics data. That is roughly 9,000 times faster than the annual cycle of the accounts.
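For the numerically inclined, here is a quick back-of-the-envelope check in Python (a year is approximated as 365 days, so the figures are rounded):

```python
# Rough check of the frequencies mentioned above.
# Assumption: a year is approximated as 365 days.

SECONDS_PER_YEAR = 365 * 24 * 3600

annual_close_hz = 1 / SECONDS_PER_YEAR             # ~3.2e-08 Hz, i.e. ~32 nanoHertz
weekly_sprint_hz = 52 / SECONDS_PER_YEAR           # 52 sprint cycles per year
hourly_analytics_hz = 365 * 24 / SECONDS_PER_YEAR  # 8,760 readings per year

print(f"annual close:     {annual_close_hz:.1e} Hz")
print(f"weekly sprint:    {weekly_sprint_hz / annual_close_hz:.0f}x the annual frequency")
print(f"hourly analytics: {hourly_analytics_hz / annual_close_hz:.0f}x the annual frequency")
```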

That concept is understood, right?

Shifting frequencies and starting points

Now consider process A and process B both running at 2 microHertz. But they are shifted against each other by a random offset that is not a full multiple of the cycle length. In this case, these two cycles never meet, if we say they only meet at the cycle end. Just picture two shifted sine functions.

Also a clear concept. Now drop the shift and double the frequency of B. B and A now meet only at the end of each cycle of A. So the slower process dictates the joint frequency at which alignment takes place.
Now consider that instead of doubling the frequency of B, you only multiply it by 1.5. It now takes two full cycles of A – and three cycles of B – before they meet again. The joint cycle length has doubled. Ouch.
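To make the alignment argument concrete, here is a small sketch in Python. The helper `first_meeting` and the day-based units are purely illustrative; two processes “meet” only when both finish a cycle at the same moment:

```python
def first_meeting(cycle_a, cycle_b, shift_b=0, horizon=1000):
    """First day on which both processes end a cycle at the same moment.

    cycle_a and cycle_b are cycle lengths in days, shift_b delays process B.
    Returns None if the two never align within the search horizon.
    """
    ends_a = {n * cycle_a for n in range(1, horizon)}
    ends_b = {shift_b + m * cycle_b for m in range(1, horizon)}
    common = ends_a & ends_b
    return min(common) if common else None

# Same cycle length, but B is shifted: the cycle ends never line up.
print(first_meeting(10, 10, shift_b=3))   # None

# B runs at twice A's frequency: they align at the end of every cycle of A.
print(first_meeting(10, 5))               # 10

# B runs at only 1.5x A's frequency: it now takes two cycles of A (three of B).
print(first_meeting(15, 10))              # 30
```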

That’s maybe a bit theoretical. Let’s look at what this means in reality.

Say the CEO always takes decisions at 10-day intervals, taking the time to sit down and review material. The CTO takes 15 days. If both start on the same day, they meet and sit down together every thirty days. On day 0, both sit down and talk based on information from day 0. On day 10, the CEO decides alone, with information from the CTO from day 0 – so 10-day-old information. On day 15, the CTO sits down alone with input from the CEO from day 10 – a 5-day-old update – and decides. On day 20, the CEO again reviews that and decides alone. Only on day 30 do both sit down and discuss together.
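If you want to replay that cadence, a tiny sketch along the following lines (using the 10- and 15-day intervals from the example) prints who decides on which day and how stale the other side’s latest input is:

```python
ceo_days = range(0, 31, 10)   # CEO reviews and decides on days 0, 10, 20, 30
cto_days = range(0, 31, 15)   # CTO reviews and decides on days 0, 15, 30

for day in sorted(set(ceo_days) | set(cto_days)):
    notes = []
    if day in ceo_days:
        last_cto = max(d for d in cto_days if d <= day)
        notes.append(f"CEO decides on CTO input from day {last_cto} "
                     f"({day - last_cto} days old)")
    if day in cto_days:
        last_ceo = max(d for d in ceo_days if d <= day)
        notes.append(f"CTO decides on CEO input from day {last_ceo} "
                     f"({day - last_ceo} days old)")
    print(f"day {day:2d}: " + "; ".join(notes))
```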

Is that too artificial? Let’s look at a marketing department publishing 10 campaigns a day, while the contribution of marketing to sales leads is analysed once a month based on cumulative monthly data. The contribution of marketing activity to sales can in no realistic sense be attributed to any one of the original marketing campaigns. All that is left is a measure of total marketing activity, which might be budget. So you look at the budget spent on marketing and the impact it had on sales lead generation that month. If other departments also contributed to sales leads, there is no way to perform contribution or attribution analysis. Period. And if that is not possible, you cannot evaluate the efficiency of the campaigns. If the campaigns are run by distinct marketing groups that do not talk to each other and perform differently, you have no information to decide whom to push.

You can assume a stable ROI on every dollar spent on marketing, everything else being equal. But then you factor in politics: your low-performing group wins a few political battles in months 1, 4 and 7, your high performers win a few other months, and so on. The measured ROI will be volatile, even if every group’s ROI is stable. And you have no real means to do contribution or attribution analysis. You might do some hefty research if you collect the political information, but if you are not solving the synchronicity issue you are likely not factoring that into your analysis either, and hence you fly blind.

Your dollar spending on marketing activity is no longer a KPI for lead generation, because the uncertainty in the ROI deteriorates predictability, and hence you have no “indicator”. Or, if you consider a fuzzy impact indicative enough, you still have no key indicator – something that is highly relevant and controllable.
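A minimal, made-up simulation illustrates the last point: two groups with perfectly stable but different ROI, and “politics” shuffling the budget split every month. The group names and the numbers are invented; only the blended monthly figure is observable, and it jumps around even though nothing about either group changed:

```python
import random

random.seed(1)

ROI = {"low_performers": 1.2, "high_performers": 3.0}  # leads per dollar, perfectly stable
BUDGET = 100_000                                        # total monthly marketing budget

for month in range(1, 7):
    # "Politics" decides who wins what share of the budget this month.
    low_share = random.uniform(0.2, 0.8)
    spend = {"low_performers": BUDGET * low_share,
             "high_performers": BUDGET * (1 - low_share)}
    leads = sum(spend[group] * ROI[group] for group in spend)

    # Only the blended monthly figure is observable -- and it is volatile,
    # even though each group's ROI never changed.
    print(f"month {month}: blended ROI = {leads / BUDGET:.2f} leads per dollar")
```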

Back to the CFO and mastery of time

The most quality-assured and at the same time standardized metrics in a business – standardized under accounting principles – are the financials. That is why financial indicators are the KPIs for valuation discussions. But as such, they are not operative KPIs.

The entire goal, if you want to deploy capital efficiently, is to understand your KPI trees. And the KPIs you generate must be generated at (a) the same frequency, and (b) the right frequency – the one that is relevant for controlling the business.
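One way to picture a KPI tree with frequencies attached is a sketch like the following. The `KPI` class, the node names and the measurement frequencies are all hypothetical; the point is the check that flags any child KPI measured less often than the KPI it is supposed to explain:

```python
from dataclasses import dataclass, field

@dataclass
class KPI:
    name: str
    measurements_per_year: float
    children: list["KPI"] = field(default_factory=list)

def check_frequencies(kpi: KPI) -> None:
    """Flag any KPI that is measured less often than the KPI it feeds into."""
    for child in kpi.children:
        if child.measurements_per_year < kpi.measurements_per_year:
            print(f"WARNING: '{child.name}' is measured slower than '{kpi.name}'")
        check_frequencies(child)

# Illustrative tree: revenue is reviewed monthly, driven by weekly lead figures,
# which in turn depend on a conversion rate that is only measured monthly.
revenue = KPI("revenue", 12, children=[
    KPI("qualified leads", 52, children=[
        KPI("campaign conversion rate", 12),   # flagged: measured too slowly
        KPI("website sessions", 365),
    ]),
])

check_frequencies(revenue)
```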

We will go into KPIs in more detail in the next article. But it is crucial to understand that capital allocation – or the decision on what to spend money on in an organization – can only be done if a few simple things are taken care of:

  1. The impact of the spending must be measurable. So you need to break financial impact / KPIs down into operational KPIs that capture your allocation strategy and the activity the spending creates.
  2. The only way to measure and run such a KPI tree is to build the foundation for sober and clear contribution and attribution analysis. If this fails, you cannot measure the impact and tie it to the spending.
  3. Contribution and attribution analysis require a sufficient frequency in capturing metrics to separate events, their contributions and their impact on metrics. If any business activity runs at a significantly higher frequency than the measuring activity or another activity, the attribution/contribution link breaks.
  4. In that case, the system dynamics cannot be modelled and the “system is not identifiable” from a statistical point of view (a minimal sketch of this follows after the list). Or your “noise” increases. Noise being the non-controllable part of a model equation, you lose control over the business. The factors that contribute to your output become less manageable and hence they stop being “KPIs”.
  5. If 4. is the case, you lose control over impact. So you lose manageability.
  6. If 4. and 5. are the case, you lose your ability to compare impact or ROI for every dollar spent under your capital allocation strategy, and you are unable to allocate capital efficiently.
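To make the identifiability point in item 4 tangible, here is a minimal, made-up sketch: 40 campaigns drive leads, but only 12 monthly totals are captured. Infinitely many effect vectors fit the observed data perfectly, so the per-campaign contributions cannot be recovered (all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(42)

n_months, n_campaigns = 12, 40
true_effect = rng.uniform(0.5, 3.0, size=n_campaigns)      # leads per dollar, unknown to us
spend = rng.uniform(0, 100, size=(n_months, n_campaigns))  # spend mix per month
leads = spend @ true_effect                                 # the only thing we observe

# More unknowns (40 campaigns) than observations (12 months): the design
# matrix cannot have full column rank, so the effects are not identifiable.
print(np.linalg.matrix_rank(spend))            # 12, not 40

estimate, *_ = np.linalg.lstsq(spend, leads, rcond=None)
print(np.allclose(spend @ estimate, leads))    # True: the data is fit perfectly...
print(np.allclose(estimate, true_effect))      # False: ...by the wrong per-campaign effects
```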

That is the first part of the issue. Now let’s look at the dynamic aspect over time.

  1. If you cannot measure contributions and KPIs right now, you cannot improve on your metrics. For that you need to vary the processes that generate these metrics and observe changes in the behaviour of the system over time. If you do not have reliable, predictable and manageable metrics, you cannot create initiatives to change the relationships between these metrics.
  2. If all of the above is not in place, you simply cannot manage the company towards a particular goal reflected in metrics. If you want to get your CAC-to-LTV ratio down but have no means to understand the operational contributions, you simply cannot – with sufficient certainty or probability – change how your spending drives the performance of these metrics.

In other words, your entire ability to allocate capital vanishes. With that, there is no way to control and steer your operating model to drive the valuation based on metrics.

Why is this CFO stuff?

It is a blend of COO and CFO activity. Stable and predictable metrics do not emerge from ad-hoc activity. For stable metrics you need processes (COO turf). But even if everything is a process, if you do not solve the synchronicity challenge that allows processes to talk to each other at a reasonable frequency, you cannot have KPI trees. If you cannot have KPI trees, you have absolutely no way of measuring how every dollar spent on a particular cost item contributes to the metric performance of the business. And hence you have no capital allocation. And hence your CFO is back to doing accounting. The “New CFO” paradigm breaks down. No capital allocation implies no control over the operating model. And hence no control over valuations and shareholder value. Bad.
