Computer simulations are vital appraisal tools because we can never forecast with any precision **exactly** how many customers, clients or patients (CCPs) will transfer.

Rules of thumb and the income method, as currently applied by most credentialed appraisers, rely on static models to forecast the future profitability of firms under appraisal. Formulas and the capitalization of earnings method (a variant of the income method) simply assume that the firm's most recent financial performance will continue forever. Another variant of the income method uses at most a five-year forecast and then assumes that all subsequent years will mimic the fifth year, with a constant, never-ending growth rate in profits. All these static models are wildly unrealistic, and their application does a grave disservice to buyers and sellers of closely held firms. The more useful and realistic approach is computer simulation modeling.

We seek a forecasting technique that directly measures the impact of significant risks and normal variations on future expected DE. We know that combining the impact of all such risks and variation on future DE through one discount rate is far too simplistic and crude. The better approach is to utilize computer simulations that directly model both risk and variation.

In a computer simulation, hundreds of alternative future DE paths can be forecast in seconds. The first step in a simulation is to develop a coherent economic model of the company and its competitive environment. A coherent model is one that identifies the causal variables that drive future DE. Here is the general form of the causal model for future DE used in both the Maximum Competitive Advantage (MCA) forecast and the Compete or Buy (COB) forecast under the “buy” option:

Future DE = CCPs transferred in x future patronizations x DE per patronization
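With hypothetical figures (all of the numbers below are illustrative assumptions, not appraisal data), the causal model can be sketched in a few lines of Python:

```python
# Minimal sketch of the causal DE model. All figures are illustrative
# assumptions, not appraisal data.

def future_de(ccps_transferred, patronizations_per_ccp, de_per_patronization):
    """Future DE = CCPs transferred x future patronizations x DE per patronization."""
    return ccps_transferred * patronizations_per_ccp * de_per_patronization

# Example: 400 CCPs transfer, each patronizes the firm 3 times a year,
# and each patronization yields $150 of DE.
print(future_de(400, 3, 150))  # 180000
```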

Computer simulations substitute probability functions for constants in forecasting transfer rates, the number of future patronizations, and the DE per patronization. *This simulation process allows us to directly measure the impact of variation and risk on DE. Since DE determines the maximum value of a CCP base, computer simulations allow us to estimate the impact of specifically identified risk and variation on equity value.*

What are probability functions and how do they work in computer simulations?

A probability function, or more properly a probability distribution function (PDF), is a mathematical expression that describes the expected frequency of certain future events or certain characteristics in a population of interest. These functions are often best understood by the shape of their graphs.

Many of you will be familiar with the most widely used distribution: the normal or bell-shaped curve. This curve describes a wide variety of natural and human phenomena in which events or characteristics cluster most frequently around one point (the mean), with large numbers occurring equally on either side of and near this point. The frequency of occurrence drops off significantly the further one moves from the mean. The normal distribution is but one of many distributions defined mathematically so that all probabilities of events or occurrences can be specified once we know the mean and the variance. The mean and variance are measures of central tendency and dispersion, respectively, and are referred to as parameters. Here is a graph of the normal curve:

In normal distributions the most probable event or most frequently occurring property sits at the top of the curve, and less probable events or less frequently occurring properties are represented symmetrically in the tails of the graph. The numbers on the horizontal axis (-2 to +2) represent standard deviations, which are measures of dispersion from the mean. The percentages indicate the frequency of events occurring in each standard deviation unit. For example, 34.13% of events are expected to fall between the mean and one standard deviation above it, and another 34.13% between the mean and one standard deviation below it. Of course, not all economic or natural phenomena can be modeled accurately using a normal distribution.
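The 34.13% figure can be checked empirically by sampling. Here is a small sketch using only Python's standard library:

```python
import random

# Draw from a standard normal distribution and count the share of draws
# falling between the mean (0) and one standard deviation above it (1).
random.seed(1)
n = 200_000
draws = [random.gauss(0, 1) for _ in range(n)]
share = sum(1 for x in draws if 0 <= x <= 1) / n
print(round(share, 3))  # close to 0.341
```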

Some events occur rarely and these are best modeled using non-symmetrical distributions. Of particular concern to us are the rare, risky events such as the loss of a major local employer or the loss of a key employee. These cannot be accurately modeled by a normal distribution. A better model is called a Bernoulli distribution. Here is a graph of this type of distribution:

In the above graph the “0” represents the non-occurrence of the rare risky event while the “1” represents its occurrence. In this example the risky event is expected to occur 30% of the time. One reason the graphs of the normal and Bernoulli distributions look so different is that the normal distribution models events that can take on a range of continuous values, while the Bernoulli distribution models events that can take on only discrete values. A possible transfer rate can take on any value within a plausible range and thus can be modeled by a normal distribution (or some other continuous distribution). On the other hand, an event like a key employee leaving or a large employer shutting down its operations is a discrete event: it either occurs or it does not. These kinds of events can be modeled by a Bernoulli distribution (or some other discrete distribution). In simulations of equity value we need both types.
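A Bernoulli variable is simple to simulate. This sketch uses the 30% occurrence probability from the example above; the event itself is purely illustrative:

```python
import random

def bernoulli(p):
    """Return 1 if the rare risky event occurs, 0 otherwise."""
    return 1 if random.random() < p else 0

# Over many trials the event should occur about 30% of the time.
random.seed(7)
trials = [bernoulli(0.30) for _ in range(100_000)]
print(round(sum(trials) / len(trials), 2))  # approximately 0.3
```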

One way to think about a computer simulation is to picture a physical lottery drawing that uses a very large spinning drum to ensure randomness of selection. Picture the drum filled with 5,000 identically shaped ping pong balls, each with a number on it. With this physical setup every ping pong ball has an equal chance of being chosen after the drum has been spun several times. Note that there are many ways we can assign numbers to the ping pong balls.

Now the natural thing to do if we were performing a lottery with a cash prize would be to label each ping pong ball with a unique number. But what if instead the point of the lottery was to *model* the variation of some natural or economic phenomenon? Let’s suppose we wanted to model the natural variation in men's heights.

We know from abundant, reliable data the average height of men in the population at large. We know the range, and we know that by far the largest proportion of heights fall fairly close to the average. In fact we know that a normal probability distribution function is the best way to model this kind of property. So how would we go about numbering the ping pong balls? Clearly we would avoid numbers less than 48” or higher than 96”. We would label most balls at or near 70” (the average), and few balls near our upper and lower limits. In fact, following the above graph of the normal distribution, we would number about 34% of the balls within one standard deviation above the mean and about 34% within one standard deviation below it.

Let’s suppose we do not have the time or resources to draw out and record all 5,000 ping pong balls. If we are careful in labeling our balls we would expect that after we drew and recorded say 100 draws from the drum the distribution of numbers would be very close to the distribution of heights of 100 men we might select at random from the population at large. Furthermore, we can be reasonably sure that the shape of the distribution of recorded heights would mirror fairly closely the shape of the distribution of men’s heights in the larger population.
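The ping pong ball lottery translates directly into code. This sketch assumes the 70” mean from above and, purely for illustration, a standard deviation of 3”:

```python
import random
import statistics

# 100 draws from the "drum": each draw is a height sampled from a normal
# distribution with mean 70" and an assumed standard deviation of 3".
random.seed(42)
heights = [random.gauss(70, 3) for _ in range(100)]

# The sample's mean and spread should mirror the population fairly closely.
print(round(statistics.mean(heights), 1))
print(round(statistics.stdev(heights), 1))
```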

We can program a computer to imitate the output of this kind of physical lottery setup. Computers can select thousands of random numbers and record the results far more quickly than a physical drawing and recording of numbered ping pong balls from a well spun drum. Why would we want to do this? And specifically why would a buyer be interested in using such simulations in deciding how much to pay for the equity interest in a closely held firm?

**Since the future DE derivable from a firm is subject to variation and risk, using lottery-like simulations based on probability distributions that mirror the variations and risks a buyer will encounter can serve as a means of accurately forecasting the possible different levels of DE from the investment.** In a computer simulation every variable subject to material variation and risk can be assigned its own virtual drum of appropriately numbered ping pong balls. Each simulation run represents a future path for the DE of the investment.

When a computer simulation calculates hundreds of possible future levels of DE, certain levels will occur more frequently than other levels, just as certain heights of men drawn from samples based on a normal probability distribution will appear more frequently than others. Those levels of DE that occur most often in a simulation run are likelier to be realized than those levels that occur less frequently. In fact, if we have set up a coherent model of DE and chosen probability functions that mirror the risks and variation that an investor is likely to encounter, the investor can use the mean or median of the outcomes as a basis for negotiating equity prices. *This mean or median DE does not have to be further discounted for risk because risk was already accounted for in the selection of the probability functions.*
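Putting the pieces together, here is a hedged sketch of a full DE simulation combining a continuous distribution (the transfer rate) with a discrete one (a rare risky event). Every parameter below (500 CCPs, a 70% mean transfer rate, a 5% event probability, a 50% DE loss) is an illustrative assumption:

```python
import random
import statistics

def simulate_de():
    """One future DE path: the causal model with probability functions."""
    transfer_rate = max(0.0, random.gauss(0.70, 0.10))  # continuous variation
    ccps_transferred = 500 * transfer_rate
    de = ccps_transferred * 3 * 150  # 3 patronizations/year x $150 DE each
    if random.random() < 0.05:       # rare risky event (Bernoulli, p = 0.05)
        de *= 0.5                    # assumed 50% loss of DE if it occurs
    return de

random.seed(0)
runs = [simulate_de() for _ in range(1_000)]
print(round(statistics.mean(runs)))    # central estimates a buyer could use
print(round(statistics.median(runs)))  # as a basis for an equity price
```

Because the risk event is already built into the draws, the resulting mean or median needs no further risk discount, which is the point made in the paragraph above.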

Our appraisal approach requires forecasts of DE under two measures of comparative advantage: the COB forecast if the optimal buyer is a firm, or the differential compensation forecast if the optimal buyer is an employee. Both forecasts allow the optimal buyer to determine if the net economic benefits are sufficient to compensate them for assuming the burdens of owning and operating the firm. These forecasts are based on an economic model of the firm’s local competitive and labor market.

The economic model is a hybrid of causal and probabilistic variables that can be developed in an Excel spreadsheet. Constants can be used where there is a high degree of certainty in the value of the parameter and very little risk or variation expected in the future. A constant can also be used where any variation or risk would have little impact on future DE.

In an Excel spreadsheet each variable that drives DE is included in a separate cell. Some cells contain probability functions while others contain fixed constants (usually placeholders for averages). These probability functions allow us to capture the inherent variability and uncertainties in forecasting future DE. Several simulation software packages now integrate with Excel. Click here for a comprehensive example of a computer simulation.

For more computer simulation resources check out the Probability Management Organization website.

Copyright 2018 Michael Sack Elmaleh