# On finite vs countable additivity and Cox/Jaynes/DeFinetti etc

There is some concern in the back of people's minds about Cox/Jaynes Bayes and things like continuous parameter spaces, infinite dimensional models, and so forth. It comes down to questions about finite additivity. About the only good thing for me that came out of Dan Simpson posting on Andrew's blog about the "Gay Face" neural net baloney that came out this week is that he indirectly pointed me to a guest post on Xian's blog from a few years back. (To be clear, it's the gay-face "research" baloney, not Dan's post, that I object to.)

So, here I want to try to organize some thoughts I have on finite-additivity type stuff. I've mentioned in the past why I don't think this is actually an issue for science.

As regular blog readers know, I'm a fan of the IST (Internal Set Theory) form of Nonstandard Analysis. I think it connects models to formalism in a way that is transparent, and in a way that "typical" measure theory type math doesn't.

So, let's try to define some probability foundations in a Nonstandard Analysis setting. (Warning: I will probably do a poor job of this compared to professionals, but if I screw it up I am pretty confident it will be both detectable and fixable by a more serious formalist.)

First, let's work with a simple continuous valued random variable.

Define a set of points $x_i = i\,dx$ where $dx = 1/N$, $i$ ranges over $\{-N^2, \dots, N^2\}$, and $N$ is a nonstandard integer (so $dx$ is infinitesimal and the grid covers every limited real number).

Now define a function $f(x_i)$ on the grid which is non-negative and satisfies $\sum_i f(x_i)\,dx = 1$.

Suppose the standardization $\mathrm{st}(f)$ exists as a standard function; then it is a standard probability density function. The proof is simple: it's non-negative, it's integrable, and its integral is 1. Every standard non-negative integrable function whose integral is 1 is a probability density, by definition of a probability density.
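A computer only gives us a large standard $N$, not a nonstandard one, but the bookkeeping in this construction can be sketched numerically. Here's a finite-$N$ analogy (the value of $N$ and the normal-shaped choice of $f$ are purely illustrative):

```python
import numpy as np

# Finite-N analogy of the nonstandard grid: in IST, N would be a
# nonstandard integer, dx = 1/N an infinitesimal, and the grid
# x_i = i*dx for i in {-N^2, ..., N^2} would cover every limited real.
N = 1000
dx = 1.0 / N
i = np.arange(-N**2, N**2 + 1)
x = i * dx

# A non-negative grid function, normalized so that sum_i f(x_i) dx = 1.
g = np.exp(-x**2 / 2)
f = g / (g.sum() * dx)

print(f.sum() * dx)   # 1.0 up to floating point error
# Here the "standardization" of f is the standard normal density;
# e.g. compare the value at x = 0 with 1/sqrt(2*pi):
print(f[N**2], 1 / np.sqrt(2 * np.pi))
```

The grid sum plays the role of the nonstandard sum; its standard part is the ordinary integral of the standard density.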

Now, suppose instead that $f$ is a nonstandard function, that is, it takes on nonstandard values in such a way that $f(x_i)\,dx$ is appreciable for some values of $x_i$, but suppose this occurs only for $x_i$ infinitesimally close to some standard values (i.e. the "delta functions" are at standard locations). Then there is no standardization of this function. However, we can define a probability measure $P$ on $\mathbb{R}$ such that if $s$ is a standard set of points in $\mathbb{R}$,

$$P(s) = \mathrm{st}\left(\sum_{x_i \in s} f(x_i)\,dx\right).$$

It's straightforward to show that this is a countably additive probability measure. First off, we're adding up non-negative values, and the sum of $f(x_i)\,dx$ over the whole grid equals 1, so the outcome is always in $[0,1]$. If $s_1$ and $s_2$ are disjoint standard sets, then $P(s_1 \cup s_2) = P(s_1) + P(s_2)$ by the property of the sum operation that defines $P$ in terms of $f$: because the sets are disjoint, they partition the sum, and we don't double-count any of the nonstandard grid points. The same holds for unions of $n$ disjoint standard sets for any standard integer $n$, and by Transfer it holds for all $n$, including nonstandard $n$. To see that this implies countable additivity, we use proof by induction: in standard mathematics we have a sequence of disjoint subsets $s_1, s_2, \dots$ whose full union is the full set $s$; our equality is true for every prefix of the sequence by the partitions-the-nonstandard-sum argument, and hence by induction it is true for the whole sequence of subsets.
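A finite-$N$ sketch of this measure, with half the mass in a delta-function-like spike at $x = 0$ (so $f(x_i)\,dx$ is appreciable there) and half in a smooth normal part; the half/half split and spike location are just illustrative choices:

```python
import numpy as np

N = 1000
dx = 1.0 / N
x = np.arange(-N**2, N**2 + 1) * dx

# Half the mass as a spike at x = 0: f there is of order 1/dx, so
# f(0)*dx = 0.5 is appreciable. The other half is a smooth normal part.
g = np.exp(-x**2 / 2)
f = 0.5 * g / (g.sum() * dx)
f[N**2] += 0.5 / dx

def P(mask):
    # P(s) = sum over grid points in s of f(x_i) dx
    return (f[mask] * dx).sum()

s1 = x < 0    # the set (-inf, 0)
s2 = x >= 0   # the set [0, inf), which contains the spike
print(P(s1), P(s2))               # roughly 0.25 and 0.75
print(P(s1) + P(s2), P(s1 | s2))  # additive: both are 1 up to rounding
```

The spike has no standardization as a function, but the measure $P$ it induces on standard sets is perfectly well behaved.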

So, finite additivity may be problematic as a foundation for Bayesian statistics on continuous parameter spaces, but nonstandard-finite additivity is sufficient to give measure theory, and, to boot, in the nonstandard case every standard measure has a nonstandard density.

Now, suppose we're dealing with infinite dimensional spaces, like the sample paths of gaussian processes. There may not be any standard Lebesgue measure on infinite dimensional space, but there is a perfectly fine Lebesgue measure on every finite dimensional space of standard positive integer dimension $n$, and therefore by Transfer there is a Lebesgue measure on every nonstandard positive integer dimension $n$ as well. Let $F$ be the graph of a sample path on a nonstandard grid of $N$ points in $x$, with $N$ a nonstandard integer, so that the vector of values $f(x_1), \dots, f(x_N)$ lives in a space of nonstandard dimension. Let $F$ have a nonstandard density defined by the nonstandardly-discretized multivariate normal distribution. Then the standardization of some realization is a gaussian process sample path (because every finite standard set of $x$ values has values with a multivariate gaussian distribution, which is the definition of a gaussian process).
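Again, a computer only handles standard finite dimension, but the same discretization can be sketched: a multivariate normal over the grid values, whose covariance (a squared-exponential kernel here, purely an illustrative choice) makes the realization look like a smooth gaussian process path:

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite analogy of the nonstandardly-discretized gaussian process:
# the vector of values (f(x_1), ..., f(x_n)) on a grid is multivariate
# normal; in the IST version, n would be a nonstandard integer.
n = 200
x = np.linspace(0.0, 1.0, n)
K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / 0.1**2)
K += 1e-8 * np.eye(n)               # tiny jitter for numerical stability
L = np.linalg.cholesky(K)
path = L @ rng.standard_normal(n)   # one realization of the sample path

# Any standard finite subset of the grid values is jointly gaussian --
# the defining property of a gaussian process:
subset = path[[10, 50, 150]]
```

The restriction to any finite set of grid points is exactly a finite-dimensional multivariate normal, which is what survives standardization.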

The problem with standard measure theory is that when you build your formalism on taking limits, things change from one type of thing to another under certain limits. For example, the normal(0,s) density for $s$ an infinitesimal is a perfectly fine nonstandard density function, but in standard mathematics the limit as $s \to 0$ is a "delta function measure," which is to say a measure over sets, not a function $f(x)$.