[BOOK ANNOUNCEMENT] Fuzziness and Probability

S. F. Thomas (sthomas@decan.com)
Mon, 4 Dec 1995 22:43:40 +0100



The book "Fuzziness and Probability" by S. F. Thomas is now in print.

1995; 320 pages; ISBN 0-8050-2356-0; soft cover; $29.95

Publisher: ACG Press, PO Box 782948, Wichita KS 67278-2948, USA

Sales: CyberScript, Inc.
Email: David.King@acginc.com


o L.A. Zadeh: "Fuzziness and Probability" is a must-read for anyone
who is interested in developing a better understanding of
methods for dealing with uncertainty and imprecision. What
Dr. Thomas has to say is highly original, insightful and ...

o Brian Gaines: There have been many over-simplistic accounts of
the relations between fuzziness and probability. This is the
first clear and correct tutorial exposition of the issues. It
will be of great value to theorists and practitioners alike.

o Anonymous Reviewer: The work is brilliant... Clearly, Dr. Thomas
is an outstanding thinker ... even if one may take issue
with some of his conclusions.


The major claims of the book are, within a unified treatment, to have:

o found the extended likelihood calculus that eluded
Fisher and generations of statisticians since

o corrected the mistake in the Zadehian
fundamentals of fuzzy set theory that put Zadeh at odds with
Aristotle -- the laws of excluded middle and contradiction
are restored, without undoing the essential fuzziness of
natural language descriptors

o corrected the mistake in Bayesian inference
theory whereby universals (uncertainty about models) and
particulars (uncertainty about the chance of occurrence of
individual events) are treated symmetrically, while keeping the
essential idea that sources of evidence -- prior belief plus
experimental data -- may be combined to give a posterior
expression of belief, scientific or otherwise, about the
universals involved

o developed a calculus for the combination of evidence, and
for group decision-making

o found the solution to the nagging foundational
problem of measurement theory -- how to address errors of
measurement within the theory itself. First you change the
paradigm from one of assumed precision of measurement to one
where imprecision (fuzziness) is the general case. It is
easy for precision to fall out of a paradigm of imprecision,
but quite difficult to make imprecision fall out from within
a paradigm of precision.
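
As background to the second claim above: under Zadeh's standard
operations (complement 1 - mu, union as max, intersection as min),
the laws of excluded middle and contradiction fail for any
membership grade strictly between 0 and 1. A minimal Python sketch
of that standard behavior (the book's proposed repair is not
reproduced here):

```python
# Zadeh's standard fuzzy-set operations on membership grades in [0, 1].
def f_not(mu):      return 1.0 - mu        # complement
def f_or(mu, nu):   return max(mu, nu)     # union
def f_and(mu, nu):  return min(mu, nu)     # intersection

mu = 0.25  # a genuinely fuzzy membership grade, strictly between 0 and 1

# Law of excluded middle: A OR not-A should have membership 1 everywhere.
excluded_middle = f_or(mu, f_not(mu))   # max(0.25, 0.75) = 0.75, not 1.0

# Law of contradiction: A AND not-A should have membership 0 everywhere.
contradiction = f_and(mu, f_not(mu))    # min(0.25, 0.75) = 0.25, not 0.0

print(excluded_middle, contradiction)   # prints: 0.75 0.25
```

Both classical laws hold only at the crisp extremes mu = 0 and
mu = 1, which is the departure from Aristotle that the book's
second claim addresses.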

There are other claims besides, but these are the main ones,
and the last one is key: we are dealing here with a paradigm
change. At the foundations of science, where we are addressing
issues of measurement -- including the measurement of attributes
of probability, belief, utility -- and issues of scientific
inference, where again we must confront issues of probability and
its measurement, the paradigm in place is one of precision, the
idea that the total ordering axiom applies, that sample spaces may
be continuous, that points within sample spaces may be discerned
with absolute levels of precision, and that they may be measured
in principle to an infinite number of decimal places. Once this
notion is relaxed, everything changes. One then finds oneself at
a point where everything in science seems to converge (or from
where everything diverges) and simultaneously in the realm of
inference theory, probability theory, measurement theory, fuzzy
set theory, decision theory, even the foundations of deductive
logic, of the philosophy of truth, of semantics, and so forth. In
much the same way that Einstein's theory of relativity changed the
analytic paradigm for the theory of motion by putting the observer
into the picture, theories of inference regarding real-world
phenomena change when the imprecision inherent in the use of
language for the conveying of measurement reports -- the language-
use phenomenon itself -- is explicitly brought into the picture.
And, as with Einstein's theory of relativity, which did not
diminish the usefulness of the Newtonian theory in most everyday
applications, the change of paradigm here explored will not diminish
the usefulness of the paradigm of precision in those application
areas where "adequate" precision is capable of being achieved for
the attributes of concern. Equally, however, the change of
paradigm appears necessary in areas where the attributes of concern
(e.g., subjective probability, utility) are not susceptible of
sufficiently precise measurement/description.

While the breadth of coverage seems immodest to say the
least, that's where I was led, purely serendipitously, as I have
indicated in the Preface and Introduction. It's all due to Zadeh,
as the idea of fuzziness started it all, and is at the center of
the whole thing. But surprisingly, the connectedness of all these
things seems to have eluded Zadeh, who, in insisting on the
distinction between probability and fuzziness, appears to have cut
himself off from exploring the intimate relation between the two.
I think I have exposed the connection, which, far from weakening
the power and beauty of Zadeh's master stroke, strengthens it even
further.

S.F. Thomas