# Papers by J.J.BUCKLEY and Y.HAYASHI

Bernhard Schmidt (schmidtb@Informatik.Uni-Marburg.de)
Fri, 26 Jan 1996 13:34:47 +0100

In December I posted a search for a paper about function
approximation by fuzzified neural networks written by J.BUCKLEY
and Y.HAYASHI.
I received a great number of answers. Thanks to everybody
for helping me out.

To summarize the answers I quote from them:

The paper I was searching for was apparently the following:

(*) J.J.BUCKLEY, Y.HAYASHI:
Can fuzzy neural networks approximate continuous fuzzy
functions?, Fuzzy Sets and Systems 61, pp. 43 - 52, 1994.

Fuzzy systems can approximate ALL continuous and measurable
functions on compact domains, and they do so if the functions are
bounded. (Constructive proof by B.KOSKO; KOSKO also shows how to use
neural or other statistical schemes to learn the rules from data.)
FRED WATKINS proved that no additive fuzzy system can exactly
represent the product function f(x,y) = xy on any domain that
includes the origin.
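To make KOSKO's construction concrete, here is a minimal sketch of an
additive fuzzy system (a standard additive model) approximating
f(x) = x^2 on [0,1]. The rule placement, widths, and all names are my
own illustrative assumptions, not taken from the paper:

```python
# Sketch of an additive fuzzy system (Kosko-style SAM) approximating
# f(x) = x^2 on [0, 1]. One rule per center; the THEN-part centroid
# of each rule is placed at f(center). All choices here (11 rules,
# triangular sets of width 0.1) are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, with support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def sam(x, centers, width, f):
    """Centroid defuzzification: weighted average of rule centroids."""
    num = den = 0.0
    for c in centers:
        a = tri(x, c - width, c, c + width)
        num += a * f(c)
        den += a
    return num / den if den > 0 else 0.0

f = lambda x: x * x
centers = [i / 10 for i in range(11)]   # 11 overlapping rule patches
errors = [abs(sam(i / 100, centers, 0.1, f) - f(i / 100))
          for i in range(101)]
print(max(errors))                      # small; shrinks as rules are added
```

Adding more (narrower) rules drives the error toward zero, which is the
content of the approximation theorem on compact domains.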

Regular neural networks (MLPs in this case) can approximate all
continuous functions to any degree of accuracy (e.g. G.CYBENKO,
K.FUNAHASHI, R.HECHT-NIELSEN, etc.). Such networks are called
universal approximators.
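The building block behind CYBENKO's result is that finite sums of
sigmoids sigma(w*x + b) are dense in the continuous functions on a
compact interval. A hand-built sketch (the steepness k = 50 and the
interval are my own illustrative choices): two steep sigmoids combine
into a "bump", and sums of such bumps can approximate any continuous
target.

```python
# Difference of two shifted steep sigmoids approximates the indicator
# of [lo, hi] -- the basic bump from which sigmoid sums are built.
import math

def sigma(t):
    return 1.0 / (1.0 + math.exp(-t))

def bump(x, lo, hi, k=50.0):
    # ~1 inside [lo, hi], ~0 outside; sharper as k grows.
    return sigma(k * (x - lo)) - sigma(k * (x - hi))

print(bump(0.5, 0.3, 0.7))   # near 1 inside the interval
print(bump(0.0, 0.3, 0.7))   # near 0 outside the interval
```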

Are fully fuzzified fuzzy neural nets
(FNN, HFNN; see e.g. (*)) universal approximators?

If FZ is the set of all triangular fuzzy numbers, we can define
F_all as the set of all fuzzy functions f: FZ^n -> U, where U is a
subset of FZ^m.
FNNs in which the operators are fuzzified by means of the extension
principle (FNN_ext) CANNOT approximate all fuzzy functions in
F_all, because such an FNN is a monotonic mapping (*).
To show this, J.J.BUCKLEY and Y.HAYASHI gave the non-monotonic
triangular fuzzy function f(x_m,x_l,x_r) = (1/x_m, 1/x_l, 1/x_r).
If we restrict F_all to those fuzzy functions that are themselves
fuzzified on the basis of the extension principle (F_ext), we can
approximate them to any degree of accuracy by FNN_exts. The class of
fuzzy functions in F_ext is identical to the class of FNN_exts.
(Shown by T.FEURING, W.M.LIPPE.)
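The extension-principle reciprocal above can be illustrated on
alpha-cuts. A small sketch, with the interval endpoints and the example
number (1, 2, 4) chosen by me for illustration: for a positive
triangular fuzzy number the endpoints of each alpha-cut SWAP under
t -> 1/t, which is exactly the non-monotonic behaviour that a monotonic
FNN_ext cannot reproduce.

```python
# Extension-principle reciprocal of a positive triangular fuzzy number
# (l, m, r) with 0 < l <= m <= r, computed exactly on alpha-cuts.

def alpha_cut(tfn, a):
    """Alpha-cut of a triangular fuzzy number as an interval."""
    l, m, r = tfn
    return (l + a * (m - l), r - a * (r - m))

def reciprocal_cut(tfn, a):
    """Alpha-cut of 1/x: invert and SWAP the endpoints (t -> 1/t is
    decreasing on the positive reals)."""
    lo, hi = alpha_cut(tfn, a)
    return (1.0 / hi, 1.0 / lo)

x = (1.0, 2.0, 4.0)
print(reciprocal_cut(x, 0.0))   # support of 1/x: (0.25, 1.0)
print(reciprocal_cut(x, 1.0))   # core of 1/x: (0.5, 0.5)
```

Note that 1/x is no longer triangular (its sides are curved); only its
alpha-cuts are computed exactly here.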

FNNs which use other operators in addition to addition and
multiplication to compute the output are called hybrid fuzzy neural
nets (HFNN). In (*) it is shown that certain non-continuous HFNNs are
universal approximators. (Apparently the proof is non-constructive,
so no advice is given on how to actually build such a net.)

Bernhard Schmidt, University of Marburg, Germany