ANFIS training

From: Tom De Muer (tom.demuer@skynet.be)
Date: Tue Sep 25 2001 - 15:21:00 MET DST


    Hello,

    I've read R. Jang's article about the ANFIS architecture. If I'm
    correct, the network is (or can be) trained with gradient descent. How
    does this work? I think it has to be something like this:
    - make f(input,a,b,c,...)
    - derive df/da (partial derivatives)
    - the adaptation of a is delta_a = eta * (target - f(input,a,b,c,...)) * df/da
      (a sketch of this update follows below)
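
    Concretely, I imagine something like this, a minimal sketch for a single
    Gaussian membership function with a trainable center (the names f, c, s,
    eta and train_step are mine, not Jang's):

    import math

    def f(x, c, s):
        # Gaussian membership value for input x, center c, width s.
        return math.exp(-((x - c) / s) ** 2)

    def df_dc(x, c, s):
        # Partial derivative of f with respect to the center c.
        return f(x, c, s) * 2.0 * (x - c) / s ** 2

    def train_step(x, target, c, s, eta=0.1):
        # One gradient-descent step on c for a single (x, target) pair:
        # delta_c = eta * (target - f) * df/dc
        return c + eta * (target - f(x, c, s)) * df_dc(x, c, s)

    # Example: pull the center toward an input whose target membership is 1.
    c = 0.0
    for _ in range(1000):
        c = train_step(x=1.0, target=1.0, c=c, s=1.0)
    print(c)  # close to 1.0, where the error (target - f) vanishes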

    Is there another way? I was thinking of building some elementary building
    blocks, like a multiplier, a summation, ..., which are connected and then
    trained by backpropagation. This works fine for relatively simple functions,
    but my ANFIS network diverges from the solution. Is it possible to connect
    arbitrary building blocks (a pure tree, no loops) and train them by
    backpropagation, or are there limitations?

    Thanks,
    Tom De Muer



