Re: Please help in Neural Network Implementation !!


Subject: Re: Please help in Neural Network Implementation !!
From: Chin Wei Chuen (eng60235@nus.edu.sg)
Date: Tue Feb 01 2000 - 16:54:43 MET


What you are doing interests me very much. I'd like to know more details
of your experiment. Please read my inserted comments below carefully. I
hope I can provide you with some helpful comments later. Thanks!

Joe Smith wrote:

> Hi,
>
> I'm interested in developing a neural network in Perl. I've read
> the FAQs for this group as well as many of the resources they point
> to, and most of the messages of the past month or so ... however,
> I still don't understand how to take it from theory to practice.
>
> Let me explain what I would like to use it for and what I'm currently
> doing:
>
> I have to process thousands of matrices, each with 25-40 predictors
> or independent variables (columns), 500 or so cases (rows), and one
> dependent or desired-output vector (column) corresponding to each
> case.
>
> 1) Each matrix is the result of a factorial analysis (principal
> components analysis and data dimensionality reduction) carried out
> by means of Jacobi transformations. Basically, I created a series
> of matrices based on clusters of common variance of the original
> sample data.

How did you do the Jacobi transformation?
What exactly are these matrices?
You said that you did a factorial analysis; how are the 'clusters of
common variance' related to it?
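
In case it helps us pin down terminology: by 'Jacobi transformations'
I would understand the Jacobi eigenvalue algorithm, i.e. repeatedly
zeroing the largest off-diagonal entry of the (symmetric) covariance
or correlation matrix with plane rotations, and then projecting the
data onto the leading eigenvectors. Below is a minimal sketch of what
I mean, in Python/NumPy for brevity (the synthetic data and all names
are mine, just placeholders for your real matrices; a Perl version
would look analogous).

import numpy as np

def jacobi_eigen(A, tol=1e-12, max_iter=10000):
    """Diagonalize a symmetric matrix A with Jacobi plane rotations.
    Returns (eigenvalues, V) with eigenvectors in the columns of V."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    V = np.eye(n)
    for _ in range(max_iter):
        off = np.abs(A - np.diag(np.diag(A)))
        p, q = np.unravel_index(off.argmax(), off.shape)
        if off[p, q] < tol:            # off-diagonal part is numerically zero
            break
        # rotation angle chosen so that the rotated A[p, q] becomes 0
        theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
        c, s = np.cos(theta), np.sin(theta)
        J = np.eye(n)
        J[p, p] = c; J[q, q] = c
        J[p, q] = s; J[q, p] = -s
        A = J.T @ A @ J                # full multiply for clarity only;
        V = V @ J                      # real code updates rows/cols p, q
    return np.diag(A), V

# PCA on one (cases x predictors) matrix: diagonalize its covariance
# matrix and keep the components carrying the most variance.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 30))         # stand-in for one of your matrices
C = np.cov(X, rowvar=False)
evals, evecs = jacobi_eigen(C)
order = np.argsort(evals)[::-1]        # sort components by variance
scores = (X - X.mean(axis=0)) @ evecs[:, order[:10]]   # first 10 components

Is that roughly what you did, and are the 'clusters of common
variance' groups of variables loading on the same components, or
something else?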

>
>
> 2) I run a multivariate regression analysis on these matrices,
> using matrix algebra to create a series of weights that allow me to
> predict y given X (y(j) = b(0) + b(1)x(j,1) + ... + b(k)x(j,k)).
> This procedure gives me good results, with an average adjusted
> multiple correlation coefficient of 0.9987.

How is the 'average multiple correlation coeff' calculated?
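
To be sure we mean the same quantity: this is how I would compute an
adjusted multiple correlation coefficient for one matrix and then
average it over all of them. Again only a NumPy sketch under my own
assumptions; 'matrices' is a placeholder for however you store your
(X, y) pairs.

import numpy as np

def adjusted_r2(X, y):
    """OLS fit of y on X (plus intercept); returns the adjusted R^2."""
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])            # design matrix with intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)    # weights b(0)..b(k)
    resid = y - Xd @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot                       # multiple R^2
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)  # penalize for k predictors

# "average adjusted multiple correlation coefficient", as I read it:
# avg = np.mean([np.sqrt(adjusted_r2(X, y)) for X, y in matrices])

Do you average the coefficient R (the square root) or R^2 itself?
With a value like 0.9987 the two are close, but it matters when
comparing matrices.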

>
>
> My problem lies in the fact that I need to crunch every matrix
> individually in order to predict it. Even though all the matrices
> are a product of the same process, and I should therefore be able
> to predict all of them by processing just one matrix, the beta
> weights that result from the analysis are different for every
> matrix. As you can imagine, these calculations take forever and are
> not easy to parallelize, since each step depends on the previous
> one.

What are you trying to predict? The correlation coeffs between your
predictors or something else?
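
One side remark: if every matrix comes with its own y column, the
per-matrix regressions look completely independent of one another, so
they should parallelize trivially, with one fit per worker process,
unless there is a dependency between matrices that I am missing from
your description. A rough sketch (synthetic stand-ins for your real
data; the helper name fit_one is mine):

import numpy as np
from multiprocessing import Pool

def fit_one(Xy):
    """Fit one (X, y) matrix on its own and return its beta weights."""
    X, y = Xy
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # stand-ins for the real data: 8 sets of 500 cases x 30 predictors
    matrices = [(rng.normal(size=(500, 30)), rng.normal(size=500))
                for _ in range(8)]
    with Pool() as pool:                     # one worker per CPU by default
        betas = pool.map(fit_one, matrices)  # fits run in parallel
    print(len(betas), betas[0].shape)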

>
>
> I would like to know from you folks in this group what kind of NN
> would best suit this problem: one that would learn how to solve all
> the matrices without having to crunch each of them, and that would
> do this in a parallel rather than a serial manner.

It's hard to answer this question unless you're more specific about your
goal.

>
>
> If I understand the information I've gathered on this group and in
> the FAQs, there are several types of network designs. I would
> imagine that we would need a probabilistic approach to this
> problem, perhaps using Bayesian inference to unify the results into
> a common set of weights by calculating the similarities of the
> matrices and their corresponding weights.
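
If by 'unify the results into a common set of weights' you mean
collapsing the per-matrix beta vectors into one shared vector, then
before reaching for full Bayesian inference it might be worth trying
a simple precision-weighted average of the betas as a baseline. This
is only crude fixed-effects pooling, not a real hierarchical model,
and the names below (betas, beta_vars) are placeholders for whatever
your regression step produces.

import numpy as np

def pool_betas(betas, beta_vars):
    """Crude precision-weighted pooling of per-matrix regression weights.

    betas     : (m, k) array, one row of beta weights per matrix
    beta_vars : (m, k) array of estimated sampling variances of the betas
    Returns one 'common' weight vector of length k.
    """
    B = np.asarray(betas, dtype=float)
    W = 1.0 / np.asarray(beta_vars, dtype=float)   # precisions
    return (W * B).sum(axis=0) / W.sum(axis=0)     # inverse-variance mean

How much a single pooled weight vector degrades each matrix's fit
would tell you whether the matrices really are similar enough to be
handled by one common model, neural network or otherwise.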
>
> Please help ...
>
> Thank you,
>
> Joe Smith
> jsmith@currentreviews.com



