IS71083A Financial Data Modelling
Coursework Assignment
Implementing an Incremental (online) Training Algorithm
for Density Neural Networks
Design and implement an incremental backpropagation algorithm for training density neural networks
(DNNs). A DNN computes not only the mean of the target distribution but also its variance, through a
second output node. You therefore have to develop the backpropagation algorithm to train all weights
leading to each of the two output nodes of the DNN shown in the figure below. Start the training
process from plausible randomly generated weights. Once the training algorithm is designed, apply it
to model and forecast a time series generated with the Mackey-Glass equation (available in the
lecture handouts). Use more lagged inputs and hidden units if necessary to achieve better predictions.
[Figure: DNN architecture with lagged inputs, a hidden layer, and two output nodes for the mean and the variance.]
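
As a concrete starting point, the Mackey-Glass series and the lagged training patterns could be
prepared along the following lines. This is only a sketch: the parameter values (beta = 0.2,
gamma = 0.1, n = 10, tau = 17), the unit-step Euler discretisation, the series length N and the
number of lags L are illustrative assumptions and may differ from the version in the lecture handouts.

% Mackey-Glass time series via a unit-step Euler discretisation (illustrative parameters).
beta = 0.2; gamma = 0.1; n = 10; tau = 17;
N = 1200;                      % number of samples to generate
x = zeros(N, 1);
x(1:tau+1) = 1.2;              % constant history as the initial condition
for t = tau+1 : N-1
    x(t+1) = x(t) + beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t);
end

% Lagged input patterns: predict x(t+1) from the previous L values.
L = 4;                         % number of lagged inputs (increase if needed)
P = N - L;                     % number of training patterns
X = zeros(P, L);  y = zeros(P, 1);
for p = 1:P
    X(p, :) = x(p : p+L-1)';
    y(p)    = x(p+L);
end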
Design a working prototype of the incremental backpropagation algorithm for DNN density networks in
Matlab. The prototype should include data structures for the input-to-hidden and hidden-to-output
connections, and loops for the forward and backward passes.
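
A minimal Matlab sketch of such a prototype is given below; it continues from the Mackey-Glass
sketch above and reuses X, y, L and P. It assumes one hidden layer of tanh units, a linear mean
output, an exponential activation on the variance output so that it stays positive, and the Gaussian
negative log-likelihood E = 0.5*log(s2) + 0.5*(y - mu)^2/s2 as the per-pattern error. The
hidden-layer size, learning rate and number of epochs are illustrative values only.

% Incremental (online) backpropagation for a density network with mean and variance outputs.
H   = 8;                       % hidden units (increase if predictions are poor)
eta = 0.005;                   % learning rate
epochs = 50;

% Data structures for the connections, initialised with small random weights.
W1 = 0.1*randn(H, L);  b1 = 0.1*randn(H, 1);   % input-to-hidden weights and biases
wm = 0.1*randn(H, 1);  bm = 0.1*randn;         % hidden-to-mean-output weights and bias
wv = 0.1*randn(H, 1);  bv = 0.1*randn;         % hidden-to-variance-output weights and bias

for epoch = 1:epochs
    for p = 1:P                % incremental: weights are updated after every pattern
        xp = X(p, :)';  yp = y(p);

        % Forward pass.
        h  = tanh(W1*xp + b1);         % hidden activations
        mu = wm'*h + bm;               % mean output (linear)
        s2 = exp(wv'*h + bv);          % variance output (exponential keeps s2 > 0)

        % Backward pass: gradients of E with respect to the pre-activations.
        dmu = -(yp - mu)/s2;                   % mean output node
        dv  = 0.5*(1 - (yp - mu)^2/s2);        % variance output node
        dh  = (wm*dmu + wv*dv) .* (1 - h.^2);  % hidden nodes (tanh derivative)

        % Online weight updates.
        wm = wm - eta*dmu*h;    bm = bm - eta*dmu;
        wv = wv - eta*dv*h;     bv = bv - eta*dv;
        W1 = W1 - eta*dh*xp';   b1 = b1 - eta*dh;
    end
end

Training both outputs jointly from random weights can be unstable at first; Nix and Weigend (1994)
discuss a phased schedule in which the mean is fitted before the variance, which may help here.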
References:
D. A. Nix and A. S. Weigend (1994). Estimating the mean and variance of the target probability
distribution. In: Proc. 1994 IEEE Int. Conf. on Neural Networks (ICNN'94), Orlando, FL, pp. 55-60.
N. Nikolaev and H. Iba (2006). Adaptive Learning of Polynomial Networks: Genetic Programming,
Backpropagation and Bayesian Methods. Springer, New York, pp. 256-262.
