Causal Model Based On ANN
International Journal of Computer Applications (0975 – 8887)
Volume 75– No.7, August 2013
generalizations of these patterns. ANN provides outstanding results even in the presence of noise or missing information (Ortiz-Arroyo D. et al., 2005).

3. METHODOLOGY
3.1 Feedforward Neural Networks
Feedforward neural networks allow only unidirectional signal flow. Furthermore, most feedforward neural networks are organized in layers, and this architecture is often known as the MLP (multilayer perceptron) (Wilamowski B. M., 2011). An example of a feedforward neural network is shown in figure 1. It consists of an input layer, a hidden layer, and an output layer.

[Fig.2 Single neuron with N inputs: the inputs x1, x2, …, xN, weighted by wi1, wi2, …, wiN, are summed and passed through the transfer function f to produce the output Oi.]

The computation related to this neuron is described below:

Oi = f( Σ(j=1..N) wij · xj )

Where Oi is the output of neuron i, f(·) the transfer function, wij the connection weight between node j and node i, and xj the input signal from node j.

The general process responsible for training the network is mainly composed of three steps: feed forward the input signal, back-propagate the error, and adjust the weights.
The back-propagation algorithm tries to improve the performance of the neural network by reducing the total error, which can be calculated by:

E = (1/2) · Σp Σj ( o_jp - d_jp )²

Where E is the square error, p the number of applied patterns, o_jp the obtained output and d_jp the desired output of the jth neuron when the pth pattern is applied.

[Fig.3 Causal method and multi-layer perceptron: the explanatory variables X1, X2, X3 are mapped by the MLP to the output Y.]

3.3 Time series and multilayer perceptron based model
The extrapolative or time series forecasting model is based only on values of the variable being forecast. Thus, for the multilayer perceptron used to forecast the time series, the inputs are the past observations of the data series and the output is the future value. The MLP performs the following function mapping:

ŷt = f( yt-1, yt-2, …, yt-n )

Where ŷt is the estimated output and ( yt-1, yt-2, …, yt-n ) is the training pattern, which consists of a fixed number (n) of lagged observations of the series.
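To make the two formulas above concrete, here is a minimal NumPy sketch (an illustration, not the paper's code) of the single-neuron computation Oi = f(Σ wij·xj) and of the total squared error E over a set of patterns; the tanh transfer function and all numeric values are assumptions:

```python
import numpy as np

def neuron_output(x, w, f=np.tanh):
    """O_i = f(sum_j w_ij * x_j): weighted sum passed through the transfer function f."""
    return f(np.dot(w, x))

def total_error(outputs, targets):
    """E = 1/2 * sum over patterns p and output neurons j of (o_jp - d_jp)^2."""
    return 0.5 * np.sum((outputs - targets) ** 2)

# A neuron with N = 3 inputs (illustrative values)
x = np.array([0.5, -1.0, 0.25])    # input signals x_1..x_3
w = np.array([0.8, 0.1, -0.4])     # connection weights w_i1..w_i3
o = neuron_output(x, w)            # f(0.4 - 0.1 - 0.1) = tanh(0.2)

# Total error over two applied patterns, one output neuron each
outputs = np.array([[0.2], [0.7]])
targets = np.array([[0.0], [1.0]])
E = total_error(outputs, targets)  # 0.5 * (0.04 + 0.09) = 0.065
```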
With P observations for the series, we have P-n training patterns. The first training pattern is composed of ( y1, y2, y3, y4, y5 ) as inputs and ŷ6 as the output (Table 1).
There is no suggested systematic way to determine the number of input nodes (n). After several trials, this number has been successfully fixed at five nodes for this case study.

Table 1. Inputs and estimated output for time series model

N°    Assigned data to the input layer     The estimated output value
1
2
3
4
5
6     (Y1, Y2, Y3, Y4, Y5)                 Y6
7     (Y2, Y3, Y4, Y5, Y6)                 Y7
8     (Y3, Y4, Y5, Y6, Y7)                 Y8
…     ………………….                             ……
22    (Y17, Y18, Y19, Y20, Y21)            Y22

Table 2. Data set for causal model based MLP

N°    X1      X2      X3      Y
1     4949    7409    43500   16793
2     5369    7903    20209   22455
3     6149    9289    47640   25900
4     6655    9914    34563   25591
5     8114    8193    36581   34396
6     8447    8711    52206   38676
7     10472   9453    76788   34608
8     10508   9194    42107   35271
9     11534   10149   78902   39132
10    12719   10403   73284   46594
11    13743   10806   61871   57023
12    16168   11557   85265   59720
13    16035   11092   28585   62805
14    16112   10979   26921   61905
15    18861   12117   94457   65963
16    19270   11319   17489   72869
17    21536   12702   99820   71963
18    20783   12419   40152   74915
19    21917   13265   65490   81030
20    20878   13173   37639   86686
21    21508   13211   19425   97088
22    22261   14070   45300   108739
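The construction of the P - n training patterns of Table 1 can be sketched as a simple sliding window (a hedged Python illustration; the helper `make_patterns` and the stand-in series are assumptions, not the paper's code):

```python
def make_patterns(series, n=5):
    """Build the P - n training patterns: n lagged observations as inputs,
    the next observation of the series as the desired output."""
    patterns = []
    for t in range(n, len(series)):
        inputs = tuple(series[t - n:t])   # (y_{t-n}, ..., y_{t-1})
        target = series[t]                # y_t, the value to estimate
        patterns.append((inputs, target))
    return patterns

# Stand-in for the P = 22 observations Y1..Y22 of the case study
series = list(range(1, 23))
patterns = make_patterns(series, n=5)
# First pattern: inputs (Y1..Y5), output Y6; last pattern: inputs (Y17..Y21), output Y22
```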
The neural network architecture is shown in figure 4. Other design parameters, such as the training algorithm, will be discussed and selected in the next paragraph.

[Fig.4: MLP with the lagged inputs Ym, Ym-1, …, Ym-4 and the estimated output Ŷ.]

4.1 Causal model
Neural Network Toolbox provides a complete environment to design, train, visualize, and simulate neural networks. For the causal model we use the function newff to create a feed-forward neural network. The Levenberg-Marquardt back-propagation algorithm is implemented. "trainlm" is a network training function that updates weights according to Levenberg-Marquardt optimization (Anandhi V. 2012).
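The training itself is done with MATLAB's newff/trainlm. As a language-neutral sketch of what a Levenberg-Marquardt update does, the following NumPy fragment applies the update w ← w - (JᵀJ + μI)⁻¹ Jᵀe on a toy least-squares problem (the function `lm_step`, the toy data, and the fixed μ are illustrative assumptions; trainlm additionally adapts μ between steps):

```python
import numpy as np

def lm_step(w, J, e, mu):
    """One Levenberg-Marquardt update: w <- w - (J^T J + mu*I)^(-1) J^T e,
    where J is the Jacobian of the residuals e with respect to the weights w."""
    JtJ = J.T @ J
    return w - np.linalg.solve(JtJ + mu * np.eye(len(w)), J.T @ e)

# Toy linear least-squares problem: residuals e = X @ w - y, so the Jacobian is X
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

w = np.zeros(3)
for _ in range(20):
    e = X @ w - y
    w = lm_step(w, X, e, mu=1e-3)
# w converges toward w_true
```

For a linear problem the update is nearly a Gauss-Newton step, so convergence is immediate; on a neural network the Jacobian is recomputed at every step and μ is raised or lowered depending on whether the error decreased.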
5. CONCLUSION
Demand forecasting plays a crucial role in the supply chain of today's companies. Among forecasting methods, neural network models are capable of delivering the best results when they are properly configured. Two approaches based on the multilayer perceptron have been developed to predict demand: the time series model and the causal model. The best training algorithm is the Levenberg-Marquardt back-propagation algorithm. The number of hidden layers and the number of neurons in each layer depend on the chosen method and case study. With a judicious choice of the architecture and parameters of the neural network, both approaches have yielded good results. However, the cost of collecting the parameters required by the prediction method leads us to prefer the time series model, since we obtain the same results at a lower cost.
[Fig.11 Demand forecasting with time series model based on neural network.]

6. REFERENCES
[1] Kesten C. Green, J. Scott Armstrong 2012. Demand Forecasting: Evidence-based Methods. https://marketing.wharton.upenn.edu/profile/226/printFriendly.

[2] Gosasang, V., Chan, W. and Kiattisin, S. 2011. A Comparison of Traditional and Neural Networks Forecasting Techniques for Container Throughput at
Bangkok Port. The Asian Journal of Shipping and Logistics, Vol. 27, N° 3, pp. 463-482.

[3] Armstrong, J. S. 2012. Illusions in Regression Analysis. International Journal of Forecasting, Vol. 28, pp. 689-694.

[4] Chase, Charles W., Jr. 1997. "Integrating Market Response Models in Sales Forecasting." The Journal of Business Forecasting, Spring: 2, 27.

[5] Chen, K.Y. 2011. Combining linear and nonlinear model in forecasting tourism demand. Expert Systems with Applications, Vol. 38, pp. 10368–10376.

[6] Mitrea, C. A., Lee, C. K. M., Wu, Z. 2009. A Comparison between Neural Networks and Traditional Forecasting Methods: A Case Study. International Journal of Engineering Business Management, Vol. 1, No. 2, pp. 19-24.

[7] Ortiz-Arroyo, D., Skov, M. K. and Huynh, Q. 2005. Accurate Electricity Load Forecasting With Artificial Neural Networks. Proceedings of the 2005 International Conference on Computational Intelligence for Modelling, Control and Automation, and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC'05).

[8] Wilamowski, B. M. 2011. Neural Network Architectures. Industrial Electronics Handbook, vol. 5 – Intelligent Systems, 2nd Edition, chapter 6, pp. 6-1 to 6-17, CRC Press.

[9] Zhang, G., Patuwo, B. E., Hu, M. Y. 1998. Forecasting with artificial neural networks: The state of the art. International Journal of Forecasting, Vol. 14, pp. 35–62.

[10] Norizan, M., Maizah H. A., Suhartono, Wan M. A. 2012. Forecasting Short Term Load Demand Using Multilayer Feed-forward (MLFF) Neural Network Model. Applied Mathematical Sciences, Vol. 6, No. 108, pp. 5359-5368.

[11] Anandhi, V., ManickaChezian, R., Parthiban, K. T. 2012. Forecast of Demand and Supply of Pulpwood using Artificial Neural Network. International Journal of Computer Science and Telecommunications, Vol. 3, Issue 6, June, pp. 35-38.

[12] Wilamowski, B. M. 2011. Neural Networks Learning. Industrial Electronics Handbook, vol. 5 – Intelligent Systems, 2nd Edition, chapter 11, pp. 11-1 to 11-18, CRC Press.

[13] Wilamowski, B. M., Yu, H. 2010. Improved Computation for Levenberg Marquardt Training. IEEE Trans. on Neural Networks, vol. 21, no. 6, pp. 930-937.