LOGISTIC MODEL TREE CLASSIFIER FOR CONDITION MONITORING OF WIND TURBINE BLADES
Abstract--- Wind energy is one of the essential renewable energy resources because of its consistency, owing to the development of the technology and its relative cost affordability. Wind energy is converted into electrical energy using rotating blades connected to a generator. Due to environmental conditions and their large construction, the blades are subjected to various faults, which cause a loss of productivity. Downtime can be reduced when the blades are diagnosed periodically using a condition monitoring technique. This is treated as a machine learning problem consisting of three phases, namely feature extraction, feature selection and fault classification. In this study, statistical features are extracted from vibration signals, feature selection is carried out using the J48 algorithm, and fault classification is carried out using the logistic model tree algorithm.

Keywords--- Fault Diagnosis; Condition Monitoring; Statistical Features; J48 Algorithm; Logistic Model Tree (LMT) Algorithm.

Manuscript received September 16, 2019.
A. Joshuva*, Centre for Automation and Robotics (ANRO), Department of Mechanical Engineering, Hindustan Institute of Technology and Science, Old Mahabalipuram Road, Padur, Kelambakam, Chennai, T.N., India. (e-mail: [email protected])
G. Deenadayalan, Centre for Automation and Robotics (ANRO), Department of Mechanical Engineering, Hindustan Institute of Technology and Science, Old Mahabalipuram Road, Padur, Kelambakam, Chennai, T.N., India.
S. Sivakumar, Department of Mechanical Engineering, Hindustan Institute of Technology and Science, Old Mahabalipuram Road, Padur, Kelambakam, Chennai, T.N., India.
R. Sathishkumar, Department of Automobile Engineering, Hindustan Institute of Technology and Science, Old Mahabalipuram Road, Padur, Kelambakam, Chennai, T.N., India.
R. Vishnuvardhan, Department of Mechatronics Engineering, Sri Krishna College of Engineering and Technology, Coimbatore, T.N., India.

I. INTRODUCTION

In recent years, wind energy has become one of the most demanded resources in power generation. Because of the worldwide ecological contamination caused by other resources, sustainable and natural resources like wind energy have risen in importance [1]. Wind energy is one of the efficient renewable sources and an alternative to commonly used sources. The objective of a wind turbine is to extract the maximum possible energy, and consequently the performance of the wind turbine frequently varies with wavering winds. To make wind energy more competitive with other sources of energy in terms of performance, accessibility, reliability and effectiveness, the life of the turbines must be improved [2].

Generally, the horizontal axis wind turbine (HAWT) is used for power generation since it is more effective than the vertical axis wind turbine [3]. Most HAWTs have three rotor blades, typically placed upwind of the tower and the nacelle. The nacelle is typically furnished with anemometers and a wind vane to measure the wind speed and direction. The nacelle also contains the aviation light signal and the key components of the wind turbine, such as the gearbox, mechanical brake, electrical generator, control systems, yaw drive, and so forth [4].

The main objective of condition monitoring is to keep the power yield as calculated. When the time and category of defects are known, maintenance activities can be arranged ahead of time [5]. Thus, condition monitoring of blade faults is a critical activity. Two types of approaches are used for condition monitoring: the traditional approach and the machine learning approach. The traditional approach is mainly used when the frequency components do not change with respect to time.

Rotating machines produce non-stationary signals. Since the frequency components change due to wear and tear, fault discrimination is very difficult with an automated system in the traditional approach, which is therefore not preferred. In the machine learning approach, algorithms have the capability to learn continuously and adapt themselves to varying situations. Researchers therefore often resort to the machine learning approach for fault diagnosis of mechanical systems.

Many studies have been carried out using machine learning and simulation. To name a few, Godwin and Peter Matthews [6] carried out classification and detection of wind turbine pitch faults through SCADA data analysis and the RIPPER algorithm, which yielded 87.05% classification accuracy for pitch angle faults. N. Dervilis et al. [7] carried out research on damage diagnosis for a wind turbine blade using pattern recognition techniques such as principal component analysis (PCA), nonlinear principal component analysis (NLPCA), artificial neural networks (ANN), auto-associative neural networks (AANN) and radial basis functions (RBF) on vibration signals.

Mark Mollineaux et al. [8] carried out work on damage detection methods for wind turbine blade testing with wired and wireless accelerometer sensors, using benchmark data with autoregressive moving average (ARMA) and continuous wavelet transform (CWT) as modeling techniques.
Mahmood Shafiee et al. [9] have done work on an opportunistic condition-based maintenance policy for offshore wind turbine blades subjected to stress corrosion cracking and environmental shocks, simulated using MATLAB.

H. M. Slot et al. [10] have made a review of coating life models for the leading edge erosion of coated wind turbine blades and suggested materials and a prediction of the lifetime of the blade.
Shizhong Zhang et al. [11] carried out work on the design and analysis of jet-based laboratory equipment for performance evaluation of the erosion of wind turbine blade coatings.

Xiang Li et al. [12] have carried out research on crack

[Figure: methodology flow – wind turbine with accelerometer → data acquisition (vibration signal) → feature extraction (statistical)]
corresponding vibration signals were acquired. Figure 3 shows the different blade fault conditions which were simulated on the blade.
a) Blade bend (BB): This fault occurs due to high-speed wind and the complex forces caused by the wind. The blade was given a flapwise bend at a 10° angle.
b) Blade crack (BC-2): This fault occurs due to foreign object damage on the blade while it is in operating condition. A 15 mm crack was made on the blade.
III. FEATURE EXTRACTION – STATISTICAL ANALYSIS

The vibration signals were acquired for the good condition and the other faulty conditions of the blade. If the time-domain sampled signals are given directly as inputs to a classifier, the number of samples has to be constant. The number of samples obtained is a function of the rotational speed of the blade and hence varies with speed. Moreover, the number of digitized data points in a signal is too large for classifiers to handle efficiently. Therefore, a few features must be extracted before the classification process. Descriptive statistical parameters [19] such as sum, mean, median, mode, minimum, maximum, range, skewness, kurtosis, standard error, standard deviation and sample variance were computed to serve as features in the feature extraction process.
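These twelve descriptive statistics are straightforward to compute. The sketch below is a minimal illustration only; the function name, the placeholder signal and the use of NumPy/SciPy are our assumptions and not part of the original experimental code.

```python
import numpy as np
from scipy import stats

def extract_statistical_features(signal):
    """Return the twelve descriptive statistics used as candidate features."""
    signal = np.asarray(signal, dtype=float)
    n = signal.size
    std = signal.std(ddof=1)                      # sample standard deviation
    return {
        "sum": signal.sum(),
        "mean": signal.mean(),
        "median": np.median(signal),
        "mode": float(stats.mode(signal).mode),
        "minimum": signal.min(),
        "maximum": signal.max(),
        "range": np.ptp(signal),                  # maximum - minimum
        "skewness": stats.skew(signal),
        "kurtosis": stats.kurtosis(signal),       # excess kurtosis
        "standard_error": std / np.sqrt(n),       # standard error of the mean
        "standard_deviation": std,
        "sample_variance": signal.var(ddof=1),
    }

# Example with a placeholder record standing in for one acquired vibration signal
features = extract_statistical_features(np.random.randn(5000))
print(features["sum"], features["kurtosis"])
```

Each acquired signal thus reduces to a twelve-element feature vector; the 600 signals described in Section VI form the feature matrix used for feature selection and classification.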
IV. FEATURE SELECTION - J48 ALGORITHM

The J48 decision tree algorithm is adapted from the C4.5 algorithm in WEKA [20]. It consists of a number of branches, one root, a number of nodes, and a number of leaves. A branch is a chain of nodes from the root to a leaf, and each node involves one attribute. The occurrence of an attribute in the tree provides information about the importance of the associated attribute [21]. A decision tree is a tree-based knowledge representation methodology used to represent classification rules. The J48 decision tree algorithm is widely used to construct decision trees [22]. The procedure of forming the decision tree and exploiting it for feature selection is characterized by the following:
1. The set of features available at hand forms the input to the algorithm; the output is the decision tree.
2. The decision tree has leaf nodes, which represent class labels, and other nodes associated with the classes being classified.
3. The branches of the tree represent each possible value of the feature node from which they originate.
4. The decision tree can be used to classify feature vectors by starting at the root of the tree and moving through it until a leaf node, which provides the classification of the instance, is reached.
5. At each decision node in the decision tree, one can select the most useful feature for classification using appropriate estimation criteria. The criterion used to identify the best feature invokes the concepts of entropy reduction and information gain.
Information gain measures how well a given attribute separates the training examples according to their target classification. The measure is used to select the candidate features at each step while growing the tree [23]. Information gain is the expected reduction in entropy caused by partitioning the samples according to the feature. The information gain Gain(S, A) of a feature A relative to a collection of examples S is defined as:

Gain(S, A) = Entropy(S) - Σ_{v∈Values(A)} (|S_v| / |S|) · Entropy(S_v)    (1)

where Values(A) is the set of all possible values of attribute A, and S_v is the subset of S for which feature A has value v. Note that the first term in the equation for gain is just the entropy of the original collection S, and the second term is the expected value of the entropy after S is partitioned using feature A. The expected entropy described by the second term is simply the sum of the entropies of each subset S_v, weighted by the fraction of samples |S_v|/|S| that belong to S_v. Gain(S, A) is therefore the expected reduction in entropy caused by knowing the value of feature A. Entropy is a measure of the homogeneity of the set of examples and is given by

Entropy(S) = Σ_{i=1}^{c} -P_i log_2 P_i    (2)

where c is the number of classes and P_i is the proportion of S belonging to class i.

Figure 5: J48 Tree Classification for Feature Selection

The J48 decision tree algorithm has been applied to the problem of feature selection. The input to the algorithm is the set of statistical features described above, and the output is the decision tree shown in Figure 5. The top node is the best node for classification; the other features in the nodes of the decision tree follow in descending order of significance. Note that only features that contribute to the classification appear in the decision tree; the remaining features contribute little. Features with less discriminating capability can be consciously discarded by deciding on a threshold. This concept is used for selecting good features. The algorithm identifies good features for the classification of the given training data set, and thus reduces the domain knowledge required to select good features for the pattern classification problem [24-26]. Referring to Figure 5, the most dominating features for representing the blade conditions are the sum, range, standard deviation, and kurtosis.
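To make Equations (1) and (2) concrete, the sketch below computes entropy and information gain directly and then ranks the statistical features with an entropy-based decision tree. It is an illustrative stand-in only: scikit-learn's DecisionTreeClassifier approximates, but is not identical to, WEKA's J48, and the placeholder data and names (X, y, feature_names) are ours, not the study's.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def entropy(labels):
    """Entropy(S) = sum_i -P_i * log2(P_i)  -- Equation (2)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(feature_values, labels):
    """Gain(S, A) for a discrete-valued feature A -- Equation (1)."""
    expected = 0.0
    for v in np.unique(feature_values):
        subset = labels[feature_values == v]
        expected += (subset.size / labels.size) * entropy(subset)
    return entropy(labels) - expected

# Placeholder data: 600 samples x 12 statistical features, six blade conditions.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))
y = np.repeat(np.arange(6), 100)
feature_names = ["sum", "mean", "median", "mode", "minimum", "maximum",
                 "range", "skewness", "kurtosis", "standard_error",
                 "standard_deviation", "sample_variance"]

# Rank features with an entropy-criterion tree (a stand-in for the J48 step);
# min_samples_leaf mirrors the 50-instance leaf setting reported in Section VI.
tree = DecisionTreeClassifier(criterion="entropy", min_samples_leaf=50,
                              random_state=0).fit(X, y)
ranking = sorted(zip(feature_names, tree.feature_importances_),
                 key=lambda kv: kv[1], reverse=True)
print(ranking)  # features with near-zero importance can be dropped by a threshold
```

Note that J48/C4.5 additionally uses the gain ratio and pruning; the sketch only captures the entropy-reduction idea described in the text.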
V. FEATURE CLASSIFICATION - LOGISTIC MODEL TREE (LMT) ALGORITHM

A logistic model tree essentially consists of a standard decision tree structure with logistic regression functions at the leaves [27]. As in ordinary decision trees, a test on one of the attributes is associated with each internal node. For a nominal attribute with k values, the node has k child nodes, and instances are sorted down one of the k branches depending on their value of the attribute. For numeric attributes, the node has two child nodes, and the test consists of comparing the attribute value to a threshold. In general, a logistic model tree consists of a tree structure made up of a set of internal or non-terminal nodes N and a set of leaves or terminal nodes T. Let S denote the whole instance space, spanned by all attributes present in the data [28]. The tree structure then gives a disjoint partition of S into regions S_t, each region being represented by a leaf in the tree:

S = ⋃_{t∈T} S_t,  with S_t ∩ S_{t'} = ∅ for t ≠ t'    (3)

Unlike ordinary decision trees, each leaf t ∈ T has an associated logistic regression function f_t rather than just a class label. The regression function f_t considers a subset V_t ⊆ V of all attributes present in the data and models the class membership probabilities as

Pr(G = j | X = x) = exp(F_j(x)) / Σ_{k=1}^{J} exp(F_k(x))    (4)

where F_j(x) = α_0^j + Σ_{v∈V_t} α_v^j · v, and α_v^j = 0 for v ∉ V_t. The model represented by the whole logistic model tree is then given by

f(x) = Σ_{t∈T} f_t(x) · I(x ∈ S_t)    (5)

where I(x ∈ S_t) is 1 if x ∈ S_t and 0 otherwise.
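Scikit-learn has no logistic model tree, so the following sketch only mimics the structure in Equations (3)-(5): a shallow decision tree partitions the instance space into leaf regions S_t, and a separate logistic regression f_t is fitted to the instances falling in each leaf. It is a simplified stand-in under our own assumptions (dummy data, names, tree depth), not WEKA's LMT, which grows the leaf models incrementally with LogitBoost.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 4))            # placeholder: the four selected features
y = np.repeat(np.arange(6), 100)         # placeholder: six blade conditions

# 1) Tree structure: a disjoint partition of the instance space into leaf
#    regions S_t (Equation 3); 20 mirrors the minimum-instances setting used here.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20,
                              random_state=0).fit(X, y)
leaf_of = tree.apply(X)                   # leaf index t of each training sample

# 2) One logistic regression f_t per leaf (Equation 4), fitted on the samples in S_t.
leaf_models = {}
for t in np.unique(leaf_of):
    idx = leaf_of == t
    if np.unique(y[idx]).size > 1:        # a pure leaf needs no regression model
        leaf_models[t] = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])

def predict_proba(x):
    """Whole-tree model (Equation 5): route x to its leaf, apply that leaf's f_t."""
    x = np.asarray(x).reshape(1, -1)
    t = tree.apply(x)[0]
    if t in leaf_models:
        model = leaf_models[t]
        proba = np.zeros(6)
        proba[model.classes_] = model.predict_proba(x)[0]
        return proba
    return tree.predict_proba(x)[0]       # pure leaf: use its class frequencies

print(predict_proba(X[0]))
```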
VI. RESULTS AND DISCUSSION

The vibration signals were acquired for the good condition and the faulty blade conditions using the DAQ (a total of 600 samples were collected; 100 samples for each condition). From the vibration signals, twelve descriptive statistical features were extracted. Out of these, the four best contributing features were selected using the J48 decision tree algorithm: the sum, range, standard deviation, and kurtosis. From Figure 5, the feature 'sum' is the most contributing feature compared to the other features; the other contributing features are the range, standard deviation, and kurtosis.

The minimum number of instances per leaf and the amount of data used for reduced-error pruning were kept at 50 for selecting the 4 dominating features in the J48 decision tree algorithm. The rest of the features, such as mean, median, mode, minimum, maximum, skewness, sample variance and standard error, were eliminated as they contribute very little to the fault classification. With the selected features, the fault classification process was carried out using the logistic model tree. The classification accuracy of the LMT algorithm using the WEKA software package was found to be 90.33%, with the minimum number of instances set to 20 and the number of LogitBoost iterations set to -2. The confusion matrix of the logistic model tree algorithm is shown in Table 2. In the confusion matrix, the diagonal elements represent the correctly classified instances and the off-diagonal elements are misclassified instances [29]. One can also observe more misclassifications between the good and loose conditions. For the loose condition, the bolts between the hub and the blade were made loose (note that the blade itself was in good condition). However, at high wind speed, the blade can stick to the hub and behave like a good blade during operation. Because of this, the signature of the loose condition sometimes resembles the good condition and the classifier finds it difficult to distinguish between them; hence, more misclassifications.

From the logistic model tree (LMT), the kappa statistic was found to be 0.884; it measures the agreement of the predicted likelihood with the true class. The mean absolute error was found to be 0.0496; it measures how close the predictions are to the eventual outcomes [30]. The root mean square error was found to be 0.1621; it is a quadratic scoring rule which computes the average magnitude of the error. The detailed class-wise accuracy is shown in Table 3. The class-wise accuracy is expressed in terms of the true positive rate (TP), false positive rate (FP), precision, recall, F-measure, and receiver operating characteristic (ROC) area [31-34].

The TP rate is the proportion of positives which are correctly classified as faults. The FP rate is commonly described as a false alarm, in which the result indicates that a given fault condition is present when it really is not. For a good classifier, the true positive (TP) rate should be close to 1 and the false positive (FP) rate should be close to 0 [35-41]. One can observe from Table 3 that the TP rates of most of the classes are close to 1 and the FP rates are close to 0. This reinforces the result presented by the confusion matrix in Table 2.

Table 2: Confusion Matrix of LMT

Blade condition | Good | Bend | Crack | Loose | Pitch twist | Erosion
Good            |  86  |   0  |   0   |  14   |      0      |    0
Bend            |   0  |  92  |   4   |   0   |      0      |    4
Crack           |   0  |   4  |  92   |   3   |      0      |    1
Loose           |  13  |   0  |   4   |  82   |      1      |    0
Pitch twist     |   0  |   0  |   0   |   0   |     95      |    5
Erosion         |   0  |   3  |   0   |   0   |      5      |   92

Table 3: Class Wise Accuracy of LMT

Class       | TP Rate | FP Rate | Precision | Recall | F-Measure | ROC Area
Good        |  0.86   |  0.024  |   0.878   |  0.86  |   0.869   |  0.974
Bend        |  0.93   |  0.014  |   0.93    |  0.93  |   0.93    |  0.994
Crack       |  0.91   |  0.01   |   0.948   |  0.91  |   0.929   |  0.987
Loose       |  0.85   |  0.04   |   0.81    |  0.85  |   0.829   |  0.971
Pitch twist |  0.94   |  0.01   |   0.949   |  0.94  |   0.945   |  0.988
Erosion     |  0.93   |  0.018  |   0.912   |  0.93  |   0.921   |  0.984

Out of 600 samples, 542 samples are correctly classified (90.33%) and the remaining 58 are misclassified (9.67%). The time taken to build the model is about 0.23 seconds. Owing to this low computational time, the model can be used in real time for the condition monitoring of the wind turbine. The logistic model tree algorithm is thus a good classifier for multiclass fault prediction on wind turbine blades.
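The class-wise figures in Table 3 and the kappa statistic follow directly from the confusion matrix in Table 2; the sketch below recomputes them (small differences from the published values can arise from rounding in the original tables). The variable names are ours.

```python
import numpy as np

labels = ["Good", "Bend", "Crack", "Loose", "Pitch twist", "Erosion"]
# Rows = true condition, columns = predicted condition (Table 2).
cm = np.array([[86,  0,  0, 14,  0,  0],
               [ 0, 92,  4,  0,  0,  4],
               [ 0,  4, 92,  3,  0,  1],
               [13,  0,  4, 82,  1,  0],
               [ 0,  0,  0,  0, 95,  5],
               [ 0,  3,  0,  0,  5, 92]])

total = cm.sum()
tp = np.diag(cm)                          # correctly classified per class
fp = cm.sum(axis=0) - tp                  # predicted as the class, but wrong
fn = cm.sum(axis=1) - tp                  # instances of the class that were missed
tn = total - tp - fp - fn

tp_rate = tp / (tp + fn)                  # equals recall
fp_rate = fp / (fp + tn)
precision = tp / (tp + fp)
f_measure = 2 * precision * tp_rate / (precision + tp_rate)

accuracy = tp.sum() / total               # overall classification accuracy
p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
kappa = (accuracy - p_e) / (1 - p_e)      # Cohen's kappa statistic

for i, name in enumerate(labels):
    print(f"{name:12s} TP={tp_rate[i]:.3f} FP={fp_rate[i]:.3f} "
          f"P={precision[i]:.3f} F={f_measure[i]:.3f}")
print(f"accuracy={accuracy:.4f}  kappa={kappa:.3f}")
```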