Zhang 2014
Richong Zhang, Chune Li, Hailong Sun, Yanghao Wang, Jinpeng Huai
School of Computer Science and Engineering
Beihang University
Beijing, China
Email: {zhangrc, lichune, sunhl, wangyh, huaijp}@act.buaa.edu.cn
Abstract—This paper studies the quality of web service prediction problem. We formalize the QoS prediction problem by incorporating multiple contextual characteristics via collective matrix factorization, which simultaneously factors the user-service quality matrix and the contextual information matrices. Using the service category and location contexts, we develop three context-aware QoS prediction models and their learning algorithms to demonstrate the advantages of this modeling technique. The advantages of our proposed models are demonstrated via experiments on real-life data sets.

Keywords-quality of web services; matrix factorization; QoS prediction.

I. INTRODUCTION

Web services and RESTful APIs are more and more prevalent in web development. By taking advantage of the increasing availability of these programs, web developers enjoy a simpler development experience than ever before. Still, the quality of web services and APIs varies significantly. In addition, with the explosive growth in the number of publicly available service components, potential users have to spend an immense amount of time searching for the services that best help them build their web applications.

Some web service and API aggregation sites that list these components, such as Seekda^1 and ProgrammableWeb^2, now allow users to write feedback, annotate tags, or rate the services, and they make this information available to web developers. In addition, Zheng et al. examine the performance of web services and make the measured results available to all web developers^3. Potential service users may make use of this collected information to estimate the performance of services and to decide whether to adopt them in their applications. Nevertheless, the accumulation of such feedback and performance data takes time before a genuinely high-quality service or API can be discovered. To address these problems, our goal is to develop models that can effectively discover the services with the highest quality and thereby assist web developers' programming process.

^1 webservices.seekda.com
^2 www.programmableweb.com/
^3 www.wsdream.net

Previous studies of quality of web service prediction primarily consider the user-service invocation quality matrix, where each entry of the matrix is the quality achieved by a user when calling a web service. In practice, however, other associated contextual properties of users and services affect the quality of the web service, such as the category of functionality that a service provides and the location where the service is hosted. This paper studies the possibility of incorporating this side information when designing quality of web service prediction systems in order to improve the prediction performance.

The contextual characteristics that may affect the quality of web services make it desirable that the quality of service (QoS) prediction model be capable of characterizing all of these features. In this paper, we exploit collective matrix factorization [1], which considers these contextual features simultaneously. Specifically, we formalize context-aware prediction models for the quality of web services and design learning algorithms for these models. The models take combinations of multiple contextual characteristics of web services and their users, and yield an effective QoS predictor.

Furthermore, we develop QoS predictors based on stochastic gradient descent algorithms. Experiments using data sets from wsdream.com and comparisons with existing quality of web service prediction algorithms confirm the superiority of our proposed models.

The remainder of this paper is organized as follows. Section II introduces related work. Section III discusses the contextual characteristics of web service invocation. Section IV presents a formulation of quality of web service prediction and provides algorithms derived from a collective matrix factorization model. Section V presents an experimental evaluation of our approach. The paper ends with discussion and brief conclusions in Section VI.

II. RELATED WORK

A. Web Service Recommendation and QoS Prediction

The general goal of web service recommendation and QoS prediction is to predict the missing values in the user-service invocation quality matrix. Collaborative filtering is one of the most commonly used recommendation approaches and has been successfully exploited by many service recommender systems and QoS prediction methods. For example, in [2] and [3], the authors proposed methods of determining user similarity.
C. Matrix Factorization
IV. QUALITY OF WEB SERVICE PREDICTION MODEL

In this section, we propose a formulation of the quality of web service prediction problem and the QoS prediction models.

A. Problem Statement

We denote the set of all web services by S and the set of all users by U. We denote by X_{u,s} the quality of web service s experienced by user u, and by \hat{X}_{u,s} the estimate of this value. The quality X_{u,s} is observed only for a subset of the pairs (u, s) ∈ U × S. The objective of the QoS prediction problem is to estimate the missing values of the user-service invocation quality matrix X.
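To make the problem statement concrete, the following minimal sketch (Python/NumPy; the toy values are illustrative and not taken from the WSDream data used later) represents a small user-service quality matrix X with unobserved invocations marked as NaN, together with the observation mask that the models below are trained on.

```python
import numpy as np

# Toy user-service quality matrix X: rows are users u, columns are services s.
# np.nan marks pairs (u, s) whose QoS value has not been observed.
X = np.array([
    [0.8, np.nan, 1.2, np.nan],
    [np.nan, 2.1, np.nan, 0.9],
    [1.1, np.nan, np.nan, 1.5],
])

observed = ~np.isnan(X)            # mask of observed (u, s) pairs
missing = np.argwhere(~observed)   # the entries the predictor must estimate
print("observed:", np.argwhere(observed).tolist())
print("to predict:", missing.tolist())
```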
Figure 2. Average throughput. The two-dimensional matrix is a visual representation of a corresponding numerical matrix whose entries show the average throughput of invocations from and to different countries. As in the previous figure, higher values are indicated with darker cells.

Figure 3. Average response time of different categories of services.

Figure 4. Average throughput of different categories of services.

B. Matrix Factorization Model

The matrix factorization model has been used to solve this quality of web service prediction problem. The basis of matrix factorization is to assume a latent low-dimensional space R^D in which a user feature vector p_u is defined for each user u and a service feature vector q_s is defined for each service s. That is, p_u and q_s both belong to R^D, and the estimated quality \hat{X}_{u,s} is defined by the inner product of these two vectors, namely,

\hat{X}_{u,s} = p_u q_s^T.   (1)

Representing the collection of q_s's as a |S| × D matrix Q and the collection of p_u's as a |U| × D matrix P, the estimation problem of interest then reduces to solving the following minimization problem: find (Q, P) that minimizes

\|X - PQ^T\|^2 + \rho\|Q\|^2 + \rho\|P\|^2   (2)

for some given positive value of ρ. The notation \|\cdot\| denotes the matrix Frobenius norm.

An extension of matrix factorization is regularized SVD with bias [20], which formulates the estimate \hat{X}_{u,s} as

\hat{X}_{u,s} = \mu + b_s + b_u + p_u q_s^T   (3)

and minimizes

\|X - PQ^T\|^2 + \rho(\|Q\|^2 + \|P\|^2 + \|B_U\|^2 + \|B_S\|^2).   (4)

The optimization problems stated in (2) and (4) can both be solved using gradient descent or stochastic gradient descent algorithms.
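To illustrate (1)-(4), the sketch below (a minimal NumPy example; the shapes, initialization scale, and regularization weight are our assumptions) computes the plain and biased predictions and evaluates the regularized objective of equation (2), restricting the squared error to observed entries as the stochastic gradient algorithms in Section IV-D do.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_services, D = 3, 4, 2                 # |U|, |S|, latent dimension D
P = rng.normal(scale=0.1, size=(n_users, D))     # user features p_u (rows of P)
Q = rng.normal(scale=0.1, size=(n_services, D))  # service features q_s (rows of Q)
mu, b_u, b_s = 1.0, np.zeros(n_users), np.zeros(n_services)  # bias terms of (3)

def predict(u, s):
    """Equation (1): X_hat[u, s] = p_u q_s^T."""
    return P[u] @ Q[s]

def predict_biased(u, s):
    """Equation (3): X_hat[u, s] = mu + b_s + b_u + p_u q_s^T."""
    return mu + b_s[s] + b_u[u] + P[u] @ Q[s]

def mf_objective(X, observed, rho=0.01):
    """Equation (2), with the fit term summed over observed entries only."""
    fit = sum((X[u, s] - predict(u, s)) ** 2 for u, s in np.argwhere(observed))
    return fit + rho * (np.linalg.norm(Q) ** 2 + np.linalg.norm(P) ** 2)
```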
C. Collective Matrix Factorization

It has been shown in the previous subsection that the traditional matrix factorization model is able to solve the quality of web service prediction problem. However, other explicit factors and contextual characteristics are not considered. In practice, the performance of the learning algorithm can be improved by incorporating more contextual information.
In this part, we introduce collective matrix factorization (CMF) [1] to enhance the matrix factorization models that predict the quality of web services based merely on the user-service quality pairs.

1) CMF with Service Category Information: We denote the service category data by Y ∈ R^{|S|×|C|}, where C is the set of categories; each element Y_{s,c} of this service-category matrix is a binary value denoting whether service s belongs to category c or not. In the previous section of this paper, we defined the user-service matrix X, the user features P, and the service features Q. By utilizing the service features Q as a shared factor for factorizing both the user-service matrix and the service-category matrix, the two matrices X and Y can be decomposed at the same time.

As in the traditional matrix factorization discussed above, we assume a latent low-dimensional space R^D in which a category feature vector w_c is defined for each service category c. That is, q_s and w_c both belong to R^D, and the estimated service-category entry \hat{Y}_{s,c} is defined by the inner product of these two vectors, namely,

\hat{Y}_{s,c} = q_s w_c^T.   (5)

Denoting the collection of w_c's as a |C| × D matrix W, the loss function of this collective learning problem is defined as

L_T(P, Q, W) = \alpha\|X - PQ^T\|^2 + \beta\|Y - QW^T\|^2 + \rho(\|P\|^2 + \|W\|^2 + \|Q\|^2),   (6)

where α, β ∈ [0, 1] weight the relative importance of service quality and categories, and α + β = 1. This model is referred to as "CMF-T" in the rest of this paper.
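As a compact reference for equation (6), the sketch below evaluates the CMF-T objective with NumPy. Treating X as fully observed in the Frobenius norm follows the printed objective; the training algorithm in Section IV-D only visits observed entries. The default weights α = 0.6, β = 0.4 mirror the response-time setting reported in Section V, and the value of ρ is an assumption.

```python
import numpy as np

def cmf_t_loss(X, Y, P, Q, W, alpha=0.6, beta=0.4, rho=0.01):
    """Equation (6): alpha*||X - P Q^T||^2 + beta*||Y - Q W^T||^2
    + rho*(||P||^2 + ||W||^2 + ||Q||^2).

    X: |U| x |S| quality matrix, Y: |S| x |C| binary category matrix,
    P: |U| x D, Q: |S| x D (shared between both fit terms), W: |C| x D.
    """
    fit_x = np.linalg.norm(X - P @ Q.T) ** 2   # user-service reconstruction
    fit_y = np.linalg.norm(Y - Q @ W.T) ** 2   # service-category reconstruction
    reg = rho * sum(np.linalg.norm(M) ** 2 for M in (P, W, Q))
    return alpha * fit_x + beta * fit_y + reg
```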
2) CMF with User and Service Location Information: Introducing the service category can help the model learn more precise latent relations between users and services. The location context of web service invocations can be incorporated into the above model in the same natural way, so that this contextual feature can also be learned through the collective model.

We denote by LU ∈ R^{|U|×|L|} the user-location matrix describing where each service user is located, where U is the set of service invokers and L is the set of all location contexts of service users. The vector v_l ∈ R^D denotes the latent feature of location l, and the estimated user-location entry \hat{LU}_{u,l} is defined by the inner product of the user feature vector p_u and the location feature vector v_l:

\hat{LU}_{u,l} = p_u v_l^T.   (7)

Similarly, we consider the location information of service providers. We denote by LS ∈ R^{|S|×|E|} the service-location matrix describing where the services are hosted, where S is the set of services and E is the set of all location contexts of service providers. The vector o_e ∈ R^D denotes the latent feature of provider location e, and the estimated service-location entry \hat{LS}_{s,e} is defined by the inner product of the service feature vector q_s and the location feature vector o_e:

\hat{LS}_{s,e} = q_s o_e^T.   (8)

Denoting the collection of v_l's as a |L| × D matrix V and the collection of o_e's as a |E| × D matrix O, the objective function of this model can be defined as

L_L(P, Q, V, O) = \alpha\|X - PQ^T\|^2 + \gamma\|LU - PV^T\|^2 + \delta\|LS - QO^T\|^2 + \rho(\|P\|^2 + \|Q\|^2 + \|V\|^2 + \|O\|^2),   (9)

where α + γ + δ = 1. This model is referred to as "CMF-L" in the remainder of this paper.

3) CMF with Category and Location Information: In this model, we incorporate all the information mentioned above, that is, we factorize the user-service matrix, the user-location matrix, the service-category matrix, and the service-location matrix at the same time. The objective function can be defined as

L_{TL}(P, Q, W, V, O) = \alpha\|X - PQ^T\|^2 + \beta\|Y - QW^T\|^2 + \gamma\|LU - PV^T\|^2 + \delta\|LS - QO^T\|^2 + \rho(\|P\|^2 + \|Q\|^2 + \|W\|^2 + \|V\|^2 + \|O\|^2),   (10)

where α + β + γ + δ = 1. This model is referred to as "CMF-TL" in this paper.
At this point, we have not only arrived at a sensible and well-defined notion of the quality of web services, but we have also translated the problem of QoS prediction into an optimization problem. This optimization can be solved by the stochastic gradient descent algorithm [21].

D. Algorithms

Overall, we take a stochastic gradient-based approach to minimize the objective functions. For each latent vector b in an objective function L, the update rule is

b = b - \lambda \frac{\partial L}{\partial b},   (11)

where λ is the step size.

For the first collective matrix factorization model, CMF-T, which simultaneously factors the matrices X and Y, the parameters to be updated are the user features P, the service features Q, and the category features W. The partial derivatives of the objective function L_T with respect to these parameters are as follows:

\frac{\partial L_T}{\partial p_u} = 2\alpha(X_{u:} - p_u Q^T)(-Q) + 2\rho p_u   (12)

\frac{\partial L_T}{\partial w_c} = 2\beta(Y_{:c} - Q w_c^T)^T(-Q) + 2\rho w_c   (13)

\frac{\partial L_T}{\partial q_s} = 2\alpha(X_{:s} - P q_s^T)^T(-P) + 2\beta(Y_{s:} - q_s W^T)(-W) + 2\rho q_s   (14)

Here X_{u:} denotes the row vector of X corresponding to user u, and Y_{:c} denotes the column vector of Y corresponding to category c. Let P_{u:} denote the row vector p_u and let Q_{s:} denote the row vector q_s; both are length-D vectors. Algorithm 1 shows the stochastic gradient algorithm used to estimate the parameters.

1: initialization: P = rand(), Q = rand(), W = rand()
2: repeat
3:   for each (u, s) for which X_{us} is observed do
4:     P_{u:} ← P_{u:} + λ[α(X_{us} − P_{u:} Q_{s:}^T) Q_{s:} − ρ P_{u:}]
5:     Q_{s:} ← Q_{s:} + λ[α(X_{us} − P_{u:} Q_{s:}^T) P_{u:} − ρ Q_{s:}]
6:   end for
7:   for each (s, c) for which Y_{sc} is nonzero do
8:     Q_{s:} ← Q_{s:} + λ[β(Y_{sc} − Q_{s:} W_{c:}^T) W_{c:} − ρ Q_{s:}]
9:     W_{c:} ← W_{c:} + λ[β(Y_{sc} − Q_{s:} W_{c:}^T) Q_{s:} − ρ W_{c:}]
10:  end for
11:  record RMSE(P, Q, testX)
12:  if λ > minStep then
13:    λ ← 0.99λ
14:  end if
15: until maxIteration is reached or the convergence criteria are met

Algorithm 1: CMF-T: simultaneously factorizing matrices X and Y.
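Algorithm 1 translates almost line for line into NumPy. The sketch below is a minimal, illustrative implementation; the default hyperparameters, the dense boolean mask, the use of pre-update values within each paired update, and the omission of the per-iteration RMSE bookkeeping (line 11) are our simplifications.

```python
import numpy as np

def train_cmf_t(X, observed, Y, D=50, alpha=0.6, beta=0.4,
                rho=0.01, lam=0.01, min_step=1e-4, max_iter=100, seed=0):
    """Stochastic gradient descent for CMF-T (Algorithm 1).

    X:        |U| x |S| quality matrix (unobserved entries are ignored)
    observed: boolean mask of observed (u, s) pairs
    Y:        |S| x |C| binary service-category matrix
    """
    rng = np.random.default_rng(seed)
    P = rng.random((X.shape[0], D))            # line 1: random initialization
    Q = rng.random((X.shape[1], D))
    W = rng.random((Y.shape[1], D))
    obs_pairs = np.argwhere(observed)
    cat_pairs = np.argwhere(Y != 0)

    for _ in range(max_iter):                  # line 2: repeat
        for u, s in obs_pairs:                 # lines 3-6: user-service term
            err = X[u, s] - P[u] @ Q[s]
            p_u, q_s = P[u].copy(), Q[s].copy()
            P[u] = p_u + lam * (alpha * err * q_s - rho * p_u)
            Q[s] = q_s + lam * (alpha * err * p_u - rho * q_s)
        for s, c in cat_pairs:                 # lines 7-10: service-category term
            err = Y[s, c] - Q[s] @ W[c]
            q_s, w_c = Q[s].copy(), W[c].copy()
            Q[s] = q_s + lam * (beta * err * w_c - rho * q_s)
            W[c] = w_c + lam * (beta * err * q_s - rho * w_c)
        if lam > min_step:                     # lines 12-14: step-size decay
            lam *= 0.99
    return P, Q, W
```

A CMF-L trainer follows the same pattern, replacing the category loop with the two location loops of Algorithm 2 below, and a CMF-TL trainer simply runs all three inner loops in each iteration.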
The second collective matrix factorization model, CMF-L, simultaneously factorizes the matrices X, LU, and LS. We compute the partial derivatives of the objective function L_L with respect to the user factors P, the service factors Q, the user location factors V, and the service location factors O, and then update the parameters according to Eq. (11):

\frac{\partial L_L}{\partial p_u} = 2\alpha(X_{u:} - p_u Q^T)(-Q) + 2\gamma(LU_{u:} - p_u V^T)(-V) + 2\rho p_u   (15)

\frac{\partial L_L}{\partial q_s} = 2\alpha(X_{:s} - P q_s^T)^T(-P) + 2\delta(LS_{s:} - q_s O^T)(-O) + 2\rho q_s   (16)

\frac{\partial L_L}{\partial v_l} = 2\gamma(LU_{:l} - P v_l^T)^T(-P) + 2\rho v_l   (17)

\frac{\partial L_L}{\partial o_e} = 2\delta(LS_{:e} - Q o_e^T)^T(-Q) + 2\rho o_e   (18)

The corresponding pseudocode is shown in Algorithm 2.

1: initialization: P = rand(), Q = rand(), V = rand(), O = rand()
2: repeat
3:   for each (u, s) for which X_{us} is observed do
4:     P_{u:} ← P_{u:} + λ[α(X_{us} − P_{u:} Q_{s:}^T) Q_{s:} − ρ P_{u:}]
5:     Q_{s:} ← Q_{s:} + λ[α(X_{us} − P_{u:} Q_{s:}^T) P_{u:} − ρ Q_{s:}]
6:   end for
7:   for each (u, l) for which LU_{u,l} is nonzero do
8:     P_{u:} ← P_{u:} + λ[γ(LU_{ul} − P_{u:} V_{l:}^T) V_{l:} − ρ P_{u:}]
9:     V_{l:} ← V_{l:} + λ[γ(LU_{ul} − P_{u:} V_{l:}^T) P_{u:} − ρ V_{l:}]
10:  end for
11:  for each (s, e) for which LS_{s,e} is nonzero do
12:    Q_{s:} ← Q_{s:} + λ[δ(LS_{se} − Q_{s:} O_{e:}^T) O_{e:} − ρ Q_{s:}]
13:    O_{e:} ← O_{e:} + λ[δ(LS_{se} − Q_{s:} O_{e:}^T) Q_{s:} − ρ O_{e:}]
14:  end for
15:  record RMSE(P, Q, testX)
16:  if λ > minStep then
17:    λ ← 0.99λ
18:  end if
19: until maxIteration is reached or the convergence criteria are met

Algorithm 2: CMF-L: simultaneously factorizing matrices X, LU, and LS.
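Only lines 7-14 of Algorithm 2 differ from Algorithm 1. The sketch below shows just those location updates as a helper that could be called inside the training loop of the `train_cmf_t` sketch above; the function name, the in-place update convention, and the default weights (taken from the CMF-L response-time setting in Section V) are our own choices.

```python
import numpy as np

def location_updates(LU, LS, P, Q, V, O, gamma=0.2, delta=0.2,
                     rho=0.01, lam=0.01):
    """Lines 7-14 of Algorithm 2: one pass over the nonzero entries of the
    user-location matrix LU and the service-location matrix LS (in place)."""
    for u, l in np.argwhere(LU != 0):          # lines 7-10
        err = LU[u, l] - P[u] @ V[l]
        p_u, v_l = P[u].copy(), V[l].copy()
        P[u] = p_u + lam * (gamma * err * v_l - rho * p_u)
        V[l] = v_l + lam * (gamma * err * p_u - rho * v_l)
    for s, e in np.argwhere(LS != 0):          # lines 11-14
        err = LS[s, e] - Q[s] @ O[e]
        q_s, o_e = Q[s].copy(), O[e].copy()
        Q[s] = q_s + lam * (delta * err * o_e - rho * q_s)
        O[e] = o_e + lam * (delta * err * q_s - rho * o_e)
```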
For the third collective matrix factorization model proposed in the previous section, CMF-TL, which simultaneously factors the matrices X, Y, LU, and LS, the gradient descent algorithm is similar to the two algorithms above, since the model is a combination of CMF-T and CMF-L; we therefore do not include its update rules here, and its pseudocode is likewise omitted due to the length limit. For all of these algorithms, we exploit a dynamic step size, updating it after each iteration to make the minimization efficient. We note that three contextual characteristics are considered in this paper, and these models can easily be extended to incorporate other contextual information.

V. EXPERIMENTAL RESULTS

In this section, we introduce the evaluation metric and present the prediction results of our experiments.

A. Dataset and Evaluation Metric

We downloaded the user-service invocation records (WSDream-QoSDataset2^4) as the data set for our experimental study. We randomly chose 2,502 services and classified these services into 8 categories by analyzing their WSDL files. Table I lists the number of services in each category.

^4 http://www.wsdream.net/dataset.html

Table I

Name            # of services   Percentage (%)
E-commerce        755           30.18
Media             124            4.96
Schedule          146            5.84
Financial          90            3.60
Geographical       74            2.96
Government         24            0.96
Network          1125           44.96
Communication     164            6.55

There are 63 service locations (service provider countries) and 31 service invoker locations (user countries) in the data set.
Table II lists the countries that host more than 10 services and the number of services hosted in those countries. From these two observations, we can see the necessity of introducing extra contextual characteristics, since the quality may be affected by the service category and by the locations of both users and providers.

The user-service matrix, the user-location matrix, the service-category matrix, and the service-location matrix are generated from this data set. To simulate sparseness in the user-service matrix, we randomly remove some QoS data from the training matrix and the testing matrix, producing sparse matrices with data densities of 10%, 30%, and 50%.
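A minimal sketch of this sparsification step (our own; the paper specifies only random removal): starting from the boolean observation mask, randomly keep a target fraction of the entries.

```python
import numpy as np

def sparsify(observed, density=0.1, seed=0):
    """Randomly keep roughly `density` of the observed entries, drop the rest."""
    rng = np.random.default_rng(seed)
    return observed & (rng.random(observed.shape) < density)

# Example: training masks at the three densities used in the experiments.
# masks = {d: sparsify(observed, d) for d in (0.1, 0.3, 0.5)}
```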
B. Evaluation Measures

To evaluate the performance of our algorithms, we use the Root Mean Square Error (RMSE) to compare them with the basic matrix factorization model and the biased matrix factorization model. RMSE is a statistical accuracy metric that is widely used to measure prediction quality in collaborative filtering methods. The definition of RMSE is given by the following equation:

\mathrm{RMSE} = \sqrt{\frac{\sum_{X_{u,s} \in T} (\hat{X}_{u,s} - X_{u,s})^2}{|T|}},   (19)

where X_{u,s} is the observed QoS of service s invoked by user u, \hat{X}_{u,s} is the corresponding predicted QoS value, and T is the testing set.
T is the testing set. matrix; MFB, which extends MF by adding bias (as shown in
C. Results and Analysis equation 4). The three models, CMF-T, CMF-L, and CMF-
1) Impact of Parameters: In this part we change the TL, proposed in this study are also compared.
number of latent dimensions D. Figure 5 shows the per- For each model, we exam the performance on various
formance of CMF-T when predicting the response time parameter settings and choose the one that performs the best.
by employing 10% density and changing D. The figure For example, in experiments for predicting response time,
shows the that RMSE achieve the best when D is set as the parameter settings are:
50. The similar trend is shown in Figure 6 that illustrates • for CMF-T model, we choose α = 0.6, β = 0.4;
the impact of the number of latent features D for CMF- • for CMF-L model, we choose α = 0.6, γ = 0.2, δ =
T when predicting the throughput of services. It can be 0.2;
seen that the best performance achieved when choosing D • for CMF-TL model, we choose α = 0.5, β = 0.3, γ =
as 200. A reasonable choice of number of dimensions can 0.1, δ = 0.1.
be obtained in the experiments when using our method in Table III and Table IV show the comparison results of
different environment. different approaches to the prediction of throughput and
Table II
STATISTICS ON SERVICES IN SOME COUNTRIES.

Table III
RMSE PERFORMANCE COMPARISON ON THROUGHPUT. THE UNIT IS KBPS.

          10%      30%      50%
MF        86.47    86.13    73.50
MFB       86.39    87.01    73.30
CMF-T     76.85    64.63    62.03
CMF-L     75.51    64.80    61.75
CMF-TL    75.83    64.48    62.08

Table IV
RMSE PERFORMANCE COMPARISON ON RESPONSE TIME. THE UNIT IS SECONDS.

          10%      30%      50%
MF        1.294    1.128    1.094
MFB       1.255    1.113    1.078
CMF-T     1.258    1.097    1.074
CMF-L     1.248    1.091    1.070
CMF-TL    1.251    1.089    1.071

From the experimental results we can see that our three proposed models achieve better performance in terms of RMSE in most situations. This indicates that by incorporating the category information and the location information, the performance of QoS prediction can be improved. Especially for the throughput quality, CMF-T, CMF-L, and CMF-TL all improve on the performance of MF and MFB by more than 10% in terms of RMSE. It can be seen from Table III and Table IV that all of the compared models gain better performance when the sparseness of the training set is reduced. It can also be observed that incorporating either the service category or the location information consistently outperforms the traditional matrix factorization and the biased matrix factorization approaches. However, combining both of these contextual sources does not always achieve better performance than incorporating only one context. This may be because the parameter space is much larger for CMF-TL and the best-fit parameters were not discovered.

In summary, we have shown the effectiveness of our models and indicated that CMF-T, CMF-L, and CMF-TL outperform existing QoS prediction approaches by a good margin. We will investigate the problem of selecting parameters from this large space in our future research and provide a more efficient parameter selection strategy.

VI. CONCLUSION AND FUTURE WORK

In this paper we propose context-aware prediction models for the quality of web services. One of our main contributions is to incorporate contextual features of service users and service providers to make the prediction of QoS values more accurate. Specifically, we exploit the collective matrix factorization model to design context-aware predictors that improve on existing QoS prediction approaches based merely on traditional matrix factorization or its variants. The experimental results confirm an increase in prediction accuracy in terms of RMSE.

In this study, to show the advantage of the collective matrix factorization model, we consider the service location, the user location, and the service category as the contextual characteristics. In fact, there are still many other aspects of the QoS properties that can be collected and considered in the prediction model in the future. In addition, as discussed in the experiment section, strategies for selecting parameters from a large space should also be investigated in our future work.

VII. ACKNOWLEDGEMENT

This work was supported partly by the National Natural Science Foundation of China (No. 61300070, No. 61103031), partly by the China 863 program (No. 2013AA01A213, No. 2012AA011203) and the China 973 program (No. 2014CB340305), partly by the State Key Lab for Software Development Environment (SKLSDE-2013ZX-16), partly by the Foundation for the Author of National Excellent Doctoral Dissertation of PR China (No. 201159), and partly by the Program for New Century Excellent Talents in University.

REFERENCES

[1] A. P. Singh and G. J. Gordon, "Relational learning via collective matrix factorization," in Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ser. KDD '08. New York, NY, USA: ACM, 2008, pp. 650–658. [Online]. Available: http://doi.acm.org/10.1145/1401890.1401969
[2] L. Shao, J. Zhang, Y. Wei, J. Zhao, B. Xie, and H. Mei, "Personalized QoS prediction for web services via collaborative filtering," in Web Services, 2007. ICWS 2007. IEEE International Conference on. IEEE, 2007, pp. 439–446.

[3] Q. Zhang, C. Ding, and C. Chi, "Collaborative filtering based service ranking using invocation histories," in Web Services (ICWS), 2011 IEEE International Conference on. IEEE, 2011, pp. 195–202.

[4] Z. Zheng, H. Ma, M. R. Lyu, and I. King, "WSRec: A collaborative filtering based web service recommender system," in Web Services, 2009. ICWS 2009. IEEE International Conference on. IEEE, 2009, pp. 437–444.

[5] Z. Zheng, Y. Zhang, and M. R. Lyu, "Distributed QoS evaluation for real-world web services," in Web Services (ICWS), 2010 IEEE International Conference on. IEEE, 2010, pp. 83–90.

[6] Y. Zhang, Z. Zheng, and M. R. Lyu, "Exploring latent features for memory-based QoS prediction in cloud computing," in Reliable Distributed Systems (SRDS), 2011 30th IEEE Symposium on. IEEE, 2011, pp. 1–10.

[7] J. Ge, Z. Chen, J. Peng, T. Li, and L. Zhang, "Web service recommendation based on QoS prediction method," in Cognitive Informatics (ICCI), 2010 9th IEEE International Conference on. IEEE, 2010, pp. 109–112.

[8] M. Zhang, X. Liu, R. Zhang, and H. Sun, "A web service recommendation approach based on QoS prediction using fuzzy clustering," in Services Computing (SCC), 2012 IEEE Ninth International Conference on. IEEE, 2012, pp. 138–145.

[9] A. Schmidt, M. Beigl, and H.-W. Gellersen, "There is more to context than location," Computers & Graphics, vol. 23, no. 6, pp. 893–901, 1999.

[14] A. Karatzoglou, L. Baltrunas, K. Church, and M. Böhmer, "Climbing the app wall: enabling mobile app discovery through context-aware recommendations," in Proceedings of the 21st ACM International Conference on Information and Knowledge Management, ser. CIKM '12. New York, NY, USA: ACM, 2012, pp. 2527–2530. [Online]. Available: http://doi.acm.org/10.1145/2396761.2398683

[15] Y. Koren, R. Bell, and C. Volinsky, "Matrix factorization techniques for recommender systems," Computer, vol. 42, no. 8, pp. 30–37, Aug. 2009.

[16] Y. Koren, "Factorization meets the neighborhood: a multifaceted collaborative filtering model," in Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ser. KDD '08. New York, NY, USA: ACM, 2008, pp. 426–434. [Online]. Available: http://doi.acm.org/10.1145/1401890.1401944

[17] ——, "Collaborative filtering with temporal dynamics," Commun. ACM, vol. 53, no. 4, pp. 89–97, Apr. 2010. [Online]. Available: http://doi.acm.org/10.1145/1721654.1721677

[18] R. Salakhutdinov and A. Mnih, "Probabilistic matrix factorization," in NIPS, 2007.

[19] D. Agarwal and B.-C. Chen, "Regression-based latent factor models," in Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ser. KDD '09. New York, NY, USA: ACM, 2009, pp. 19–28. [Online]. Available: http://doi.acm.org/10.1145/1557019.1557029

[20] Y. Koren, "Factorization meets the neighborhood: a multifaceted collaborative filtering model," in KDD, 2008, pp. 426–434.

[21] L. Bottou, "Stochastic learning," in Advanced Lectures on Machine Learning, ser. Lecture Notes in Artificial Intelligence, LNAI 3176, O. Bousquet and U. von Luxburg, Eds. Berlin: Springer Verlag, 2004, pp. 146–168. [Online]. Available: http://leon.bottou.org/papers/bottou-mlss-2004