

SUPPORT VECTOR MACHINE FOR STRUCTURAL RELIABILITY ANALYSIS
- Journal: Applied Mathematics and Mechanics (应用数学和力学)
- Authors: LI Hong-shuang, LU Zhen-zhou, YUE Zhu-feng
- Affiliation: School of Aeronautics
- Updated: 2020-12-06
Applied Mathematics and Mechanics (English Edition), 2006, 27(10):1295-1303
©Editorial Committee of Appl. Math. Mech., ISSN 0253-4827

SUPPORT VECTOR MACHINE FOR STRUCTURAL RELIABILITY ANALYSIS *

LI Hong-shuang (李洪双), LU Zhen-zhou (吕震宙), YUE Zhu-feng (岳珠峰)
(School of Aeronautics, Northwestern Polytechnical University, Xi'an 710072, P. R. China)
(Contributed by YUE Zhu-feng)

Abstract: Support vector machine (SVM) was introduced to analyze the reliability of the implicit performance function, which is difficult to implement by the classical methods such as the first order reliability method (FORM) and the Monte Carlo simulation (MCS). As a classification method where the underlying structural risk minimization inference rule is employed, SVM possesses excellent learning capacity with a small amount of information and good capability of generalization over the complete data. Hence, two approaches, i.e., SVM-based FORM and SVM-based MCS, were presented for the structural reliability analysis of the implicit limit state function. Compared with the conventional response surface method (RSM) and the artificial neural network (ANN), which are widely used to replace the implicit state function for alleviating the computation cost, the more important advantages of SVM are that it can approximate the implicit function with higher precision and better generalization under a small amount of information, and that it avoids the "curse of dimensionality".
The SVM-based reliability approaches can approximate the actual performance function over the complete sampling data with a decreased number of implicit performance function analyses (usually finite element analyses), and the computational precision can satisfy the engineering requirement, which is demonstrated by illustrations.

Key words: structural reliability; implicit performance function; support vector machine
Chinese Library Classification: TB114.3
2000 Mathematics Subject Classification: 62N05; 90B25
Digital Object Identifier (DOI): 10.1007/s10483-006-1001-2

Introduction

The reliability analysis of the implicit performance function is one of the challenges in the structural reliability discipline. In theory, any algorithm used for explicit performance function reliability evaluation may be adapted to deal with the implicit performance function[1,2]. However, many difficulties occurring in realistic problems cannot be conquered. The gradient-based reliability analysis methods[3-5] need to estimate the gradient, which is difficult to implement for implicit performance functions. And the simulation-based methods[2,4] cannot be accepted for their vast computational effort of implicit performance function calls, usually based on the finite element method. The regression methods[6] have been applied in structural reliability since the 1990s to alleviate these difficulties. In the presented regression methods, an explicit function intended to substitute the implicit performance function is fitted through planned or random samples under the empirical risk minimization (ERM) principle, and the
failure probability of the explicit function replaces that of the actual implicit performance function. The response surface method (RSM)[1,5-10] and the artificial neural network (ANN)[1,6,11-13] are good representatives of the regression methods. The regression methods have been successfully applied in many engineering cases, but the ERM principle severely restricts their range of application. The inflexible selection of the response surface function and the empirical parameters has not been resolved completely. In essence, the selection of topology structures and parameters, overtraining and local optimization are the main obstacles of ANN in reliability analysis.

* Received Dec. 26, 2005; Revised Jul. 9, 2006. Project supported by the National Natural Science Foundation of China, the National Astronautics Science Foundation of China (Nos. N3CH0502 and N5CH0001) and the Program for New Century Excellent Talents of the Ministry of Education of China (No. NCET-05-0868). Corresponding author LU Zhen-zhou, Professor, E-mail: zhenzhoulu@nwpu.edu.cn

Similar to ANN, the support vector machine (SVM)[14-17] is an intelligent learning method for pattern recognition. However, the theoretical bases of SVM and ANN are different. ANN is based on the ERM principle, while SVM's foundation is the structural risk minimization (SRM) principle. ANN can obtain the minimum risk for the training samples due to the application of the ERM principle, but for un-trained samples ANN may give unbelievable estimations. The SRM principle can improve the generalization ability of a learning machine by minimizing the total of the empirical risk and the confidence bound. Compared with ANN, SVM does not involve local optimization and has good generalization ability; therefore SVM is superior to ANN.

Rocco and Moreno[18] first introduced SVM to reliability analysis.
Hurtado and Alvarez[19] treated reliability analysis as a classification task and adopted SVM in conjunction with the stochastic finite element method to analyze structural reliability. Several classification methods were discussed by Hurtado[6] from the viewpoint of statistical learning theory, and it has been shown that only multi-layer perceptrons (MLP) and SVM could be used for structural reliability analysis, especially in the cases of small samples. From the investigation of the reliability literature, it is found that the application of SVM in reliability analysis needs to be researched deeply. In order to estimate the reliability of the implicit performance function often encountered in structural safety assessment, two SVM-based reliability analysis methods, i.e., SVM-based FORM and SVM-based MCS, are presented in this contribution, which are similar to the applications of ANN in structural reliability analysis[13].

1 Review on SVM

SVM provides a new learning algorithm which satisfies the SRM principle based on statistical learning theory. The classification and regression problems can be transformed into quadratic convex optimization problems by SVM, whose solutions are global optima and independent of the distributions of the samples.
In the following part, we briefly introduce the basic concepts of SVM[14-17].

1.1 Linear SVM

Given a set of training data

(x1, y1), (x2, y2), ..., (xl, yl),  xi ∈ R^n,  yi ∈ {+1, −1},  (1)

where xi is the training data, l is the number of training data, and yi is the class label for xi. Define a hyperplane that can separate the training data into two classes correctly as follows:

(w · x) + w0 = 0,  (2)

where (·) means the inner product of two vectors.

There may exist many possible hyperplanes which can separate linearly separable samples into two classes, but there is only one, the so-called maximal margin hyperplane or optimal hyperplane, which maximizes the distance between the hyperplane and the nearest sample points of each class; this distance is defined as the margin. According to the SRM principle, the optimization problem for the maximal margin hyperplane can be formulated as minimization of the following function:

min ||w||²/2 = (w · w)/2,
s.t. yi[(w · xi) + w0] ≥ 1,  i = 1, 2, ..., l.  (3)

Equation (3) describes a quadratic optimization problem, and according to the Lagrange principle, the above problem can be transferred to its corresponding form as follows:

L(w, w0, α) = (w · w)/2 − Σ_{i=1}^{l} αi {yi[(w · xi) + w0] − 1},  (4)

where αi ≥ 0 (i = 1, 2, ..., l) are the Lagrange multipliers. Minimizing the Lagrange function with respect to w and w0, we have the following equations:

Σ_{i=1}^{l} αi yi = 0,  (5)

w = Σ_{i=1}^{l} αi yi xi.  (6)

Furthermore, the optimal solution must satisfy the Karush-Kuhn-Tucker (KKT) conditions:

αi {yi[(w · xi) + w0] − 1} = 0,  i = 1, 2, ..., l.  (7)

From these conditions, it is concluded that only a small part of the training vectors in the expansion (6), denoted as support vectors (SV), can have nonzero coefficients αi in the expansion of w. Therefore we obtain

w = Σ_{i∈SV} αi yi xi.  (8)

Substituting Eqs.(5) and (6) into Eq.(4) and taking account of the KKT conditions, the dual optimization problem corresponding to the original problem is obtained as follows:

min (1/2) Σ_{i=1}^{l} Σ_{j=1}^{l} yi yj αi αj (xi · xj) − Σ_{j=1}^{l} αj,
s.t. Σ_{i=1}^{l} αi yi = 0,  αi ≥ 0,  i = 1, 2, ..., l.  (9)
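As a concrete numerical illustration of the dual problem (9), consider a two-point training set (the data are chosen purely for illustration, not taken from the paper). The equality constraint Σ αi yi = 0 forces the two multipliers to be equal, so the dual reduces to a one-dimensional minimization that a plain grid search can solve:

```python
import numpy as np

# Toy training set: one point per class (illustrative data only)
X = np.array([[1.0, 1.0], [-1.0, -1.0]])
y = np.array([1.0, -1.0])
G = (y[:, None] * y[None, :]) * (X @ X.T)   # y_i y_j (x_i . x_j)

def dual_objective(alpha):
    # (1/2) sum_ij y_i y_j a_i a_j (x_i . x_j) - sum_i a_i, cf. Eq. (9)
    return 0.5 * alpha @ G @ alpha - alpha.sum()

# sum_i alpha_i y_i = 0 forces alpha_1 = alpha_2 here, so search one value
grid = np.linspace(0.0, 1.0, 10001)
values = [dual_objective(np.array([a, a])) for a in grid]
alpha = grid[int(np.argmin(values))] * np.ones(2)

w = (alpha * y) @ X            # Eq. (6)/(8): w as a combination of support vectors
w0 = y[0] - w @ X[0]           # KKT condition (7) applied to a support vector
print(alpha, w, w0)
```

Both training points come out as support vectors here (α = 0.25 each, w = (0.5, 0.5), w0 = 0); on larger sets only a small fraction of the multipliers are nonzero, which is exactly what the expansion (8) exploits.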
After αi is solved, the parameter vector w of the maximal margin hyperplane is given by the linear combination of the support vectors in Eq.(8). w0 and the maximal margin hyperplane shown in Eq.(2) are also completely determined by the support vectors.

To construct the optimal hyperplane in the case that the samples are linearly nonseparable, slack variables ξi ≥ 0 are introduced and the corresponding dual optimization problem is expressed as

min (1/2) Σ_{i=1}^{l} Σ_{j=1}^{l} yi yj αi αj (xi · xj) − Σ_{j=1}^{l} αj,
s.t. Σ_{i=1}^{l} αi yi = 0,  C ≥ αi ≥ 0,  i = 1, 2, ..., l,  (10)

where C > 0 is a given penalty parameter that reflects the tradeoff between the complexity of the SVM model and the proportion of nonseparable samples.

1.2 Nonlinear SVM

The classification ability of the linear SVM is finite, and it often loses its practical value for non-linearly separable samples because of the big error. In this case, the nonlinear SVM should be introduced. The basic concept of the nonlinear SVM is that the input vectors are mapped into a high dimensional feature space by a nonlinear transform x → Φ(x), where one can then construct the maximal margin hyperplane. The mechanism of the nonlinear SVM is shown in Fig.1.

Fig.1 Mapping from the original non-linearly separable region to a linearly separable one[18]

Building an optimal hyperplane only involves computing the inner product of two vectors in the feature space. Kernel functions provide a simple and practical method that does not need to know the expressions of the mappings. Therefore the inner product of two vectors zi and zj in the feature space Z can be expressed as a function of two variables in the input space,

(zi · zj) = (Φ(xi) · Φ(xj)) = K(xi, xj),  (11)

where K(xi, xj) is a kernel function satisfying the Mercer condition[15-17]. Some kernels satisfying the Mercer condition are widely employed in classification, such as the polynomial kernel K(xi, xj) = [(xi · xj) + 1]^d (d ∈ N), the radial basis kernel K(xi, xj) = exp(−||xi − xj||²/(2σ²)) and the multilayer perceptron kernel K(xi, xj) = tanh[v(xi · xj) + a].
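The kernels above are direct to evaluate, and the Mercer condition can be probed numerically: a Mercer kernel must yield a symmetric positive semi-definite Gram matrix on any finite sample set. A minimal sketch (sample points and parameter values are arbitrary):

```python
import numpy as np

def poly_kernel(xi, xj, d=2):
    # Polynomial kernel K(xi, xj) = [(xi . xj) + 1]^d
    return (np.dot(xi, xj) + 1.0) ** d

def rbf_kernel(xi, xj, sigma=1.0):
    # Radial basis kernel K(xi, xj) = exp(-||xi - xj||^2 / (2 sigma^2))
    diff = np.asarray(xi) - np.asarray(xj)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

# Gram matrix of the RBF kernel on a handful of random points
rng = np.random.default_rng(0)
pts = rng.normal(size=(6, 2))
K = np.array([[rbf_kernel(a, b) for b in pts] for a in pts])
eigvals = np.linalg.eigvalsh(K)     # all eigenvalues should be >= 0
print(eigvals.min())
```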
After the kernel function is selected, the optimal hyperplane is a nonlinear function in the input space, which is equivalent to a linear function in the feature space for the nonlinearly separable training data. Hereby, the decision function which presents the separating rule can be written as an indicator function,

f(x) = sgn( Σ_{i∈SV} αi yi K(xi, x) + w0 ).  (12)

2 Reliability Analysis Methods Based on SVM

Denoting −1 for the sampling points located in the failure region (g(x) ≤ 0) and +1 for the sampling points located in the safe region (g(x) > 0), structural reliability problems can be transformed into classification ones. Hurtado[6] exposed the possibility of SVM being applied in structural reliability analysis from the viewpoint of statistical learning theory. The superiorities of SVM are described as: (i) it can estimate a function on the basis of a few samples; (ii) it uses flexible and adaptive models; (iii) it can overcome the curse of dimensionality. Combined with the classical reliability analysis methods, two new structural reliability analysis methods on the basis of SVM, i.e., SVM-based FORM and SVM-based MCS, are presented in this contribution. But the purposes of SVM in the two methods are not the same. The SVM-based MCS method uses SVM as a data classification machine, so that the output values only depend on the location of samples in the failure or safe regions. SVM in the SVM-based FORM method is employed to approximate the implicit limit state equation.

2.1 Sampling procedure

A primary issue of structural reliability analysis based on SVM is how to generate the training samples. Direct MCS generates samples from the distributions of the basic random variables. But it is not appropriate for many cases, especially for structural reliability analysis with a small failure probability, where there may be no samples located in the failure region even after a large number of samples are drawn.
In this case, samples without failure ones cannot serve as training data for a classification problem.

To obtain appropriate training data for the classification, one can generate samples from a uniform distribution within k standard deviations around the mean value of each basic random variable. For example, if a basic random variable is normally distributed with mean μ and standard deviation σ, i.e., X ~ N(μ, σ²), the samples can be distributed uniformly on the interval [μ − kσ, μ + kσ]. This sampling procedure ensures that the samples are located in a large range. Moreover, it satisfies the need for failure samples for training the SVM. If we exactly know the influence of a random variable on the structural reliability, the sampling range may be shrunk further to increase the number of failure samples. The right distribution tail of the load has a significant effect on the failure probability, so one can control the load samples to lie uniformly in the interval [μ, μ + kσ]. In turn, the left distribution tail of the strength has a significant effect on the failure probability, so one can generate the strength samples in the interval [μ − kσ, μ].

2.2 Data scaling

Data scaling is introduced to reduce the rounding error of computers, and hereby improve the stability of the SVM training process and the generalization ability. The ranges of the basic random variables have large distinctions due to different physical properties and dimensions, which results in instability of the SVM training process. Even though the SVM may be trained successfully, its generalization ability will be poor in this case. The following data scaling for the basic variables can alleviate the effect of different physical properties and dimensions on the training of SVM:

x'i = (xi − μi)/σi,  (13)

where xi is the ith sample of the basic random variable with mean μi and standard deviation σi, and x'i is the ith scaled sample corresponding to xi.

2.3 SVM-based MCS

The analysis procedure of the SVM-based MCS is shown in Fig.2.
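The sampling scheme of Section 2.1 and the scaling of Eq. (13) are simple to implement; a sketch in numpy (the means, standard deviations, k and sample count are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(42)

def uniform_training_samples(mu, sigma, k, n):
    # Draw n points uniformly on [mu - k*sigma, mu + k*sigma] per variable,
    # so failure samples appear even for small failure probabilities (Sec. 2.1)
    mu, sigma = np.atleast_1d(mu), np.atleast_1d(sigma)
    return rng.uniform(mu - k * sigma, mu + k * sigma, size=(n, mu.size))

def scale(x, mu, sigma):
    # Data scaling of Eq. (13): x' = (x - mu) / sigma
    return (x - mu) / sigma

mu = np.array([10.0, 2.0e7])        # illustrative means
sigma = np.array([0.4, 0.5e7])      # illustrative standard deviations
X = uniform_training_samples(mu, sigma, k=5, n=200)
Xs = scale(X, mu, sigma)            # scaled samples lie in [-k, k]
print(X.shape, Xs.min(), Xs.max())
```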
The SVM is used as a classification machine. Generating samples from the distributions of the random variables, the response values can be obtained from the successfully trained SVM model. Then the failure probability of the structure can be estimated by the following equation:

Pf = P(g(x) ≤ 0) ≈ P(f(x) ≤ 0) = Nf/N,  (14)

where g(x) is the actual implicit performance function, f(x) is the approximate performance function by SVM, N is the total number of samples, and Nf is the number of samples located in the failure region where f(x) ≤ 0.

Fig.2 Framework of SVM-based MCS: prepare the training samples and compute the response values → data scaling → select the SVM parameters → train the SVM model → calculate the failure probability from the SVM model by the MCS method

2.4 SVM-based FORM

SVM-based FORM employs the SVM to approximate the implicit performance function and its first order partial derivatives which are necessary for FORM. Generally, the approximate performance function by SVM is given as

f(x) = Σ_{i∈SV} αi yi K(xi, x) + w0,  (15)

where x is an n-dimensional random vector. Once the kernel function is determined, the partial derivatives of the performance function with respect to each variable can be calculated readily. For instance, if the d-order polynomial kernel function K(xi, x) = [(xi · x) + 1]^d is chosen as the kernel, the first order derivatives are calculated by

∂f/∂xj = Σ_{i∈SV} αi yi d[(xi · x) + 1]^(d−1) xij.  (16)

3 Examples

Three examples are taken to check the accuracy and efficiency of the presented methods. In order to observe the approximation effect, the examples are chosen to have explicit performance functions. All the results are listed in tables. The FORM column presents the results calculated by FORM, and the MCS, SVM-based FORM and SVM-based MCS columns are analogous.

Example 1 Quadratic limit state reliability analysis

Equation (17) is often taken to investigate the accuracy of implicit limit state reliability analysis methods[10]:

g(x) = 4 − (4/25)(x1 − 1)² − x2,  (17)

where x1 and x2 obey the standard normal distribution.
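Since the performance function of Example 1 is explicit, the MCS reference value in Table 1 can be reproduced directly; a sketch, reading Eq. (17) as g(x) = 4 − (4/25)(x1 − 1)² − x2 (the sample size and seed are arbitrary):

```python
import numpy as np

def g(x1, x2):
    # Quadratic limit state of Example 1, Eq. (17)
    return 4.0 - (4.0 / 25.0) * (x1 - 1.0) ** 2 - x2

rng = np.random.default_rng(1)
N = 2_000_000
x1 = rng.standard_normal(N)                   # x1 ~ N(0, 1)
x2 = rng.standard_normal(N)                   # x2 ~ N(0, 1)
pf = np.count_nonzero(g(x1, x2) <= 0.0) / N   # Pf ~ Nf / N, cf. Eq. (14)
print(pf)
```

The estimate lands near the MCS entry of Table 1 (8.0 × 10⁻⁴), while FORM, which linearizes at the design point, gives a somewhat smaller value, consistent with the FORM column.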
In terms of the sampling process presented in Section 2, 100 data points are drawn. A polynomial kernel is selected with the order d = 2, 3, 4, 5. The limit state surfaces predicted by SVM are shown in Fig.3. The solid lines represent the actual limit state equations, and the dashed lines represent the limit state equations approximated by SVM. The points marked with circles are the support vectors.

Fig.3 Approximation of the second order limit state equation for d = 2, 3, 4, 5 (design points marked)

The calculated failure probabilities are summarized in Table 1.

Table 1 Failure probabilities of Example 1 (×10⁻⁴)
FORM: 6.36;  MCS: 8.0
SVM-based FORM / SVM-based MCS:  d = 2: 4.35 / 5.67;  d = 3: 6.25 / 6.43;  d = 4: 6.32 / 4.94;  d = 5: 7.53 / –

It is observed from Fig.3 that the kernel parameter has some effect on the shape of the approximate limit state surface. The number of support vectors does not exceed 10% of the total number of training samples. Removing the non-support-vector samples will not change the calculation. As shown in Table 1, there exists a difference between the results of the presented methods and those of the MCS method. This difference is brought by the restricted sample range, within which the approximate limit state surface can substitute the actual limit state surface.

Example 2 Fourth order limit state reliability analysis

A fourth order performance function[19] is given as

g(x) = 2 + exp(−x1²/10) + (x1/5)⁴ − x2,  (18)

where x1 and x2 are standard normal random variables.
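Example 2 can also be cross-checked without sampling: conditioning on x1 gives Pf = E[Φ(−(2 + exp(−x1²/10) + (x1/5)⁴))], a one-dimensional integral that trapezoidal quadrature evaluates cheaply (the integration range and step are arbitrary choices):

```python
import numpy as np
from math import erf, sqrt, pi

def Phi(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

x = np.linspace(-8.0, 8.0, 3201)                    # grid for x1
t = 2.0 + np.exp(-x ** 2 / 10.0) + (x / 5.0) ** 4   # failure threshold for x2, Eq. (18)
pdf = np.exp(-x ** 2 / 2.0) / sqrt(2.0 * pi)        # standard normal density of x1
f = pdf * np.array([Phi(-ti) for ti in t])          # P(failure | x1) * density
pf = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))   # trapezoidal rule
print(pf)
```

The value agrees with the MCS entry of Table 2 (1.84 × 10⁻³); the FORM entry corresponds to β = 3, the distance to the design point (0, 3).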
The approximations of the limit state equations are shown in Fig.4, and the results are tabulated in Table 2.

Fig.4 Approximation of the fourth order limit state equation for d = 2, 3, 4, 5 (design points marked)

Table 2 Failure probabilities of Example 2 (×10⁻³)
FORM: 1.35;  MCS: 1.84
SVM-based FORM / SVM-based MCS:  d = 2: 2.09 / 2.25;  d = 3: 1.60 / 1.90;  d = 4: 1.50 / 1.91;  d = 5: 1.54 / –

Example 3 Reliability analysis of a three-span continuous beam

Consider the reliability analysis of the three-span continuous beam with L = 5 m shown in Fig.5. The performance function is defined as the maximal deflection of the three-span beam not exceeding L/360.

Fig.5 Schematic of the three-span continuous beam

The limit state is given as

g(w, E, I) = L/360 − 0.0069wL⁴/(EI),  (19)

where w denotes the distributed load, E is the modulus of elasticity and I is the moment of inertia. The basic variables are normally distributed, and their distribution parameters are given in Table 3. All the variables are independent of each other. The analytical solution of the failure probability can be found in Nowak and Collins[4] (the corresponding reliability index is β = 3.173). Table 4 is a summary of the results. It is shown that the results calculated by the presented methods satisfy the requirements of engineering.

Table 3 Distribution parameters of the basic variables in Example 3
Random variable | Mean | Standard deviation
w | 10 kN/m | 0.4 kN/m
E | 2×10⁷ kN/m² | 0.5×10⁷ kN/m²
I | 8×10⁻⁴ m⁴ | 1.5×10⁻⁴ m⁴

Table 4 Failure probabilities of Example 3 (×10⁻⁴)
FORM: 7.5436;  MCS: 8.960;  SVM-based FORM (d = 3): 8.3778;  SVM-based MCS (d = 3): 8.6530

4 Conclusions

Two approaches based on SVM for the reliability analysis of structures were introduced in this paper. SVM satisfies the SRM principle, which can guarantee a small prediction error for non-training data. The examples indicate that the proposed methods are good alternative reliability analysis methods for their easy implementation and accuracy. The following conclusions can be
The following conclusions can bedrawn from this study:(i) The SVM can be used as not only classification machine but also limit state approximatorfor the reliability analysis.(i) The selection of kernel parameters has a great effect on computing time and accuracy ofthe failure probability, and deserves deep research. In中国煤化工tion dependson cross-validation technique.(ii) The resolution of SVM is a convex quadraticMHCNMH G there isnoproblem of local optimization when approximating performance functions. It improves thconfidence and accuracy of the calculated results.Support Vector Machine for Structural Reliability Analysis1303References[1] Gomes H M, Awruch A M. Comparison of response surface and neural network with other methodsfor structural reliability analysis[J] Structural Safety, 2004, 26(1):49 67.[2] Schueremans L, Gemert D V. Beneft of splines and neural networks in simulation based structuralreliability analysis[J] Structural Safety, 2005, 27(3):246 -261.13] Rackwitz R. Reliability analysis- -a review and some perspectives[J]. Structural Safety, 2001,23(4):365- -395[4] Nowak A R, Collins K R. Reliability of Structures[M]. McGraw-Hil, Boston, 2000.[5] Zhao Y G,Ono T. A general procedure for frst/second-order reliability method(FORM/SORM)[J]. Structural Safety, 1999, 21(2):95 -112.[6] Hurtado J E. An examination of methods for approximating implicit limit state functions fromthe viewpoint of statistical learning theory[J]. Structural Safety, 2004, 26(3):271-293.[7] Bucher C G, Bourgund U. A fast and eficient response surface approach for structural reliabilityproblems[J]. Structural Safety, 1990, 7(1):57-66.8] Rajashekhar M R, Ellingwood B R. A new look at the response surface approach for reliabilityanalysis[J]. Structural Safety, 1993, 12(3):205 -220.[9] Kim S, Na S. Response surface method using vector projected sampling points[J]. StructuralSafety, 1997, 19(1): 3-19.[10] Guan X L, Melchers R E. 
Effect of response surface parameter variation on structural reliability estimates[J]. Structural Safety, 2001, 23(4):429-444.
[11] Hurtado J E, Alvarez D A. Neural-network-based reliability analysis: a comparative study[J]. Computer Methods in Applied Mechanics and Engineering, 2001, 191(1/2):113-132.
[12] Papadrakakis M, Lagaros N D. Reliability-based structural optimization using neural networks and Monte Carlo simulation[J]. Computer Methods in Applied Mechanics and Engineering, 2002, 191(32):3491-3507.
[13] Deng J, Gu D S, Li X B, et al. Structural reliability analysis for implicit performance functions using artificial neural network[J]. Structural Safety, 2005, 27(1):25-48.
[14] Cortes C, Vapnik V N. Support vector networks[J]. Machine Learning, 1995, 20(3):273-297.
[15] Vapnik V N. An overview of statistical learning theory[J]. IEEE Transactions on Neural Networks, 1999, 10(5):988-998.
[16] Vapnik V N. The Nature of Statistical Learning Theory[M]. Springer-Verlag, New York, 1995.
[17] Deng Naiyang, Tian Yingjie. A New Method for Data Mining: Support Vector Machine[M]. Science Press, Beijing, 2004 (in Chinese).
[18] Rocco C M, Moreno J A. Fast Monte Carlo reliability evaluation using support vector machine[J]. Reliability Engineering and System Safety, 2002, 76(3):237-243.
[19] Hurtado J E, Alvarez D A. Classification approach for reliability analysis with stochastic finite-element modeling[J]. Journal of Structural Engineering, 2003, 129(8):1141-1149.