K. R. Sudha et al/ International Journal of Engineering Science and Technology Vol. 2(3), 2010, 442-447

ADAPTIVE POWER SYSTEM STABILIZER USING SUPPORT VECTOR MACHINE

K.R. Sudha (1), Y. Butchi Raju (2), Prasad Reddy P.V.G.D (3)

(1) Department of Electrical Engineering, AU College of Engineering, Andhra University, Visakhapatnam, India, [email protected]
(2) Department of Electrical and Electronics Engineering, Sir C.R. Reddy College of Engineering, Eluru, India, [email protected]
(3) Department of Computer Science and Systems Engineering, AU College of Engineering, Andhra University, Visakhapatnam, India, [email protected]

Abstract: A Power System Stabilizer (PSS) must be capable of providing appropriate stabilization signals over a broad range of operating conditions and disturbances. Traditional power system stabilizers rely on linear design methods. In the present paper, a novel approach for on-line adaptive tuning of the parameters of a Support Vector Machine based Power System Stabilizer (SVMPSS) using a sigmoidal kernel is presented. The proposed SVMPSS is trained over a wide range of operating conditions. The simulation results of the proposed SVMPSS are compared with those of a conventional stabilizer for a Single Machine Infinite Bus (SMIB) system. The effect of system parameter variations on the performance of the proposed stabilizer is also examined. The results show the robustness of the proposed SVMPSS and its ability to enhance system damping over a wide range of operating conditions and system parameter variations.

Key words: Support Vector Machine, Power system stabilizer, Dynamic stability.

1. INTRODUCTION

The basic function of a PSS is to extend the stability limits that are characterized by lightly damped or spontaneously growing oscillations in the 0.2 to 2.5 Hz frequency range [1]. This is accomplished via excitation control, which contributes damping to the system modes of oscillation. Consequently, it is the stabilizer's ability to enhance damping under the least stable conditions that matters most. A PSS can be applied most effectively if it is tuned with an understanding of the characteristics of the associated power system. Considerable effort has therefore been directed towards developing an adaptive PSS.

Machine learning is a subfield of Artificial Intelligence concerned with the development of techniques and methods that enable a computer to learn; in simple terms, the development of algorithms that enable a machine to learn and to perform tasks and activities. Machine learning overlaps with statistics in many ways, and over time many techniques and methodologies have been developed for the design of PSSs using machine learning [6][7][13]. The foundations of Support Vector Machines (SVM) were developed by Vapnik [5][13]. The SVM can be applied to both classification and regression problems; the SVM for regression is called Support Vector Regression (SVR). The SVM has a solid mathematical foundation in statistical learning theory (Vapnik-Chervonenkis (VC) theory) [7][13]. A major goal of VC theory is to characterize the generalization error instead of the error on specific data sets, which enables the SVM to generalize well to unseen data. Unlike conventional regression techniques, SVR attempts to minimize an upper bound on the generalization error, based on the principle of structural risk minimization, rather than minimizing the training error. This approach is expected to perform better than the empirical risk minimization principle employed in conventional approaches.
Moreover, the SVR problem is a convex optimization, which ensures that any local minimum is also the global minimum. The support vector machine has three key features: 1) better generalization capability;

ISSN: 0975-5462


2) a global optimal solution via optimization theory; 3) kernel functions for handling nonlinearity. There are many linear classifiers (hyperplanes) that separate the data; however, only one of them achieves maximum separation.

Advantages and Limitations of SVR: Support vector regression (SVR) has many attractive features: 1. it can model non-linear relationships; 2. it selects only the necessary data points (support vectors) to construct the regression function, which results in a sparse solution; 3. the regression function is obtained from a quadratic programming (QP) problem, which in general has a unique global solution; 4. VC theory characterizes the generalization error, which enables SVR to generalize well to unseen data; 5. the SVR technique can be used when there are fewer samples than variables (the so-called "small n, large p" problem).

A power system is a highly non-linear system operating over a wide range of operating conditions. The ability of the SVM to model non-linear relationships is exploited here in designing a PSS that operates over a wide range of operating conditions. The dynamic responses following a perturbation in excitation and a perturbation in torque for a single machine connected to an infinite bus bar are observed.

2. SYSTEM MODEL

The small-perturbation block diagram of a synchronous machine connected to an infinite bus system [2] is considered. The exciter is assumed to be of the thyristor type. Amortisseur effects, armature resistance, armature p-psi terms and saturation are neglected. The linearized model parameters K1 to K6 vary with operating conditions, with the exception of K3. The stabilization problem is to design a stabilizer which provides supplementary stabilizing signals to increase the damping torque of the system. An infinite bus is a source of constant frequency and of constant voltage, in both magnitude and angle. A schematic representation of this system is shown in Fig. 1.

Fig. 1. Single machine connected to infinite bus bar

Fig. 2. Block diagram representation of the system
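The linearized model behind Fig. 2 can be sketched in state-space form; a minimal Python sketch, assuming the standard four-state Heffron-Phillips structure with illustrative K1-K6, H, Tdo, KE and TE values (placeholders, not the constants computed in the paper):

```python
import numpy as np

# States: [d_delta, d_omega, d_Eq', d_Efd]
# Illustrative constants; K1..K6 vary with operating point (K3 fixed).
K1, K2, K3, K4, K5, K6 = 1.0, 1.2, 0.3, 1.7, -0.05, 0.5
H, Tdo, KE, TE = 5.0, 6.0, 200.0, 0.05
w0, D = 2 * np.pi * 50, 0.0  # synchronous speed (rad/s), damping

A = np.array([
    [0.0,        w0,        0.0,             0.0],
    [-K1/(2*H),  -D/(2*H),  -K2/(2*H),       0.0],
    [-K4/Tdo,    0.0,       -1.0/(K3*Tdo),   1.0/Tdo],
    [-KE*K5/TE,  0.0,       -KE*K6/TE,       -1.0/TE],
])

# The eigenvalues of A are the small-signal modes; a positive real part
# in the electromechanical pair indicates the growing oscillations that
# a PSS must damp.
eigs = np.linalg.eigvals(A)
print(np.sort(eigs.real))
```

The sign of K5, which turns negative at heavy loading, is what pushes the electromechanical mode toward instability under high exciter gain.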


Conventional Lead-Lag PSS (CPSS): Electromechanical oscillations of the generator are damped by a compensator, one type of which is shown in Fig. 3.

Fig. 3. Block diagram of the conventional PSS

Here KS is the stabilization gain, TW is the washout time constant, T1 is the transient time constant and T2 is the steady-state time constant. Generally, we take TW = 2 and T2 = 0.05.

3. KERNEL FUNCTION IN SUPPORT VECTOR MACHINE

The idea of the kernel function is to enable operations to be performed in the input space rather than in the potentially high-dimensional feature space, so the inner product need not be evaluated in the feature space explicitly. We want the function to perform the mapping of the attributes of the input space to the feature space. The kernel function plays a critical role in the SVM and its performance; it is based upon reproducing kernel Hilbert spaces [10][12][13]:

K(x, x') = \langle \phi(x), \phi(x') \rangle   (1)

If K is a symmetric positive definite function which satisfies Mercer's conditions,

K(x, x') = \sum_m \alpha_m \phi_m(x) \phi_m(x'),   \alpha_m \ge 0   (2)

\iint K(x, x') g(x) g(x') \, dx \, dx' \ge 0,   \forall g \in L_2   (3)

Then the kernel represents a legitimate inner product in the feature space. A training set that is not linearly separable in the input space may become linearly separable in the feature space.

Choosing the Kernel and the Corresponding Parameters: In the SVR algorithm, the kernel function, its parameters, and the two SVR training parameters C and epsilon of the epsilon-insensitive loss function play a key role in SVR performance. Parameter C is the trade-off between model complexity (flatness) and the degree of deviation allowed in the optimization formulation. Parameter epsilon controls the width of the epsilon-insensitive zone, which affects the number of support vectors used to construct the regression function. The kernel function is the mapping instrument that transforms the non-linear input space into a high-dimensional feature space where linear regression is possible. The mapping depends on the intrinsic topological structure of the data and on application-domain knowledge. This implies that the kernel type and all parameters need to be chosen optimally to obtain the best performance. However, there are no structured methods for efficiently selecting the kernel and all its parameters. Moreover, because the C and epsilon values affect the model in different ways, and because the kernel parameters and C are interdependent, the kernel function and the parameters cannot be chosen separately. Several kernel functions, with their associated mappings, are found in the literature.
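The role of epsilon can be illustrated with the epsilon-insensitive loss itself; a minimal Python sketch with illustrative values (not the paper's tuned parameters):

```python
def eps_insensitive_loss(y_true, y_pred, eps):
    """Epsilon-insensitive loss: zero inside the eps-tube, linear outside."""
    dev = abs(y_true - y_pred)
    return max(0.0, dev - eps)

# Deviations inside the tube incur no loss (a wider tube means fewer
# support vectors); deviations outside it are penalized linearly and
# scaled by C in the SVR objective.
print(eps_insensitive_loss(2.0, 1.75, eps=0.5))  # 0.0 (inside the tube)
print(eps_insensitive_loss(2.0, 1.0, eps=0.5))   # 0.5 (1.0 deviation - 0.5)
```

Widening the tube (larger epsilon) trades accuracy for sparsity, which is why epsilon and C must be tuned together.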


1] Polynomial: a polynomial mapping is a popular method for non-linear modeling:

K(x, x') = \langle x, x' \rangle^d   (4)

K(x, x') = (\langle x, x' \rangle + 1)^d   (5)

The second kernel (5) is usually preferable, as it avoids problems with the Hessian becoming zero.

2] Gaussian Radial Basis Function:

K(x, x') = \exp\left( -\frac{\| x - x' \|^2}{2\sigma^2} \right)   (6)

3] Exponential Radial Basis Function: this radial basis function produces a piecewise linear solution, which can be attractive when discontinuities are acceptable:

K(x, x') = \exp\left( -\frac{\| x - x' \|}{2\sigma^2} \right)   (7)

4] Sigmoidal kernel: the long-established MLP with a single hidden layer also has a valid kernel representation,

K(x, x') = \tanh(\rho \langle x, x' \rangle + \theta)   (8)

for certain values of the scale \rho and offset \theta parameters. Here the support vectors correspond to the first layer and the Lagrange multipliers to the weights.

SVM and non-Positive Semi-Definite (PSD) sigmoid kernels: In the SVM framework, nonlinear classifiers require the use of a symmetric and positive semi-definite (PSD) kernel, which transforms the input-space samples into a high-dimensional (possibly infinite-dimensional) feature space in which a linear classification is performed [11]. Many kernels are available, such as linear, polynomial, Gaussian, or sigmoid-shaped functions. The sigmoid kernel has been proposed for SVMs on theoretical grounds, owing to its origin in neural networks, but to date it has not been used extensively in practice because it is a PSD kernel only for some combinations of its free parameters (the slope and bias of the function) [2,12]. Given that sigmoidal activation functions have been widely shown to provide useful global classifiers in neural networks, their inclusion in the SVM framework is an interesting and open issue. In the present paper, the sigmoidal kernel is utilized for the design of a PSS that operates over a wide range of operating conditions.

4. NUMERICAL SIMULATION

To evaluate the stability of the SVMPSS over a wide range of operating conditions, we consider a typical example of a generator connected to an infinite bus bar [2][8][9] with the data xd = 1.6, xd' = 0.32, xq = 1.55, Tdo = 6, H = 5, xe = 0.4, Te = 0.05, Ke = 200. The SVM is trained initially with 29 different loading conditions. The simulations were carried out for all the operating conditions; the results of two cases, P + jQ = 1.0 + j0.7 (the most unstable case) and P + jQ = 0.7 - j0.5 (a lightly loaded, underdamped case), are presented in Figs. 3-6. The simulations were carried out for a 0.1 pu step change in the excitation reference. The plots show the speed and rotor-angle responses to this disturbance for the system with the designed SVMPSS and with the conventional PSS.
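The training step can be sketched with scikit-learn's SVR and a sigmoid kernel standing in for the paper's SVMPSS training; since the paper's actual training targets are not given, a synthetic placeholder mapping from operating point (P, Q) to a stabilizer gain is assumed:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# 29 illustrative operating conditions (P, Q) in pu, matching the size
# of the paper's training set; the values themselves are placeholders.
X_train = np.column_stack([rng.uniform(0.3, 1.2, 29),    # P
                           rng.uniform(-0.5, 0.9, 29)])  # Q
# Hypothetical target: a gain that varies smoothly with loading.
y_train = 10.0 + 5.0 * X_train[:, 0] - 2.0 * X_train[:, 1]

# Sigmoid kernel K(x, x') = tanh(gamma * <x, x'> + coef0); the values of
# C, epsilon, gamma and coef0 here are illustrative, not the paper's.
svr = SVR(kernel="sigmoid", gamma=0.1, coef0=0.0, C=100.0, epsilon=0.01)
svr.fit(X_train, y_train)

# Predict the gain at the two reported operating points.
pred = svr.predict([[1.0, 0.7], [0.7, -0.5]])
print(pred)
```

Only the data points that end up as support vectors shape the regression function, so the fitted stabilizer interpolates between the trained operating conditions.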


Fig. 3. Rotor angle response (change in delta vs. time in secs) for (P, Q) = (1.0, 0.7); with conventional PSS; with SVMPSS

Fig. 4. Speed deviation response (change in speed, x 10^-4, vs. time in secs) for (P, Q) = (1.0, 0.7); with conventional PSS; with SVMPSS

Fig. 5. Rotor angle response (change in delta vs. time in secs) for (P, Q) = (0.7, -0.5); with conventional PSS; with SVMPSS

Fig. 6. Speed deviation response (change in speed, x 10^-4, vs. time in secs) for (P, Q) = (0.7, -0.5); with conventional PSS; with SVMPSS


5. CONCLUSIONS

In this study, a Support Vector Machine based Power System Stabilizer (SVMPSS) using a sigmoidal kernel is presented, which adapts the PSS parameters to improve power system dynamic stability. Time-domain simulations of the system with the SVMPSS give good speed-deviation and rotor-angle responses under different types of loading condition. The results show that the SVMPSS yields shorter settling times and smaller overshoots than the conventional PSS.

REFERENCES:

[1] E.V. Larsen and D.A. Swann, "Applying power system stabilizers: general concepts, Parts I, II, III", IEEE Transactions on Power Apparatus and Systems, Vol. PAS-100, No. 6, June 1981, pp. 3017-3044.
[2] F.P. deMello and C. Concordia, "Concepts of synchronous machine stability as affected by excitation control", IEEE Transactions on Power Apparatus and Systems, Vol. PAS-88, April 1969, pp. 316-329.
[3] Wikipedia Online. Http://en.wikipedia.org/wiki
[4] Tutorial slides by Andrew Moore. Http://www.cs.cmu.edu/~awm
[5] V. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1995. ISBN 0-387-94559-8.
[6] C. Burges, "A tutorial on support vector machines for pattern recognition", in Data Mining and Knowledge Discovery, Vol. 2, Kluwer Academic Publishers, Boston, 1998.
[7] V. Vapnik, S. Golowich, and A. Smola, "Support vector method for function approximation, regression estimation, and signal processing", in M. Mozer, M. Jordan, and T. Petsche (eds.), Advances in Neural Information Processing Systems 9, pp. 281-287, MIT Press, Cambridge, MA, 1997.
[8] K.R. Sudha and K.A. Gopala Rao, "Analytical structure of three-input fuzzy PID power system stabilizer with decoupled rules", WSEAS Transactions on Circuits and Systems, June 2004, pp. 965-970.
[9] K.R. Sudha and K.A. Gopala Rao, "Analytical structure of PID fuzzy logic power system stabilizer", Proceedings of IASTED Modelling, Simulation, and Optimization (MSO 2004), August 2004.
[10] V. Vapnik, Statistical Learning Theory, Wiley, New York, 1998.
[11] M.A. Aizerman, E.M. Braverman, and L.I. Rozonoer, "Theoretical foundations of the potential function method in pattern recognition learning", Automation and Remote Control, 25:821-837, 1964.
[12] N. Aronszajn, "Theory of reproducing kernels", Trans. Amer. Math. Soc., 68:337-404, 1950.
[13] C. Cortes and V. Vapnik, "Support-vector networks", Machine Learning, 20:273-297, 1995.

