
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 10, NO. 2, MARCH 1999

Letters

Augmented Hopfield Network for Mixed-Integer Programming

Michael P. Walsh, Meadhbh E. Flynn, and Mark J. O'Malley

Manuscript received October 31, 1996; revised September 19, 1998. The authors are with the Department of Electronic and Electrical Engineering, University College Dublin, Dublin 4, Ireland. Publisher Item Identifier S 1045-9227(99)01905-0.

1045–9227/99$10.00 © 1999 IEEE

Abstract—Watta and Hassoun recently proposed a coupled gradient neural network for mixed-integer programming. In this network continuous neurons were used to represent discrete variables. For the larger temporal problems they attempted, many of the solutions found were infeasible. This letter proposes an augmented Hopfield network which is similar to the coupled gradient network proposed by Watta and Hassoun. However, in this network truly discrete neurons are used. It is shown that this network can be applied to mixed-integer programming. Results illustrate that feasible solutions are now obtained for the larger temporal problem.

I. INTRODUCTION

This letter proposes an augmented Hopfield network which is similar to the coupled gradient network proposed by Watta and Hassoun [1]. In Section II it is shown that the action of this augmented network is to minimize an energy function with discrete and continuous terms and that neurons with discrete transfer functions may be used. The augmented network is applied to the temporal unit commitment problem in Section III and results are provided in Section IV. Section V discusses the advantages of the proposed network over the model proposed by Watta and Hassoun and also compares it with other methods. Conclusions are drawn in Section VI.

II. AUGMENTED HOPFIELD NETWORK

In the augmented Hopfield network there are two sets of neurons, a set of discrete neurons and a set of continuous neurons. The discrete neurons use the discrete transfer function $g_d$, given by

$$V_{dij} = g_d(U_{dij}) = \begin{cases} 1, & \text{if } U_{dij} > 0 \\ 0, & \text{if } U_{dij} < 0 \\ \text{no change in } V_{dij}, & \text{if } U_{dij} = 0 \end{cases} \qquad (1)$$

where $U_{dij}$ and $V_{dij}$ are the input and output, respectively, of discrete neuron $dij$. Continuous neurons use the Sigmoid transfer function $g_c$

$$V_{cij} = g_c(U_{cij}) = \tfrac{1}{2}\left(1 + \tanh(\lambda U_{cij})\right) \qquad (2)$$

where $U_{cij}$ and $V_{cij}$ are the input and output, respectively, of continuous neuron $cij$ and $\lambda$ is a scaling factor known as the slope. All neurons have an input bias: $I_{dij}$ for discrete neuron $dij$ and $I_{cij}$ for continuous neuron $cij$. The standard interconnection matrix $T$ is used, giving connections between all neurons, discrete and continuous. For example, $T_{dij\to ckm}$ is the connection from the discrete neuron $dij$ to the continuous neuron $ckm$. However, a new form of interconnection (a matrix $W$) between neuron pairs is introduced. For example, $W_{ij\to km}$ is the connection from neuron pair $(cij, dij)$ to neuron pair $(ckm, dkm)$.

The dynamics of the augmented model are defined by (3) and (4) below

$$\frac{dU_{cij}}{dt} = \sum_{k,m} T_{ckm\to cij} V_{ckm} + \sum_{k,m} T_{dkm\to cij} V_{dkm} + I_{cij} + \sum_{k,m} W_{km\to ij} V_{dij} V_{dkm} V_{ckm} \qquad (3)$$

$$U_{dij} = \sum_{k,m} T_{ckm\to dij} V_{ckm} + \sum_{k,m} T_{dkm\to dij} V_{dkm} + I_{dij} + \tfrac{1}{2} T_{dij\to dij}\,\Theta_{ij} + \tfrac{1}{2} W_{ij\to ij}\,\Theta_{ij} V_{cij}^2 + \sum_{k,m} W_{km\to ij} V_{ckm} V_{dkm} V_{cij} \qquad (4)$$

where

$$\Theta_{ij} = -1, \ \text{if } V_{dij} = 1; \qquad \Theta_{ij} = 1, \ \text{if } V_{dij} = 0.$$

The following energy function is proposed for the augmented Hopfield network:

$$E = -\tfrac{1}{2}\sum_{i,j}\sum_{k,m} T_{ckm\to cij} V_{cij} V_{ckm} - \sum_{i,j}\sum_{k,m} T_{dkm\to cij} V_{cij} V_{dkm} - \tfrac{1}{2}\sum_{i,j}\sum_{k,m} T_{dkm\to dij} V_{dij} V_{dkm} - \sum_{i,j} I_{cij} V_{cij} - \sum_{i,j} I_{dij} V_{dij} - \tfrac{1}{2}\sum_{i,j}\sum_{k,m} W_{km\to ij} V_{dij} V_{cij} V_{dkm} V_{ckm}. \qquad (5)$$

It is now shown that the dynamics of the system, (3) and (4), cause the energy function (5) to decrease, if discrete neurons are updated asynchronously. First consider the rate of change of the energy function due to a change in the output of continuous neuron $cij$

$$\frac{dE}{dt} = -\left[\sum_{k,m} T_{ckm\to cij} V_{ckm} + \sum_{k,m} T_{dkm\to cij} V_{dkm} + I_{cij} + \sum_{k,m} W_{km\to ij} V_{dij} V_{dkm} V_{ckm}\right]\frac{dV_{cij}}{dt}. \qquad (6)$$

The bracketed term is $dU_{cij}/dt$ of (3), so

$$\frac{dE}{dt} = -\frac{dU_{cij}}{dt}\,\frac{dV_{cij}}{dt} \qquad (7)$$

and, from (2),

$$\frac{dV_{cij}}{dt} = \frac{dg_c}{dU_{cij}}\,\frac{dU_{cij}}{dt} \qquad (8)$$

so that

$$\frac{dE}{dt} = -\frac{dg_c}{dU_{cij}}\left(\frac{dU_{cij}}{dt}\right)^2. \qquad (9)$$

This is always nonpositive as $g_c$ is monotonic increasing.
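The descent property can be checked numerically on a toy network. The sketch below is illustrative only: the network size, weights, biases, and step size are invented, and a NumPy Euler integration stands in for the analog continuous dynamics. It integrates (3) with transfer function (2) for the continuous neurons, then applies the asynchronous discrete rule (1) with input (4), asserting that the energy (5) never increases across a discrete update.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # number of (continuous, discrete) neuron pairs, flattened over (i, j)

# Symmetric interconnection matrices (random illustrative values)
Tcc = rng.normal(size=(n, n)); Tcc = (Tcc + Tcc.T) / 2
Tdd = rng.normal(size=(n, n)); Tdd = (Tdd + Tdd.T) / 2
Tcd = rng.normal(size=(n, n))                     # Tcd[p, q] = T_{d_q -> c_p}; T_{c_q -> d_p} = Tcd.T[p, q]
W = rng.normal(size=(n, n)); W = (W + W.T) / 2    # pair-to-pair connections
Ic = rng.normal(size=n)
Id = rng.normal(size=n)
lam = 1.0                                         # sigmoid slope

def energy(Vc, Vd):
    """Energy function (5); P holds the neuron-pair products V_d * V_c."""
    P = Vd * Vc
    return (-0.5 * Vc @ Tcc @ Vc - Vc @ Tcd @ Vd - 0.5 * Vd @ Tdd @ Vd
            - Ic @ Vc - Id @ Vd - 0.5 * P @ W @ P)

# --- continuous phase: Euler-integrate (3) with transfer function (2) ---
Uc = rng.normal(size=n)
Vc = 0.5 * (1 + np.tanh(lam * Uc))
Vd = (rng.random(n) > 0.5).astype(float)
E0 = energy(Vc, Vd)
dt = 0.005
for _ in range(500):
    dUc = Tcc @ Vc + Tcd @ Vd + Ic + Vd * (W @ (Vd * Vc))  # eq. (3)
    Uc += dt * dUc
    Vc = 0.5 * (1 + np.tanh(lam * Uc))                     # eq. (2)
E1 = energy(Vc, Vd)

# --- discrete phase: asynchronous updates via (4) and rule (1) ---
for p in range(n):
    theta = 1.0 - 2.0 * Vd[p]            # Theta = -1 if V_d = 1, +1 if V_d = 0
    Ud = ((Tcd.T @ Vc)[p] + (Tdd @ Vd)[p] + Id[p]
          + 0.5 * Tdd[p, p] * theta
          + 0.5 * W[p, p] * theta * Vc[p] ** 2
          + (W @ (Vd * Vc))[p] * Vc[p])  # eq. (4)
    before = energy(Vc, Vd)
    if Ud > 0:
        Vd[p] = 1.0
    elif Ud < 0:
        Vd[p] = 0.0                      # Ud == 0 leaves V_d unchanged, as in (1)
    dE = energy(Vc, Vd) - before
    assert dE <= 1e-12                   # each update gives dE = -Ud * dV <= 0
```

The half-self-connection terms in (4) are what make the exact relation $\Delta E = -U_{dij}\,\Delta V_{dij}$ hold for binary flips, so the assertion checks the convergence argument rather than an approximation.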

Consider the change in energy (5) due to discrete neuron $dij$ changing its state at iteration $n$

$$\Delta E = -\left[\sum_{k,m} T_{ckm\to dij} V_{ckm} + \sum_{k,m} T_{dkm\to dij} V_{dkm} + I_{dij} + \tfrac{1}{2} T_{dij\to dij}\,\Delta V_{dij} + \tfrac{1}{2} W_{ij\to ij}\,\Delta V_{dij} V_{cij}^2 + \sum_{k,m} W_{km\to ij} V_{ckm} V_{dkm} V_{cij}\right]\Delta V_{dij}. \qquad (10)$$

Hence if

$$\Delta V_{dij} = 0 \;\Rightarrow\; \Delta E = 0 \qquad (11)$$

otherwise, by definition of $\Theta_{ij}$,

$$\Delta V_{dij} = \Theta_{ij} \quad \text{if } \Delta V_{dij} \neq 0. \qquad (12)$$

Substituting (12) into (10), the bracketed term becomes $U_{dij}$ of (4). Hence we have

$$\Delta E = -U_{dij}\,\Delta V_{dij}. \qquad (13)$$

By arguments similar to those in [2] this quantity is never positive. Hence the energy function (5) is nonincreasing. This proves that the action of this system is to seek out a minimum of the energy function (5) as required. It also proves that it is valid to use truly discrete neurons within the framework of the augmented Hopfield network. This has important implications for mixed-integer programming.

The augmented Hopfield network, defined above, is similar to the coupled gradient network defined by Watta and Hassoun [1]. However, in the coupled gradient network, neurons with the Sigmoid nonlinearity were used to represent discrete variables, i.e., the "integrality" constraints were relaxed. When the network converged, the outputs of these continuous neurons were "thresholded" to yield discrete results. It has been shown above that it is valid to represent discrete variables by discrete neurons in the augmented Hopfield network. This means that it is not necessary to relax "integrality" constraints during computation. It is proposed that this more accurate representation of integer variables will lead to improved performance.

III. GENERATOR SCHEDULING PROBLEM

In this section a cost function is developed for the unit commitment problem of a power system with thermal units. The cost function consists of the objective function (fuel costs, idling costs, and start-up costs) and equality constraints. The cost function is written in terms of the variables $V_{dij}$ and $V_{cij}$, where $V_{dij}$ represents the status of the $i$th generator at time $j$ (i.e., $V_{dij} = 1$ if the generator is on-line and $V_{dij} = 0$ if the generator is off-line) and where $V_{cij}$ is the power output of the generator if it is on-line. Hence the product $V_{dij} V_{cij}$ always represents the actual power output of the $i$th generator at time $j$. The various terms in the cost function are scaled by weighting coefficients $A, B$ as in previous work [5]. The objective function (14) is made up of a start-up cost $S_i$, a quadratic fuel cost (coefficients $a_i$ and $b_i$), and an idling cost $c_i$ which is incurred if the generator is on-line

$$A\sum_i\sum_j\left[S_i V_{dij}\left(1 - V_{di(j-1)}\right) + a_i\left(V_{dij} V_{cij}\right)^2 + b_i V_{dij} V_{cij} + c_i V_{dij}\right]. \qquad (14)$$

The load balance constraint must take account of the fact that generation must be provided for the load ($L_j$ at time $j$). The penalty term (15) is added to the cost function

$$B\sum_j\left(\sum_i V_{dij} V_{cij} - L_j\right)^2. \qquad (15)$$

Thus the overall cost function (Cost) is the sum of the terms (14) and (15). It is this function that must be minimized to yield an optimum operating strategy. It is apparent that this function contains both discrete and continuous terms. The previous forms of the Hopfield network cannot solve this problem without simplifying approximations which degrade possible solution quality. The proposed AHN permits an exact mapping of the cost function to the network energy function. Spare capacity, known as reserve, must be scheduled to ensure sufficient generation to meet power demand in case of a system emergency. Dedicated constraint neurons [6] are used to tackle this constraint. These neurons have the following input

$$U_{bij} = \sum_{k \neq i} V_{dkj} P_{\max,k} - L_j - R_j \qquad (16)$$

and their output is given by

$$V_{bij} = 0 \quad \text{if } U_{bij} \geq 0 \qquad (17)$$

$$V_{bij} = -K\,U_{bij} \quad \text{if } U_{bij} < 0 \qquad (18)$$

where the subscript $b$ is used to denote a constraint neuron, $P_{\max,k}$ is the maximum output possible from generator $k$, $R_j$ is the spinning reserve required at hour $j$, and $K$ is a positive constant. The output of constraint neuron $ij$ ($V_{bij}$) is added to the input of the discrete neuron $ij$ ($U_{dij}$). In this manner generator $i$ is encouraged to come on-line at hour $j$ if there is insufficient capacity available to supply the load and reserve without the contribution from generator $i$. The distribution of the reserve amongst the units will be decided by the dynamics of the system. The maximum $P_{\max,i}$ and minimum $P_{\min,i}$ generation limits of the $i$th generator are tackled by using the modified Sigmoid function (19), as defined in [4], for all continuous neuron transfer functions

$$V_{cij} = \left[\left(P_{\max,i} - P_{\min,i}\right)\cdot\tfrac{1}{2}\left(1 + \tanh(\lambda U_{cij})\right)\right] + P_{\min,i}. \qquad (19)$$
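The reserve handling of (16)–(18) and the generation-limit mapping (19) can be sketched for a toy three-generator, two-hour case. All numbers below are invented for illustration; they are not the 17-unit system of [5].

```python
import numpy as np

# Illustrative data (invented for this sketch)
Pmax = np.array([200.0, 150.0, 100.0])   # maximum output per generator, MW
Pmin = np.array([50.0, 40.0, 25.0])      # minimum output per generator, MW
L = np.array([180.0, 300.0])             # load L_j at each hour j
R = np.array([20.0, 30.0])               # spinning reserve R_j at each hour j
K = 0.1                                  # positive constant of (18)
lam = 1.0                                # sigmoid slope

Vd = np.array([[1.0, 1.0],               # commitment status V_dij (rows: units, cols: hours)
               [0.0, 1.0],
               [0.0, 0.0]])

def constraint_neuron(i, j):
    """Input (16) and output (17)-(18) of constraint neuron b_ij."""
    Ub = sum(Vd[k, j] * Pmax[k] for k in range(len(Pmax)) if k != i) - L[j] - R[j]
    Vb = 0.0 if Ub >= 0 else -K * Ub     # positive when the other units cannot cover load + reserve
    return Ub, Vb

def modified_sigmoid(i, Uc):
    """Modified Sigmoid transfer function (19): output confined to [Pmin_i, Pmax_i]."""
    return (Pmax[i] - Pmin[i]) * 0.5 * (1 + np.tanh(lam * Uc)) + Pmin[i]

# Hour 1, unit 2 off: the other units commit 350 MW >= 330 MW needed, so V_b = 0
print(constraint_neuron(2, 1))           # -> (20.0, 0.0)
# Hour 1, without unit 0 only 150 MW is committed against 330 MW needed, so V_b > 0
print(constraint_neuron(0, 1))
```

Adding $V_{bij}$ to $U_{dij}$ biases discrete neuron $ij$ toward the on-line state exactly when the capacity committed by the other units falls short of load plus reserve, which is the behavior described above.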

This function is monotonic increasing and so the convergence proof above remains valid.

IV. RESULTS

The energy function (5) has the same structure as the cost function of a typical quadratic mixed-integer problem. The generator scheduling problem is such a problem and the augmented Hopfield network described above has been applied to this problem. It was found that for the problem of scheduling the 17 generators, described in [5], over 24 h, feasible results within 0.05% of Lagrangian relaxation were consistently obtained. The duality gaps for Lagrangian relaxation were of the order of 0.1% to 0.15%. The proposed network occasionally found better solutions than Lagrangian relaxation. The network very rarely converged to infeasible solutions with the parameters used in these tests, even when the scheduling horizon was increased to 168 h. These improvements may be attributed to the more accurate representation of the problem by the augmented Hopfield network.

Fig. 1. Profile of energy after each iteration.

Fig. 1 shows the value of the energy function at each iteration for the scheduling problem described above. This energy function is monotonically decreasing.

Fig. 2. Energy change after each discrete neuron update.

Fig. 2 shows the energy change after each discrete neuron update. This clearly illustrates that the energy function is nonincreasing for discrete neuron updates.

V. DISCUSSION

In [1] it was observed that as the problem size increased the percentage of feasible solutions obtained decreased; for example, as the scheduling horizon increased from 15 h to 24 h, the percentage of feasible solutions obtained decreased from 80% to 55% for a ten-generator problem. This did not occur with the augmented network, where it has been observed that the percentage of feasible solutions obtained depends on parameter selection and not problem size.

It was observed in [1] that "convergence to interior points of the hypercube is one of the reasons why the coupled net may produce solutions which are not feasible." However, in the augmented network presented here, truly discrete neurons are used and this problem cannot arise. In [1] it was also observed that "competing penalty terms tend to frustrate each other and the result is that the final solution tends not to be feasible." One of the penalty terms used by Watta and Hassoun was a combinatorial constraint which is not required in the augmented network.

The augmented Hopfield network discussed here differs from that presented in [5] in that a revised update rule for discrete neurons is used. In [7], the unit commitment problem was also tackled using a neural network. However, a Boltzmann machine was used for the discrete variables. This means that a stochastic update rule was used, which does not guarantee a decreasing energy function but may avoid local minima. Results were given for scheduling 20 thermal units, but as no comparison with other methods was made it is difficult to compare those results with the ones presented here.

VI. CONCLUSION

This letter proposes an augmented Hopfield network similar to the coupled gradient network proposed by Watta and Hassoun [1]. This network allows mixed-integer problems to be represented more accurately. The temporal unit commitment problem is solved by this network and improved results are obtained.

ACKNOWLEDGMENT

The authors are very grateful to the reviewers for their very constructive comments. Credit is due to Dr. P. Curran, University College Dublin, for his invaluable insight and helpful discussions.

REFERENCES

[1] P. B. Watta and M. H. Hassoun, "A coupled gradient network approach for static and temporal mixed-integer optimization," IEEE Trans. Neural Networks, vol. 7, pp. 578–593, May 1996.
[2] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proc. Nat. Academy Sci. USA, vol. 79, pp. 2554–2558, 1982.
[3] H. Sasaki, M. Watanabe, J. Kubokawa, N. Yorino, and R. Yokoyama, "A solution method of unit commitment by artificial neural networks," IEEE Trans. Power Syst., vol. 7, pp. 974–981, Aug. 1992.
[4] J. H. Park, Y. S. Kim, I. K. Eom, and K. Y. Lee, "Economic load dispatch for piecewise quadratic cost function using Hopfield neural network," IEEE Trans. Power Syst., vol. 10, pp. 1559–1565, 1995.
[5] M. P. Walsh and M. J. O'Malley, "Augmented Hopfield network for unit commitment and economic dispatch," IEEE Trans. Power Syst., vol. 12, pp. 1765–1775, 1997.
[6] M. P. Kennedy and L. O. Chua, "Unifying the Tank and Hopfield linear programming circuit and the canonical nonlinear programming circuit of Chua and Lin," IEEE Trans. Circuits Syst., vol. CAS-34, pp. 210–214, Feb. 1987.
[7] Z. J. Liu, F. E. Villaseca, and F. Renovich, "Neural networks for generation scheduling in power systems," in Proc. Int. Joint Conf. Neural Networks, Baltimore, MD, 1992, pp. 233–238.
