STOCHASTIC GENERALIZED-DIFFERENTIABLE FUNCTIONS IN THE PROBLEM OF NONCONVEX NONSMOOTH STOCHASTIC OPTIMIZATION

V. I. Norkin

UDC 517.987.1

In this paper we study the problem of nonconvex nonsmooth stochastic programming. The problem of convex stochastic programming is considered in [1-5]; some generalizations to the nonconvex case appear in [6, 7]. In the present paper we discuss the possibility of applying the method of stochastic generalized gradients to stochastic programming problems with so-called generalized-differentiable functions [8]. The class of generalized-differentiable functions contains the continuously differentiable, convex and concave, weakly convex and weakly concave [7], and semismooth [9] functions. This class is closed with respect to the finite operations of maximum, minimum, and composition, and, as shown in this paper, with respect to taking the expectation. For generalized-differentiable functions it turns out to be possible to introduce the concept of a pseudogradient (of a pseudogradient map). In many respects pseudogradients are analogous to, for example, subgradients of convex functions; in particular, they can be used in methods of minimization of generalized-differentiable functions, where convergence, of course, is possible only to pseudostationary points [10]. Pseudogradients of generalized-differentiable functions are defined nonuniquely, and there exists a whole class of pseudogradient maps for one function. As pseudogradients one can take, for example, generalized gradients in the sense of Clarke [11], an inclusion-minimal pseudogradient map. For computing pseudogradients of compound functions and maximum functions in terms of pseudogradients of the constituent functions, there are formulas (a calculus) analogous to the formulas of mathematical analysis and convex analysis. It is shown in the paper that the expectation of a stochastic generalized-differentiable function is itself a generalized-differentiable function.
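The closure of the class under the maximum operation admits a simple computational illustration. The sketch below is our own example, not from the paper: for f = max(f1, f2) with smooth f1, f2, the calculus mentioned above allows taking as a pseudogradient at x the gradient of an active function, or any convex combination of such gradients where both are active.

```python
import numpy as np

# Illustrative example (ours, not the paper's): a pseudogradient of
# f(x) = max(f1(x), f2(x)) for smooth f1, f2.

def f1(x):
    return x[0] ** 2 + x[1] ** 2            # smooth

def grad_f1(x):
    return np.array([2 * x[0], 2 * x[1]])

def f2(x):
    return x[0] + x[1]                       # smooth (linear)

def grad_f2(x):
    return np.array([1.0, 1.0])

def pseudogradient_of_max(x, tol=1e-12):
    """Return one admissible pseudogradient of max(f1, f2): the gradient
    of the active function; at a tie, any convex combination works, and
    we pick the midpoint."""
    v1, v2 = f1(x), f2(x)
    if v1 > v2 + tol:
        return grad_f1(x)
    if v2 > v1 + tol:
        return grad_f2(x)
    return 0.5 * (grad_f1(x) + grad_f2(x))   # tie: one convex combination

x = np.array([2.0, 0.0])        # f1(x) = 4 > f2(x) = 2, so f1 is active
print(pseudogradient_of_max(x)) # -> [4. 0.]
```

At the tie point x = (1, 0), where f1(x) = f2(x) = 1, the sketch returns the midpoint (1.5, 0.5) of the two gradients, one of the infinitely many admissible choices.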
In addition, the expectation of a pseudogradient map (as the integral of a multivalued map) of the stochastic function standing under the expectation sign is itself a pseudogradient map of the expectation function. In this respect generalized-differentiable functions are analogous to convex functions, so it turns out to be possible to extend the method of stochastic generalized gradients [1] to stochastic programming problems with generalized-differentiable functions. It is shown that sections, measurable in the collection of deterministic and stochastic variables, of a pseudogradient map of the function standing under the expectation sign play an important role here. In the paper we investigate questions of the existence and construction of such sections; we make essential use of the notation, facts, and results of [12], and also of results from the theory of multivalued maps [13-16].

Stochastic Generalized-Differentiable Functions

Definition 1 [8, 10]. The function f: R^n → R^1 is said to be generalized-differentiable at the point x if in some neighborhood of x there is defined a multivalued map G_f: R^n → 2^{R^n}, upper semicontinuous at x, such that the sets G_f(y) are nonempty, bounded, convex, and closed, and at the point x one has

f(y) = f(x) + (g, y − x) + o(x, y, g),     (1)

where g ∈ G_f(y), (g, y − x) denotes the scalar product, and o(x, y^k, g^k)/‖y^k − x‖ → 0 for any sequences y^k → x and g^k ∈ G_f(y^k) [the last condition is equivalent to o(x, y, g)/‖y − x‖ → 0 uniformly with respect to y → x and g ∈ G_f(y)]. A function is said to be generalized-differentiable in a domain if it is generalized-differentiable at each point of the domain. The vectors g ∈ G_f(y) are said to be pseudogradients (or generalized gradients) of the function f at the point y.

LEMMA 1. A generalized-differentiable function f(x) is locally Lipschitz, and its Clarke subdifferential ∂f(x) [11] is an inclusion-minimal pseudogradient map for f(x).

Translated from Kibernetika, No. 6, pp. 98-102, November-December, 1986. Original article submitted November 11, 1983.

© 1987 Plenum Publishing Corporation

Proof. By the generalized mean value theorem [8], for x, y ∈ K,

|f(y) − f(x)| = |(g, y − x)| ≤ L_K ‖y − x‖,

where g ∈ G_f(x + λ(y − x)), 0 ≤ λ ≤ 1, and L_K = sup{‖g‖ : g ∈ G_f(z), z ∈ co K}. If K is compact, then the convex hull co K is compact and L_K < +∞, by the boundedness on a compact set of a compact-valued upper-semicontinuous map. This proves that f(x) is locally Lipschitz.
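The Lipschitz bound just derived can be sanity-checked numerically on a simple generalized-differentiable function. The example below is ours, not the paper's: it takes f(x) = |x| on K = [−1, 1], where all pseudogradients lie in [−1, 1], so L_K = sup ‖g‖ = 1, and verifies |f(y) − f(x)| ≤ L_K |y − x| at random pairs of points of K.

```python
import random

# Numerical check of the Lipschitz estimate |f(y) - f(x)| <= L_K * |y - x|
# for the assumed example f(x) = |x| on K = [-1, 1], where L_K = 1.

f = abs
L_K = 1.0

random.seed(0)
ok = all(
    abs(f(y) - f(x)) <= L_K * abs(y - x) + 1e-12
    for x, y in ((random.uniform(-1, 1), random.uniform(-1, 1))
                 for _ in range(1000))
)
print(ok)  # -> True
```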

By the definition of ∂f(x) [11], for any g ∈ ∂f(x) and d ∈ R^n,

(g, d) ≤ lim sup_{y → x, λ ↓ 0} [f(y + λd) − f(y)]/λ.
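This inequality, too, can be probed numerically. The sketch below is an illustration of ours, not from the paper: it crudely estimates the Clarke directional derivative of f(x) = |x| at x = 0 by maximizing difference quotients over small y and λ, then checks that (g, d) does not exceed the estimate for one element g of ∂f(0) = [−1, 1].

```python
# Crude numerical illustration (our example) of the inequality
# (g, d) <= lim sup_{y -> x, t ↓ 0} [f(y + t*d) - f(y)] / t
# for f(x) = |x| at x = 0, where ∂f(0) = [-1, 1].

def f(x):
    return abs(x)

def clarke_dd_estimate(x, d, n=200):
    """Rough estimate of the lim sup by probing small step sizes t
    and base points y near x."""
    best = float("-inf")
    for i in range(1, n + 1):
        t = 1.0 / (10 * i)
        for y in (-t, 0.0, t):          # probe points near x
            best = max(best, (f(x + y + t * d) - f(x + y)) / t)
    return best

est = clarke_dd_estimate(0.0, 1.0)      # should be close to 1 for d = 1
g = 0.5                                 # one element of ∂f(0) = [-1, 1]
print(g * 1.0 <= est + 1e-9)            # -> True
```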
