#### Order generalised gradient and operator inequalities

Sever S. Dragomir — School of Engineering and Science, Victoria University, PO Box 14428, Melbourne, Victoria 8001, Australia
Eder Kikianty — Department of Pure and Applied Mathematics, University of Johannesburg, PO Box 524, Auckland Park, 2006, South Africa
Journal of Inequalities and Applications
Abstract We introduce the notion of order generalised gradient, a generalisation of the notion of subgradient, in the context of operator-valued functions. We state some operator inequalities of Hermite-Hadamard and Jensen types. We discuss the connection between the notion of order generalised gradient and the Gâteaux derivative of operator-valued functions. We state a characterisation of operator convexity via an inequality concerning the order generalised gradient.

MSC: 47A63; 46E40

© 2015 Dragomir and Kikianty; licensee Springer. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.

Keywords: subgradient inequality; operator convex function; operator inequality
1 Background
Convex functions play a crucial role in many fields of mathematics, most prominently in
optimisation theory. There are two important inequalities which characterise convex functions, namely Jensen's and Hermite-Hadamard's inequalities. Jensen's inequality states that, for a convex function f : I ⊂ R → R, a probability sequence (p_1, . . . , p_n) and points x_1, . . . , x_n ∈ I,

f( Σ_{i=1}^n p_i x_i ) ≤ Σ_{i=1}^n p_i f(x_i).

Hermite-Hadamard's inequality provides a refinement of Jensen's inequality, namely, for a convex function f : I ⊂ R → R,

f( (a + b)/2 ) ≤ (1/(b − a)) ∫_a^b f(x) dx ≤ (f(a) + f(b))/2

for any a, b ∈ I with a < b.
We refer the reader to Section 2 for further details regarding these inequalities.
Similarly to the case of real-valued functions, operator convexity can be characterised by operator inequalities. Hansen and Pedersen [1] characterise operator convexity via a non-commutative generalisation of Jensen's inequality. If f is a real continuous function on an interval I, and A(H) is the set of bounded self-adjoint operators on a Hilbert space H with spectra in I, then f is operator convex if and only if

f( Σ_{i=1}^n a_i* x_i a_i ) ≤ Σ_{i=1}^n a_i* f(x_i) a_i

for x_1, . . . , x_n ∈ A(H) and a_1, . . . , a_n ∈ B(H) with Σ_{i=1}^n a_i* a_i = 1. We refer the reader to Section 2 for further details regarding this characterisation.
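As a numerical illustration (not part of the original paper), the Hansen-Pedersen inequality can be sanity-checked for the operator convex function f(t) = t², using randomly generated self-adjoint matrices x_i and coefficient blocks a_i taken from a column isometry so that Σ a_i* a_i = I; all sizes and the choice of f are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 3, 4  # illustrative sizes: n summands, d x d real symmetric matrices

# random self-adjoint x_i, and coefficients a_i with sum_i a_i^T a_i = I
xs = [(m + m.T) / 2 for m in rng.standard_normal((n, d, d))]
q, _ = np.linalg.qr(rng.standard_normal((n * d, d)))   # (nd x d) isometry
a = [q[i * d:(i + 1) * d, :] for i in range(n)]        # stacked blocks of q

f = lambda m: m @ m  # f(t) = t^2 is operator convex

lhs = f(sum(ai.T @ xi @ ai for ai, xi in zip(a, xs)))
rhs = sum(ai.T @ f(xi) @ ai for ai, xi in zip(a, xs))
gap = rhs - lhs  # should be positive semidefinite
print(np.linalg.eigvalsh((gap + gap.T) / 2).min() >= -1e-8)
```

The smallest eigenvalue of the gap being (numerically) nonnegative confirms the operator order rhs ≥ lhs for this instance.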
One of the useful differential properties of convex functions is the fact that their one-sided directional derivatives exist universally [2]. Just as the ordinary two-sided directional derivatives of a differentiable function can be described in terms of gradient vectors, the one-sided directional derivatives can be described in terms of 'subgradient' vectors [2]. A vector x* is said to be a subgradient of a convex function f : K ⊂ Rⁿ → R at a point y ∈ K if

f(x) − f(y) ≥ x* · (x − y) for all x ∈ K.

This condition is referred to as the subgradient inequality [2]. If the gradient of f at y satisfies the subgradient inequality for every y ∈ K, then this characterises the convexity of f (cf. Eisenberg [3]).
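The finite-dimensional subgradient inequality can be illustrated with a small numerical check (not from the paper); the convex function f(x) = ‖x‖² with gradient ∇f(y) = 2y is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)
f = lambda v: float(v @ v)   # f(x) = ||x||^2 is convex on R^n
grad = lambda v: 2 * v       # gradient of f at v

# check f(x) - f(y) >= grad(y) . (x - y) on random pairs of points
ok = all(
    f(x) - f(y) >= float(grad(y) @ (x - y)) - 1e-12
    for x, y in (rng.standard_normal((2, 5)) for _ in range(100))
)
print(ok)
```

For this f the gap f(x) − f(y) − ∇f(y)·(x − y) equals ‖x − y‖², so the inequality holds with equality exactly when x = y.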
In this paper, we introduce the notion of order generalised gradient (cf. Section 3) for operator-valued functions, which generalises the subgradient inequality (without the assumption of convexity) to the setting of bounded self-adjoint operators on a Hilbert space. Furthermore, we state some inequalities of Hermite-Hadamard and Jensen types for the order generalised gradient in Section 4. Finally, in Section 5, we state the connection between the order generalised gradient and the Gâteaux derivative of operator-valued functions. We state a characterisation of convexity analogous to the subgradient characterisation in the context of operator-valued functions.
2 Inequalities for convex functions
This section serves as a point of reference for known results regarding some inequalities
related to convex functions (both real-valued and operator-valued functions).
2.1 Jensen’s inequality
Let C be a convex subset of the linear space X and f be a convex function on C. If p = (p_1, . . . , p_n) is a probability sequence and x = (x_1, . . . , x_n) ∈ Cⁿ, then

f( Σ_{i=1}^n p_i x_i ) ≤ Σ_{i=1}^n p_i f(x_i).

This inequality is referred to as Jensen's inequality. Recently, Dragomir [4] obtained the following refinement of Jensen's inequality:

f( Σ_{i=1}^n p_i x_i ) ≤ min_{k∈{1,...,n}} [ (1 − p_k) f( (1/(1 − p_k)) Σ_{j≠k} p_j x_j ) + p_k f(x_k) ] ≤ Σ_{i=1}^n p_i f(x_i),

where f, x_k and p_k are as defined above. For other refinements of Jensen's inequality, we refer the reader to Pečarić and Dragomir [5] and Dragomir [6].
The above result provides a different approach to the one that Pečarić and Dragomir [5] obtained, namely the chain of refinements

f( Σ_{i=1}^n p_i x_i ) ≤ Σ_{i_1,...,i_{k+1}=1}^n p_{i_1} · · · p_{i_{k+1}} f(q_1 x_{i_1} + · · · + q_{k+1} x_{i_{k+1}}) ≤ · · · ≤ Σ_{i=1}^n p_i f(x_i),

where 1 ≤ k ≤ n, (q_1, . . . , q_{k+1}) is a probability sequence, and p, x are as defined above.
For more refinements and applications related to the generalised triangle inequality, the arithmetic-geometric mean inequality, the f-divergence measures, Ky Fan's inequality, etc., we refer the reader to [4-9] and [10].
2.2 Hermite-Hadamard’s inequality
The following inequality also holds for any convex function f defined on R:

f( (a + b)/2 ) ≤ (1/(b − a)) ∫_a^b f(x) dx ≤ (f(a) + f(b))/2, a, b ∈ R, a < b.

It was first discovered by Hermite in 1883 in the journal Mathesis [11]. However, this result was not mentioned anywhere in the mathematical literature and was not widely known as Hermite's result [12].
Beckenbach, a leading expert on the history and the theory of convex functions, wrote that this inequality was proven by Hadamard in 1893 [13]. Later, Mitrinović found Hermite's note in Mathesis [11]. Since the inequality had become known as Hadamard's inequality, it is now commonly referred to as Hermite-Hadamard's inequality [12].
Hermite-Hadamard's inequality has been extended in many different directions. One of the extensions of this inequality is in the vector space setting. Firstly, we start with the following definitions and notation. Let X be a vector space and x, y be two distinct vectors in X. We define the segment generated by x and y to be the set

[x, y] := {(1 − t)x + ty, t ∈ [0, 1]}.

For any real-valued function f defined on the segment [x, y], there exists an associated function g_{x,y} : [0, 1] → R with

g_{x,y}(t) = f((1 − t)x + ty).

We remark that f is convex on [x, y] if and only if g_{x,y} is convex on [0, 1]. For any convex function defined on a segment [x, y] ⊂ X, we have the Hermite-Hadamard integral inequality (cf. Dragomir [14] and Dragomir [15]):

f( (x + y)/2 ) ≤ ∫_0^1 f((1 − t)x + ty) dt ≤ (f(x) + f(y))/2,

which can be derived from the classical Hermite-Hadamard inequality for the convex function g_{x,y} : [0, 1] → R. Considering the function f(x) = ‖x‖^p (x ∈ X and 1 ≤ p < ∞), which is convex on X, we have the following norm inequality (derived from the above) [10]:

‖(x + y)/2‖^p ≤ ∫_0^1 ‖(1 − t)x + ty‖^p dt ≤ (‖x‖^p + ‖y‖^p)/2

for any x, y ∈ X.
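The norm inequality above can be sanity-checked numerically; the vectors and the exponent p below are illustrative choices, and the integral is approximated by a simple average (this check is not part of the original paper):

```python
import numpy as np

# illustrative data: any x, y in R^n and p >= 1 should satisfy the inequality
x, y = np.array([3.0, -1.0, 2.0]), np.array([0.5, 4.0, -2.0])
p = 3

# Riemann-type average of ||(1 - t)x + ty||^p over t in [0, 1]
ts = np.linspace(0.0, 1.0, 10001)
integral = float(np.mean([np.linalg.norm((1 - t) * x + t * y) ** p for t in ts]))

lower = np.linalg.norm((x + y) / 2) ** p                       # left-hand side
upper = (np.linalg.norm(x) ** p + np.linalg.norm(y) ** p) / 2  # right-hand side
print(lower <= integral <= upper)
```

With these vectors the three quantities are well separated, so discretisation error does not affect the comparison.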
2.3 Non-commutative generalisation of Jensen’s inequality
Hansen [17] discussed Jensen's operator inequality for operator monotone functions. Motivated by Aujla's work [18] on the matrix convexity of functions of two variables, Hansen [19] characterised operator convex functions of two variables in terms of a non-commutative generalisation of Jensen's inequality (cf. [19]). A simplified proof of this result formulated for matrices is given in Aujla [20]. The case of several variables is given in Hansen [21]. The case of self-adjoint elements in the algebra M_n of n-square matrices is given in Hansen and Pedersen [22]. Finally, Hansen and Pedersen [1] presented a generalisation of the above results for self-adjoint operators defined on a Hilbert space.
2.4 Subgradient inequality
Recall the following definition of a subgradient [2]. A vector x* ∈ Rⁿ is said to be a subgradient of a convex function f : K ⊂ Rⁿ → R at a point y ∈ K if

f(x) − f(y) ≥ x* · (x − y) for all x ∈ K.

The following theorem is a useful characterisation of convexity (cf. Eisenberg [3]).
Theorem If U is a nonempty open subset of Rⁿ, f : U → R is a differentiable function on U, and K is a convex subset of U, then f is convex on K if and only if

f(x) − f(y) ≥ (x − y)ᵀ ∇f(y) for all x, y ∈ K,

where ∇f(y) denotes the gradient of f at y.
This theorem has been generalised and employed in obtaining optimality conditions of a non-differentiable minimax programming problem in complex spaces (cf. Lai and Liu [23]). Note that (x − y)ᵀ ∇f(y) can be written as ∇f(y) · (x − y), which can be interpreted as the directional derivative of f at the point y in the direction x − y.
3 Order generalised gradient
Throughout the paper, we use the following notation. We denote by B(H) the Banach
algebra of all bounded linear operators on the Hilbert space (H, ·, · ), and by A(H) the linear
subspace of all self-adjoint operators on H. We denote by P+(H) ⊂ A(H) the convex cone
of all positive definite operators defined on H, that is, P ∈ P+(H) if and only if Px, x ≥ ,
and for all x ∈ H, Px, x = implies x = . This gives a partial ordering (we refer to it as
the operator order) on A(H), where two elements A, B ∈ A(H) satisfy A ≤ B if and only if
B – A ∈ P+(H).
Definition Let C be a convex set in A(H). A function f : C → A(H) has the function
∇f : C × A(H) → A(H) as an order generalised gradient if
f (A) – f (B) ≥ ∇f (B, A – B) for any A, B ∈ C
in the operator order of A(H).
Remark We remark that if f is a real-valued differentiable function on an open set U ⊂ Rⁿ, and ∇f is the gradient of f, then the inequality in the definition above becomes the gradient inequality of Section 2. We also note that there is no assumption of convexity at this point. We discuss the convexity case in Section 5.
Proposition Let Q ∈ A(H) and define f : A(H) → A(H) by f(A) = QA²Q. Then

∇f(B, X) := Q(BX + XB)Q

is an order generalised gradient for f.
Proof Observe that BX + XB ∈ A(H) for B, X ∈ A(H), and if Q ∈ A(H) then Q(BX + XB)Q ∈ A(H). We need to prove that

f(A) − f(B) ≥ ∇f(B, A − B)

for any A, B ∈ A(H), that is,

QA²Q − QB²Q ≥ Q(B(A − B) + (A − B)B)Q;

hence this is equivalent to

Q(A² − AB − BA + B²)Q ≥ 0,

which is also equivalent to

Q(A − B)²Q ≥ 0,

which always holds. This completes the proof.
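The defining inequality for the map f(A) = QA²Q with candidate gradient ∇f(B, X) = Q(BX + XB)Q can be checked numerically; the random symmetric matrices below are illustrative choices (this check is not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
sym = lambda m: (m + m.T) / 2
A, B, Q = (sym(rng.standard_normal((4, 4))) for _ in range(3))

f = lambda M: Q @ M @ M @ Q                  # f(A) = Q A^2 Q
grad = lambda M, X: Q @ (M @ X + X @ M) @ Q  # candidate order generalised gradient

gap = f(A) - f(B) - grad(B, A - B)           # equals Q (A - B)^2 Q, hence PSD
print(np.linalg.eigvalsh(sym(gap)).min() >= -1e-8)
```

The gap is exactly Q(A − B)²Q, so its smallest eigenvalue is nonnegative up to floating-point error.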
Proposition Let P ∈ P+(H) and define f : A(H) → A(H) by f(A) = APA. Then f has

∇f(B, X) := XPB + BPX

as an order generalised gradient.
Proof The inequality f(A) − f(B) ≥ ∇f(B, A − B) reads

APA − BPB ≥ (A − B)PB + BP(A − B) = APB − BPB + BPA − BPB,

which is equivalent to

APA − APB − BPA + BPB = (A − B)P(A − B) ≥ 0,

and the latter always holds since P ∈ P+(H). This completes the proof.
Recall that P+(H) ⊂ A(H) is the convex cone of all positive definite operators defined on H, that is, P ∈ P+(H) if and only if ⟨Px, x⟩ ≥ 0 for all x ∈ H, and ⟨Px, x⟩ = 0 implies x = 0.
Proposition Let Q ∈ A(H) and f : P+(H) → A(H) be defined by

f(A) = QA⁻¹Q.

Then

∇f(B, X) = −QB⁻¹XB⁻¹Q

is an order generalised gradient for f.
Proof For B ∈ P+(H) we have B⁻¹ ∈ P+(H); then B⁻¹XB⁻¹ ∈ A(H) for any X ∈ A(H), and thus QB⁻¹XB⁻¹Q ∈ A(H), showing that ∇f(B, X) ∈ A(H). We need to prove that

f(A) − f(B) ≥ ∇f(B, A − B),

or equivalently

QA⁻¹Q − QB⁻¹Q ≥ −QB⁻¹(A − B)B⁻¹Q,

that is,

QA⁻¹(B − A)B⁻¹Q + QB⁻¹(A − B)B⁻¹Q = Q(A⁻¹ − B⁻¹)(B − A)B⁻¹Q ≥ 0.

Since A(A⁻¹ − B⁻¹) = I − AB⁻¹ = (B − A)B⁻¹, the left-hand side equals

Q(A⁻¹ − B⁻¹)A(A⁻¹ − B⁻¹)Q ≥ 0,

which always holds since (A⁻¹ − B⁻¹)A(A⁻¹ − B⁻¹) ≥ 0 and Q ∈ A(H). This completes the proof.
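The same kind of numerical sanity check (illustrative, not from the paper) applies to f(A) = QA⁻¹Q with candidate gradient ∇f(B, X) = −QB⁻¹XB⁻¹Q, using random symmetric positive definite A and B:

```python
import numpy as np

rng = np.random.default_rng(3)

def spd(d):  # random symmetric positive definite matrix
    m = rng.standard_normal((d, d))
    return m @ m.T + np.eye(d)

A, B = spd(4), spd(4)
m = rng.standard_normal((4, 4))
Q = (m + m.T) / 2
inv = np.linalg.inv

f = lambda M: Q @ inv(M) @ Q
grad = lambda M, X: -Q @ inv(M) @ X @ inv(M) @ Q  # candidate order generalised gradient

gap = f(A) - f(B) - grad(B, A - B)  # equals Q (A^-1 - B^-1) A (A^-1 - B^-1) Q
print(np.linalg.eigvalsh((gap + gap.T) / 2).min() >= -1e-8)
```

The gap factors as Q D A D Q with D = A⁻¹ − B⁻¹, which is positive semidefinite since A > 0.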
4 Inequalities involving order generalised gradients
We start this section with the following definition.
Definition An order generalised gradient ∇f : C × A(H) → A(H) is
(i) operator convex if

∇f(B, αX + βY) ≤ α∇f(B, X) + β∇f(B, Y)

for any B ∈ C, X, Y ∈ A(H) and α, β ≥ 0 with α + β = 1;
(ii) operator sub-additive if

∇f(B, X + Y) ≤ ∇f(B, X) + ∇f(B, Y)

for any B ∈ C and X, Y ∈ A(H);
(iii) positive homogeneous if

∇f(B, αX) = α∇f(B, X)

for any B ∈ C, X ∈ A(H) and α ≥ 0;
(iv) operator linear if

∇f(B, αX + βY) = α∇f(B, X) + β∇f(B, Y)

for any B ∈ C, X, Y ∈ A(H) and α, β ∈ R.
It can be seen that if ∇f(·, ·) is operator linear, then it is positive homogeneous and sub-additive. If ∇f(·, ·) is positive homogeneous and sub-additive, then it is operator convex.
4.1 Hermite-Hadamard type operator inequalities
Theorem Let f : C → A(H) be operator convex and ∇f : C × A(H) → A(H) be an order generalised gradient for f. Then, for any A, B ∈ C and t ∈ [0, 1], we have the inequalities

−t∇f(A, (1 − t)(B − A)) − (1 − t)∇f(B, −t(B − A)) ≥ tf(A) + (1 − t)f(B) − f(tA + (1 − t)B) ≥ t∇f(tA + (1 − t)B, −(1 − t)(B − A)) + (1 − t)∇f(tA + (1 − t)B, t(B − A)).

Proof From the definition of an order generalised gradient we have

f(B) − f(A) ≥ ∇f(A, B − A),

which is equivalent to

−∇f(A, B − A) ≥ f(A) − f(B).

Since C is a convex set, tA + (1 − t)B ∈ C, hence applying the two inequalities above to the pairs (A, tA + (1 − t)B) and (B, tA + (1 − t)B), we have

−∇f(A, (1 − t)(B − A)) ≥ f(A) − f(tA + (1 − t)B) ≥ ∇f(tA + (1 − t)B, −(1 − t)(B − A))

and

−∇f(B, −t(B − A)) ≥ f(B) − f(tA + (1 − t)B) ≥ ∇f(tA + (1 − t)B, t(B − A)).

Multiplying the first chain of inequalities by t ≥ 0, the second by 1 − t ≥ 0, and adding the results, we obtain the desired inequalities, which completes the proof.
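The two bounds of the theorem can be verified numerically for an illustrative instance (not from the paper): f(A) = A², whose order generalised gradient with Q = I is ∇f(B, X) = BX + XB.

```python
import numpy as np

rng = np.random.default_rng(4)
sym = lambda m: (m + m.T) / 2
A, B = sym(rng.standard_normal((4, 4))), sym(rng.standard_normal((4, 4)))

f = lambda M: M @ M                # operator convex f(A) = A^2
grad = lambda M, X: M @ X + X @ M  # its order generalised gradient (Q = I)

t = 0.3
M = t * A + (1 - t) * B
mid = t * f(A) + (1 - t) * f(B) - f(M)
upper = -t * grad(A, (1 - t) * (B - A)) - (1 - t) * grad(B, -t * (B - A))
lower = t * grad(M, -(1 - t) * (B - A)) + (1 - t) * grad(M, t * (B - A))

psd = lambda m: np.linalg.eigvalsh(sym(m)).min() >= -1e-8
print(psd(upper - mid) and psd(mid - lower))
```

For this f one computes mid = t(1 − t)(A − B)², lower = 0 and upper = 2t(1 − t)(B − A)², so both gaps are exactly positive semidefinite.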
Corollary Under the assumptions of the Theorem above:
(1) If ∇f(·, ·) is positive homogeneous, then we have

−t(1 − t)[∇f(B, A − B) + ∇f(A, B − A)] ≥ tf(A) + (1 − t)f(B) − f(tA + (1 − t)B) ≥ 0.

(2) If ∇f(·, ·) is operator linear, then

t(1 − t)[∇f(B, B − A) − ∇f(A, B − A)] ≥ tf(A) + (1 − t)f(B) − f(tA + (1 − t)B) ≥ 0.

Corollary Under the assumptions of the Theorem above, if ∇f is positive homogeneous, then we have the following inequality:

−(1/6)[∇f(B, A − B) + ∇f(A, B − A)] ≥ (f(A) + f(B))/2 − ∫_0^1 f(tA + (1 − t)B) dt ≥ 0.
Example
1. We consider the function f(A) = QA²Q with Q ∈ A(H). We note that the order generalised gradient

∇f(B, X) = Q(BX + XB)Q

is operator linear. Then

∇f(B, X) − ∇f(A, X) = Q((B − A)X + X(B − A))Q.

For X = B − A, we then get

∇f(B, B − A) − ∇f(A, B − A) = 2Q(B − A)²Q.

Since ∇f is operator linear, the corollary above gives

2t(1 − t)Q(B − A)²Q ≥ tf(A) + (1 − t)f(B) − f(tA + (1 − t)B) ≥ 0

for any A, B ∈ A(H) and Q ∈ A(H).
2. We consider the function f(A) = APA with P ∈ P+(H). We note that the order generalised gradient

∇f(B, X) = XPB + BPX

is operator linear. Then

∇f(B, X) − ∇f(A, X) = XP(B − A) + (B − A)PX.

For X = B − A, we then get

∇f(B, B − A) − ∇f(A, B − A) = 2(B − A)P(B − A).

Since ∇f is operator linear, the corollary above gives

2t(1 − t)(B − A)P(B − A) ≥ tf(A) + (1 − t)f(B) − f(tA + (1 − t)B) ≥ 0

for any A, B ∈ A(H) and P ∈ P+(H).
3. For f(A) = QA⁻¹Q with Q ∈ A(H) and A ∈ P+(H), we note that the order generalised gradient

∇f(B, X) = −QB⁻¹XB⁻¹Q

is operator linear. For X = B − A, we get

∇f(B, B − A) − ∇f(A, B − A) = −QB⁻¹(B − A)B⁻¹Q + QA⁻¹(B − A)A⁻¹Q = −Q(B⁻¹ − B⁻¹AB⁻¹)Q + Q(A⁻¹BA⁻¹ − A⁻¹)Q = Q(A⁻¹BA⁻¹ + B⁻¹AB⁻¹ − B⁻¹ − A⁻¹)Q.

Since ∇f is operator linear, the corollary above gives the inequality

t(1 − t)Q(A⁻¹BA⁻¹ + B⁻¹AB⁻¹ − B⁻¹ − A⁻¹)Q ≥ tf(A) + (1 − t)f(B) − f(tA + (1 − t)B) ≥ 0

for any A, B ∈ P+(H) and Q ∈ A(H).
4.2 Jensen type operator inequalities
In this subsection, we will state inequalities of Jensen type for order generalised gradients.
Theorem Let f : C ⊂ A(H) → A(H) be a function that possesses ∇f : C × A(H) → A(H) as an order generalised gradient. Then, for any A_i ∈ C, i ∈ {1, . . . , n} and p_i ≥ 0 with P_n := Σ_{i=1}^n p_i > 0, we have the inequalities

(1/P_n) Σ_{j=1}^n p_j f(A_j) − f( (1/P_n) Σ_{i=1}^n p_i A_i ) ≥ (1/P_n) Σ_{j=1}^n p_j ∇f( (1/P_n) Σ_{i=1}^n p_i A_i, A_j − (1/P_n) Σ_{i=1}^n p_i A_i ).

Proof From the definition of an order generalised gradient we have

−∇f(A, B − A) ≥ f(A) − f(B) ≥ ∇f(B, A − B).

Choosing A = A_j and B = (1/P_n) Σ_{i=1}^n p_i A_i, we get

f(A_j) − f( (1/P_n) Σ_{i=1}^n p_i A_i ) ≥ ∇f( (1/P_n) Σ_{i=1}^n p_i A_i, A_j − (1/P_n) Σ_{i=1}^n p_i A_i )

for any j ∈ {1, . . . , n}. Multiplying by p_j ≥ 0, taking the sum over j from 1 to n, and dividing by P_n > 0 yields the desired inequality.
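The Jensen type inequality can be sanity-checked numerically for the illustrative instance f(A) = A² with ∇f(B, X) = BX + XB (this check is not part of the paper; matrix sizes and weights are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
sym = lambda m: (m + m.T) / 2
As = [sym(m) for m in rng.standard_normal((5, 3, 3))]
ps = rng.random(5)
Pn = ps.sum()

f = lambda M: M @ M                # operator convex f(A) = A^2
grad = lambda M, X: M @ X + X @ M  # its order generalised gradient

Abar = sum(p * A for p, A in zip(ps, As)) / Pn
lhs = sum(p * f(A) for p, A in zip(ps, As)) / Pn - f(Abar)
rhs = sum(p * grad(Abar, A - Abar) for p, A in zip(ps, As)) / Pn

print(np.linalg.eigvalsh(sym(lhs - rhs)).min() >= -1e-8)
```

Here the right-hand side vanishes (Σ p_j (A_j − Ā) = 0 and ∇f is linear in X), and the left-hand side is the weighted variance (1/P_n) Σ p_j (A_j − Ā)², which is positive semidefinite.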
Theorem Under the assumptions of the previous theorem, for any A ∈ C we have the inequalities

−(1/P_n) Σ_{j=1}^n p_j ∇f(A_j, A − A_j) ≥ (1/P_n) Σ_{j=1}^n p_j f(A_j) − f(A) ≥ (1/P_n) Σ_{j=1}^n p_j ∇f(A, A_j − A).

Proof From the definition of an order generalised gradient we have

−∇f(A_j, A − A_j) ≥ f(A_j) − f(A) ≥ ∇f(A, A_j − A)

for any j ∈ {1, . . . , n}. We obtain the desired inequalities by multiplying the inequalities above by p_j ≥ 0, taking the sum over j from 1 to n, and dividing the resulting inequalities by P_n > 0.
Corollary Under the assumptions of the previous theorem, if ∇f is operator linear, then

(1/P_n) Σ_{j=1}^n p_j ∇f( A_j, A_j − (1/P_n) Σ_{i=1}^n p_i A_i ) ≥ (1/P_n) Σ_{j=1}^n p_j f(A_j) − f( (1/P_n) Σ_{i=1}^n p_i A_i ) ≥ 0.
Proof Choose A = (1/P_n) Σ_{i=1}^n p_i A_i in the previous theorem. By the operator linearity of ∇f in the second variable,

−(1/P_n) Σ_{j=1}^n p_j ∇f(A_j, A − A_j) = (1/P_n) [ Σ_{j=1}^n p_j ∇f(A_j, A_j) − Σ_{j=1}^n p_j ∇f(A_j, A) ]

and

(1/P_n) Σ_{j=1}^n p_j ∇f(A, A_j − A) = (1/P_n) ∇f( A, Σ_{j=1}^n p_j A_j − P_n A ) = 0

for this choice of A, which completes the proof.
Therefore, if A ∈ A(H) is such that

Σ_{j=1}^n p_j ∇f(A_j, A) = Σ_{j=1}^n p_j ∇f(A_j, A_j),

then we have the Slater type inequality (cf. Slater [24] and Pečarić [25])

f(A) ≥ (1/P_n) Σ_{j=1}^n p_j f(A_j).
5 Connection with Gâteaux derivatives
In this section, we consider the connection between order generalised gradients and Gâteaux derivatives. We refer the reader to Dragomir [26] for some inequalities of Jensen type involving Gâteaux derivatives of convex functions in linear spaces.
Let C ⊂ A(H) be a convex set. Then f : C → A(H) is said to be operator convex if for all t ∈ [0, 1] and A, B ∈ C, we have

f((1 − t)A + tB) ≤ (1 − t)f(A) + tf(B).
We start with the following lemmas.
Lemma Let F be a B(H)-valued function defined in a punctured neighbourhood of 0 such that the limits lim_{t→0±} F(t) exist in B(H). Then

⟨( lim_{t→0±} F(t) ) x, y⟩ = lim_{t→0±} ⟨F(t)x, y⟩

for all nonzero x, y ∈ H.
Proof We provide the proof for the right-sided limit, as the proof for the left-sided limit follows similarly. Write L := lim_{t→0+} F(t). Let ε > 0 and, for nonzero x, y ∈ H, set ε' = ε/(‖x‖_H ‖y‖_H). Since lim_{t→0+} F(t) = L, there exists δ > 0 such that

‖F(t) − L‖_{B(H)} < ε'

when 0 < t < δ. Note that L ∈ B(H) since B(H) is a Banach space, hence F(t) − L is also a bounded linear operator. Now, we have

|⟨F(t)x, y⟩ − ⟨Lx, y⟩| = |⟨(F(t) − L)x, y⟩| ≤ ‖F(t) − L‖_{B(H)} ‖x‖_H ‖y‖_H < ε' ‖x‖_H ‖y‖_H = ε,

which completes the proof.
Lemma Let f : A(H) → A(H) be operator convex and A, B ∈ A(H). Then the lateral Gâteaux derivatives

(∇−f(A))(B) = lim_{t→0−} [f(A + tB) − f(A)]/t and (∇+f(A))(B) = lim_{t→0+} [f(A + tB) − f(A)]/t

exist and are bounded self-adjoint operators.
Proof Fix an arbitrary B ∈ A(H), and let

G(t) := [f(A + tB) − f(A)]/t, t ≠ 0.

We want to show that G is nondecreasing in the operator order. Let 0 < t₁ < t₂. By the operator convexity of f,

[f(A + t₁B) − f(A)]/t₁ = [f((1 − t₁/t₂)A + (t₁/t₂)(A + t₂B)) − f(A)]/t₁ ≤ [(1 − t₁/t₂)f(A) + (t₁/t₂)f(A + t₂B) − f(A)]/t₁ = [f(A + t₂B) − f(A)]/t₂,

so G(t₁) ≤ G(t₂); the same argument applied to −B shows that G is nondecreasing on the negative half-line, since

[f(A − t₂B) − f(A)]/(−t₂) ≤ [f(A − t₁B) − f(A)]/(−t₁).

Note also that operator convexity gives

2f(A) ≤ f(A + tB) + f(A − tB),

which implies that

[f(A + tB) − f(A)]/t ≥ −[f(A − tB) − f(A)]/t = [f(A − tB) − f(A)]/(−t),

that is, G(−t) ≤ G(t) for t > 0. By the above expositions, we conclude that G is nondecreasing on R \ {0}; in particular, G is bounded below near 0+ and bounded above near 0−. This proves that both (∇−f(A))(B) and (∇+f(A))(B) exist and are bounded linear operators by the previous lemma. Moreover, [f(A + tB) − f(A)]/t is self-adjoint for every t ≠ 0, hence

lim_{t→0±} ⟨([f(A + tB) − f(A)]/t) x, y⟩ = lim_{t→0±} ⟨x, ([f(A + tB) − f(A)]/t) y⟩ = ⟨x, lim_{t→0±} ([f(A + tB) − f(A)]/t) y⟩,

showing that the limits are self-adjoint, which completes the proof.
Theorem Let f : C ⊂ A(H) → A(H) be operator convex. Then ∇f : C × A(H) → A(H) given by ∇f(B, A − B) := (∇+f(B))(A − B) is an order generalised gradient for f.
Proof For t ∈ (0, 1) we have

[f(B + t(A − B)) − f(B)]/t = [f((1 − t)B + tA) − f(B)]/t ≤ [(1 − t)f(B) + tf(A) − f(B)]/t = f(A) − f(B).

This is equivalent to

f(A) − f(B) − [f(B + t(A − B)) − f(B)]/t ≥ 0.

Note that for all x ∈ H,

⟨( f(A) − f(B) − [f(B + t(A − B)) − f(B)]/t ) x, x⟩ ≥ 0,

and by the first lemma, letting t → 0+,

⟨( f(A) − f(B) − (∇+f(B))(A − B) ) x, x⟩ ≥ 0,

which implies that

f(A) − f(B) ≥ (∇+f(B))(A − B)

for any A, B ∈ C, as required. The second lemma also gives us

(∇−f(B))(A − B) ≤ (∇+f(B))(A − B),

which implies that

(∇−f(B))(A − B) ≤ f(A) − f(B),

so the left lateral derivative defines an order generalised gradient as well.
Proposition Let f : A(H) → A(H) be operator convex and A ∈ A(H). The right Gâteaux derivative of f is sub-additive and the left Gâteaux derivative is super-additive in the second variable, i.e.

(∇+f(A))(B + C) ≤ (∇+f(A))(B) + (∇+f(A))(C)

and

(∇−f(A))(B + C) ≥ (∇−f(A))(B) + (∇−f(A))(C)

for any B, C ∈ A(H).
Proof For t > 0, by the operator convexity of f,

[f(A + t(B + C)) − f(A)]/t = [f((1/2)(A + 2tB) + (1/2)(A + 2tC)) − f(A)]/t ≤ [f(A + 2tB) − f(A)]/(2t) + [f(A + 2tC) − f(A)]/(2t).

By a similar argument to the proof of the Theorem above, we conclude that

(∇+f(A))(B + C) ≤ lim_{t→0+} [f(A + 2tB) − f(A)]/(2t) + lim_{t→0+} [f(A + 2tC) − f(A)]/(2t) = (∇+f(A))(B) + (∇+f(A))(C),

as desired. The proof for the left Gâteaux derivative of f follows similarly.
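The lateral Gâteaux derivatives above can be approximated by difference quotients. As an illustrative check (not from the paper), for f(A) = A² the quotient [f(A + tB) − f(A)]/t tends to the closed-form derivative AB + BA:

```python
import numpy as np

rng = np.random.default_rng(6)
sym = lambda m: (m + m.T) / 2
A, B = sym(rng.standard_normal((3, 3))), sym(rng.standard_normal((3, 3)))

f = lambda M: M @ M
t = 1e-7
fd = (f(A + t * B) - f(A)) / t  # finite-difference Gateaux quotient
exact = A @ B + B @ A           # closed-form Gateaux derivative of A -> A^2

print(np.max(np.abs(fd - exact)) < 1e-5)
```

The residual is exactly tB² plus floating-point noise, so it shrinks linearly as t → 0+.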
Remark We remark that the Gâteaux lateral derivatives are always positive homogeneous with respect to the second variable, i.e. for any function f : A(H) → A(H) and fixed A ∈ A(H),

(∇±f(A))(αB) = α(∇±f(A))(B)

for all α ≥ 0 and B ∈ A(H), whenever the derivatives exist. The (two-sided) Gâteaux derivative, on the other hand, is always homogeneous with respect to the second variable, i.e. for any function f : A(H) → A(H) and fixed A ∈ A(H),

(∇f(A))(αB) = α(∇f(A))(B)

for all α ∈ R and B ∈ A(H), whenever the derivative exists.
The following result restates Eisenberg's theorem of Section 2 in the setting of operator-valued functions.
Corollary Let C ⊂ A(H) be a convex set and f : C → A(H) be a Gâteaux differentiable function. Then f is operator convex if and only if ∇f defined by

(∇f(A))(B) = lim_{t→0} [f(A + tB) − f(A)]/t

is an order generalised gradient for f.
Proof If f is operator convex, then by the Theorem above the lateral derivatives

(∇±f(A))(B) = lim_{t→0±} [f(A + tB) − f(A)]/t

are order generalised gradients. Since f is assumed to be Gâteaux differentiable, both limits are equal, hence ∇f is an order generalised gradient. Conversely, suppose that ∇f is an order generalised gradient, so that we have the following inequality:

(∇f(B))(A − B) ≤ f(A) − f(B)

for any A, B ∈ C. Let C, D ∈ C and t ∈ (0, 1). Choosing A = C and B = tC + (1 − t)D, and then A = D and B = tC + (1 − t)D, and using the homogeneity of the Gâteaux derivative in the second variable, we have

(1 − t)(∇f(tC + (1 − t)D))(C − D) ≤ f(C) − f(tC + (1 − t)D),
−t(∇f(tC + (1 − t)D))(C − D) ≤ f(D) − f(tC + (1 − t)D).

Multiply the first inequality by t and the second by (1 − t), and add the resulting inequalities to obtain

f(tC + (1 − t)D) ≤ tf(C) + (1 − t)f(D),

which completes the proof.
The following result follows by the Corollary above, employing the fact that the Gâteaux lateral derivatives are positive homogeneous.
Corollary (Hermite-Hadamard type inequality) Let f : C ⊂ A(H) → A(H) be operator convex. The following inequality holds:

−(1/6)[(∇±f(B))(A − B) + (∇±f(A))(B − A)] ≥ (f(A) + f(B))/2 − ∫_0^1 f(tA + (1 − t)B) dt ≥ 0.

Example For the operator convex function f(A) = −log(A), A ∈ P+(H), the Gâteaux derivative of log has the integral representation

(∇log(A))(B) = ∫_0^∞ (sI + A)⁻¹ B (sI + A)⁻¹ ds.

Applying the corollary, we obtain

(1/6)[ ∫_0^∞ (sI + B)⁻¹(A − B)(sI + B)⁻¹ ds + ∫_0^∞ (sI + A)⁻¹(B − A)(sI + A)⁻¹ ds ] ≥ ∫_0^1 log(tA + (1 − t)B) dt − (log(A) + log(B))/2 ≥ 0.

For the operator convex function f(A) = A log(A), A ∈ P+(H), we have, in particular,

(A log(A) + B log(B))/2 − ∫_0^1 (tA + (1 − t)B) log(tA + (1 − t)B) dt ≥ 0.
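The integral representation of the Gâteaux derivative of log can be verified numerically (an illustrative check, not from the paper): a finite-difference quotient of the matrix logarithm is compared with a quadrature of ∫_0^∞ (sI + A)⁻¹ B (sI + A)⁻¹ ds under the substitution s = u/(1 − u).

```python
import numpy as np

rng = np.random.default_rng(7)
m = rng.standard_normal((3, 3))
A = m @ m.T + np.eye(3)  # symmetric positive definite
b = rng.standard_normal((3, 3))
B = (b + b.T) / 2

def logm(M):  # matrix logarithm of a symmetric positive definite matrix
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.log(w)) @ V.T

t = 1e-6
fd = (logm(A + t * B) - logm(A)) / t  # Gateaux quotient of log at A

# midpoint quadrature of the integral, with s = u/(1-u), ds = du/(1-u)^2
N = 20000
integral = np.zeros_like(A)
for u in (np.arange(N) + 0.5) / N:
    s, w = u / (1 - u), 1 / ((1 - u) ** 2 * N)
    R = np.linalg.inv(s * np.eye(3) + A)
    integral += w * (R @ B @ R)

print(np.max(np.abs(fd - integral)) < 1e-3)
```

After the substitution the integrand is bounded and smooth on [0, 1], so the midpoint rule converges; the finite-difference and quadrature errors are both far below the tolerance used.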
The following results follow by the theorems of Section 4.2.
Corollary (Jensen type inequality) Let f : C ⊂ A(H) → A(H) be operator convex. Then, for any A_i ∈ C, i ∈ {1, . . . , n} and p_i ≥ 0 with P_n := Σ_{i=1}^n p_i > 0, we have the inequalities

−(1/P_n) Σ_{j=1}^n p_j (∇±f(A_j))( (1/P_n) Σ_{i=1}^n p_i A_i − A_j ) ≥ (1/P_n) Σ_{j=1}^n p_j f(A_j) − f( (1/P_n) Σ_{i=1}^n p_i A_i ) ≥ (1/P_n) Σ_{j=1}^n p_j (∇±f( (1/P_n) Σ_{i=1}^n p_i A_i ))( A_j − (1/P_n) Σ_{i=1}^n p_i A_i ).

We also have

−(1/P_n) Σ_{j=1}^n p_j (∇±f(A_j))(A − A_j) ≥ (1/P_n) Σ_{j=1}^n p_j f(A_j) − f(A) ≥ (1/P_n) Σ_{j=1}^n p_j (∇±f(A))(A_j − A)

for any A ∈ C.
Example Applying these inequalities to the operator convex function f(A) = −log(A), A ∈ P+(H), together with the integral representation of ∇log above, we obtain

(1/P_n) Σ_{j=1}^n p_j ∫_0^∞ (sI + A_j)⁻¹( (1/P_n) Σ_{i=1}^n p_i A_i − A_j )(sI + A_j)⁻¹ ds ≥ log( (1/P_n) Σ_{i=1}^n p_i A_i ) − (1/P_n) Σ_{j=1}^n p_j log(A_j) ≥ 0,

and, for any A ∈ P+(H),

(1/P_n) Σ_{j=1}^n p_j ∫_0^∞ (sI + A_j)⁻¹(A − A_j)(sI + A_j)⁻¹ ds ≥ log(A) − (1/P_n) Σ_{j=1}^n p_j log(A_j) ≥ −(1/P_n) Σ_{j=1}^n p_j ∫_0^∞ (sI + A)⁻¹(A_j − A)(sI + A)⁻¹ ds.

Analogous inequalities, involving the quantities A_j log(A_j) and A log(A), hold for the operator convex function f(A) = A log(A), A ∈ P+(H).
6 Conclusions
We introduced the notion of order generalised gradient of a function f : C ⊂ A(H) → A(H): a function ∇f : C × A(H) → A(H) satisfying

f(A) − f(B) ≥ ∇f(B, A − B) for any A, B ∈ C

in the operator order of A(H). We have the following operator inequalities.
(1) Operator inequalities of Hermite-Hadamard type: if ∇f is positive homogeneous, then

−(1/6)[∇f(B, A − B) + ∇f(A, B − A)] ≥ (f(A) + f(B))/2 − ∫_0^1 f(tA + (1 − t)B) dt ≥ 0 for any A, B ∈ C.

(2) Operator inequalities of Jensen type:

(1/P_n) Σ_{j=1}^n p_j f(A_j) − f( (1/P_n) Σ_{i=1}^n p_i A_i ) ≥ (1/P_n) Σ_{j=1}^n p_j ∇f( (1/P_n) Σ_{i=1}^n p_i A_i, A_j − (1/P_n) Σ_{i=1}^n p_i A_i )

and

−(1/P_n) Σ_{j=1}^n p_j ∇f(A_j, A − A_j) ≥ (1/P_n) Σ_{j=1}^n p_j f(A_j) − f(A) ≥ (1/P_n) Σ_{j=1}^n p_j ∇f(A, A_j − A)

for any A ∈ C, A_i ∈ C, i ∈ {1, . . . , n} and p_i ≥ 0 with P_n := Σ_{i=1}^n p_i > 0.
(3) Operator inequalities of Slater type: if ∇f is linear and A ∈ A(H) is such that

Σ_{j=1}^n p_j ∇f(A_j, A) = Σ_{j=1}^n p_j ∇f(A_j, A_j),

then

f(A) ≥ (1/P_n) Σ_{j=1}^n p_j f(A_j).

Order generalised gradients extend the notion of subgradients, without the assumption of convexity, to operator-valued functions. This notion is also connected to the Gâteaux derivative: if f : C → A(H) is operator convex, then

∇f(B, A − B) := lim_{t→0±} [f(B + t(A − B)) − f(B)]/t, A, B ∈ C,

is an order generalised gradient. Furthermore, if f : C → A(H) is a Gâteaux differentiable function, then f is operator convex if and only if

∇f(B, A − B) := lim_{t→0} [f(B + t(A − B)) − f(B)]/t, A, B ∈ C,

is an order generalised gradient. This characterisation of convexity is a generalised version of Eisenberg's theorem stated in Section 2 (cf. Eisenberg [3]).
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
SSD and EK contributed equally in all stages of writing the paper. All authors read and approved the final manuscript.
Acknowledgements
The research of E Kikianty is supported by the Claude Leon Foundation. The authors would like to thank the anonymous
referees for valuable suggestions that have been incorporated in the final version of the manuscript.
References
1. Hansen, F, Pedersen, GK: Jensen's operator inequality. Bull. Lond. Math. Soc. 35(4), 553-564 (2003)
2. Rockafellar, RT: Convex Analysis. Princeton Landmarks in Mathematics. Reprint of the 1970 original. Princeton University Press, Princeton (1997)
3. Eisenberg, E: A gradient inequality for a class of nondifferentiable functions. Oper. Res. 14, 157-163 (1966)
4. Dragomir, SS: A refinement of Jensen's inequality with applications for f-divergence measures. Taiwan. J. Math. 14(1), 153-164 (2010)
5. Pečarić, JE, Dragomir, SS: A refinement of Jensen inequality and applications. Stud. Univ. Babeş-Bolyai, Math. 24(1), 15-19 (1989)
6. Dragomir, SS: A further improvement of Jensen's inequality. Tamkang J. Math. 25(1), 29-36 (1994)
7. Dragomir, SS: An improvement of Jensen's inequality. Bull. Math. Soc. Sci. Math. Roum. 34(82)(4), 291-296 (1990)
8. Dragomir, SS: Some refinements of Ky Fan's inequality. J. Math. Anal. Appl. 163(2), 317-321 (1992)
9. Dragomir, SS: A new improvement of Jensen's inequality. Indian J. Pure Appl. Math. 26(10), 959-968 (1995)
10. Dragomir, SS: Semi-Inner Products and Applications. Nova Science Publishers, New York (2004)
11. Mitrinović, DS, Lacković, IB: Hermite and convexity. Aequ. Math. 28, 229-232 (1985)
12. Pečarić, JE, Proschan, F, Tong, YL: Convex Functions, Partial Orderings, and Statistical Applications. Academic Press, San Diego (1992)
13. Beckenbach, EF: Convex functions. Bull. Am. Math. Soc. 54, 439-460 (1948)
14. Dragomir, SS: An inequality improving the first Hermite-Hadamard inequality for convex functions defined on linear spaces and applications for semi-inner products. J. Inequal. Pure Appl. Math. 3(2), Article ID 31 (2002)
15. Dragomir, SS: An inequality improving the second Hermite-Hadamard inequality for convex functions defined on linear spaces and applications for semi-inner products. J. Inequal. Pure Appl. Math. 3(3), Article ID 35 (2002)
16. Pečarić, JE, Dragomir, SS: A generalisation of Hadamard's inequality for isotonic linear functional. Rad. Mat. 7, 103-107 (1991)
17. Hansen, F: An operator inequality. Math. Ann. 246(3), 249-250 (1979/80)
18. Aujla, JS: Matrix convexity of functions of two variables. Linear Algebra Appl. 194, 149-160 (1993)
19. Hansen, F: Jensen's operator inequality for functions of two variables. Proc. Am. Math. Soc. 125(7), 2093-2102 (1997)
20. Aujla, JS: On an operator inequality. Linear Algebra Appl. 310(1-3), 3-47 (2000)
21. Hansen, F: Operator convex functions of several variables. Publ. Res. Inst. Math. Sci. 33(3), 443-463 (1997)
22. Hansen, F, Pedersen, GK: Jensen's inequality for operators and Löwner's theorem. Math. Ann. 258(3), 229-241 (1981/82)
23. Lai, H-C, Liu, J-C: Duality for nondifferentiable minimax programming in complex spaces. Nonlinear Anal. 71(12), e224-e233 (2009)
24. Slater, ML: A companion inequality to Jensen's inequality. J. Approx. Theory 32(2), 160-166 (1981)
25. Pečarić, JE: A multidimensional generalization of Slater's inequality. J. Approx. Theory 44(3), 292-294 (1985)
26. Dragomir, SS: Inequalities in terms of the Gâteaux derivatives for convex functions on linear spaces with applications. Bull. Aust. Math. Soc. 83(3), 500-517 (2011)
27. Pedersen, GK: Operator differentiable functions. Publ. Res. Inst. Math. Sci. 36(1), 139-157 (2000)