
I want to find $p$ which maximizes the given functional. $p$ is a function of the form $\mathbb{R}^2 \to \mathbb{R}$. $\Omega$ is a region in the 2-d plane.

$\underset{p}{\sup} \int_\Omega \{ \lambda(\vec{\nabla}\cdot \vec{p}) - \alpha(|p| - C)\}\, dx$

The authors of the paper "A Study on Continuous Max-Flow and Min-Cut Approaches" state that the following is an equivalent formulation:

$\underset{|p| \le C}{\sup} \int_\Omega \lambda \vec{\nabla}\cdot \vec{p}\, dx$

The authors further claim that it is a well known result that the above is equal to

$\int_\Omega C|\nabla\lambda|\, dx$

And hence it is the desired answer.

I was wondering if I can get the above answer by some more intuitive approach, or if someone could please explain to me what the authors are trying to say.

  • Which author of which paper is claiming all that? – Fabian Jun 04 '11 at 05:56
  • The second equivalence (which the author suggests is well known) is in the paper " A study on continuous max-flow and min-cut approaches" by Boykov et al. The first equivalence is my understanding of the paper and not necessarily what the author is claiming – AnkurVijay Jun 04 '11 at 06:02
  • Did you try partial-integration? – Fabian Jun 04 '11 at 06:10
  • The first equivalence is true when $\alpha(x) = \infty$ for $x \leq 0$, $0$ otherwise. It's a function that's often used in convex optimization, I don't remember the name for it. – trutheality Jun 04 '11 at 06:54
  • @Fabian: The more common term is "integration by parts". I'm German, I used to think it was called "partial integration", but apparently to native English speakers that sounds like you didn't finish the integration :-) (In German it's "partielle Integration".) – joriki Jun 04 '11 at 16:28
  • @joriki: I find both (http://mathworld.wolfram.com/PartialIntegration.html) but it seems integration by parts is much more common... – Fabian Jun 04 '11 at 17:11

1 Answer


The author's assertion $$\underset{|p| \le C}{\sup} \int_\Omega \lambda \vec{\nabla}\cdot \vec{p}\, dx = \int_\Omega C|\nabla\lambda|\, dx \tag1$$ is true as long as $\lambda$ is sufficiently regular; being in the Sobolev class $W^{1,1}(\Omega)$ is enough. (Being Lipschitz, or having bounded continuous gradient, is more than enough.)

Inequality in one direction follows from integration by parts: $$ \int_\Omega \lambda \vec{\nabla}\cdot \vec{p}\, dx = - \int_\Omega \nabla \lambda \cdot \vec{p}\, dx \tag2$$ and slapping $|\cdot|$ on the right: by Cauchy–Schwarz, $|\nabla\lambda \cdot \vec p| \le |\nabla\lambda|\,|\vec p| \le C|\nabla\lambda|$. In the opposite direction, we must produce $\vec p$ with $|\vec p|\le C$ for which $-\nabla\lambda \cdot \vec{p}$ is equal, or very close, to $C|\nabla\lambda|$. Formally, $\vec p=-C\nabla \lambda/|\nabla \lambda|$ should work, but some approximation may be necessary to make $\vec p$ smooth. Details can be found in the book by Giusti cited as [20] in the paper (Minimal surfaces and functions of bounded variation). There's also a Wikipedia article on functions of bounded variation.
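Here is a quick numerical sanity check of this argument (my own sketch, not from the paper): take a smooth $\lambda$ that decays to zero near the boundary, build the formal maximizer $\vec p=-C\nabla\lambda/|\nabla\lambda|$ with a small regularization `eps` where the gradient vanishes, and compare $\int_\Omega \lambda\,\vec\nabla\cdot\vec p\,dx$ with $\int_\Omega C|\nabla\lambda|\,dx$ on a grid. The choice of $\lambda$, the grid size, and `eps` are all mine.

```python
import numpy as np

# Sanity check of (1): with p = -C grad(lam)/|grad(lam)| (regularized),
# integral of lam * div(p) should approximately equal integral of C*|grad(lam)|.
n, C, eps = 400, 1.0, 1e-6
x = np.linspace(-1.0, 1.0, n)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")
lam = np.exp(-(X**2 + Y**2) / 0.1)          # smooth bump, ~0 at the boundary

gx, gy = np.gradient(lam, dx, dx)           # grad(lambda)
mag = np.sqrt(gx**2 + gy**2)
px = -C * gx / (mag + eps)                  # |p| <= C everywhere
py = -C * gy / (mag + eps)

div_p = np.gradient(px, dx, axis=0) + np.gradient(py, dx, axis=1)
lhs = np.sum(lam * div_p) * dx * dx         # integral of lam * div(p)
rhs = np.sum(C * mag) * dx * dx             # integral of C * |grad(lam)|
```

On this grid `lhs` and `rhs` agree to within a few percent; the remaining gap comes from the finite-difference divergence near the point where $\nabla\lambda=0$.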

The supremum on the left of (1) is the standard definition of the total variation of $\lambda$; it generalizes the one-dimensional concept of bounded variation. See the question "Two definitions of 'Bounded Variation Function'".
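The one-dimensional discrete analogue makes the duality transparent (again my own illustration): after summation by parts, the supremum over $|p_i|\le 1$ is attained at $p_i = -\operatorname{sign}(\lambda_{i+1}-\lambda_i)$ and equals $\sum_i|\lambda_{i+1}-\lambda_i|$ exactly.

```python
import numpy as np

# Discrete 1-D analogue: TV(lam) = sum |lam[i+1]-lam[i]| equals the
# "dual" sup over |p_i| <= 1 of -sum (lam[i+1]-lam[i]) * p_i,
# attained at p_i = -sign(lam[i+1]-lam[i]).
rng = np.random.default_rng(0)
lam = rng.standard_normal(50)       # arbitrary samples
d = np.diff(lam)                    # discrete gradient
tv = np.abs(d).sum()                # total variation
p_opt = -np.sign(d)                 # maximizing dual field, |p_opt| <= 1
dual = -(d * p_opt).sum()           # equals tv exactly
```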

The problem $$\underset{p}{\sup} \int_\Omega \{ \lambda(\vec{\nabla}\cdot \vec{p}) - \alpha(|p| - C)\}\, dx \tag3$$ (with appropriate $\alpha$) is a relaxation of the problem in (1). Typically, $\alpha=0$ when the argument is nonpositive, and grows rapidly for positive argument. This means that you are not prevented from taking $|p|>C$, but you'll pay a fine if you do. Relaxation is a common way to approximate constrained optimization problems with unconstrained ones. However, the relaxed problem is not "equivalent" to the original one; it is merely an approximation, which may be good or bad depending on circumstances.
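A scalar toy problem (my own, with hypothetical values of $a$, $C$, and the penalty slope $\beta$) shows the fine at work: the constrained problem $\sup_{|p|\le C} a\,p$ has value $C|a|$, and the relaxed problem with $\alpha(t)=\beta\max(t,0)$ recovers exactly that value once $\beta > |a|$, while a too-weak fine ($\beta < |a|$) makes it profitable to violate the constraint.

```python
import numpy as np

# Penalty relaxation in miniature: sup_p { a*p - alpha(|p| - C) }
# with alpha(t) = beta * max(t, 0), compared to the constrained value C*|a|.
a, C = 3.0, 2.0
constrained_value = C * abs(a)

p_grid = np.linspace(-10.0, 10.0, 200001)

def relaxed_value(beta):
    penalty = beta * np.maximum(np.abs(p_grid) - C, 0.0)
    return np.max(a * p_grid - penalty)

strong = relaxed_value(beta=10.0)   # beta > |a|: fine dominates, optimum stays at |p| = C
weak   = relaxed_value(beta=1.0)    # beta < |a|: paying the fine is worth it, value grows
```

With `beta=10.0` the maximizer sits at $p=C$ and `strong` matches `constrained_value`; with `beta=1.0` the grid maximum runs off to the edge of the search interval, illustrating that a weak relaxation is only a (possibly poor) approximation.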
