I was looking into why the property $\int_a^b f(x) \, dx = -\int_b^a f(x) \, dx$ holds. I found two common answers:
- It comes from the additivity property $\int_a^b f(x)\,dx + \int_b^c f(x)\,dx = \int_a^c f(x)\,dx$ for arbitrary $a \le b \le c$ (for example, in this answer).
- It comes from the fundamental theorem of calculus, $\int_a^b f(x)\,dx = F(b) - F(a)$ (for example, in this answer).
From my understanding of these answers, the first one rests on weaker hypotheses, since applying the fundamental theorem of calculus requires $f$ to have an antiderivative, which is not always the case.
Knowing this, I was wondering how this question extends to line integrals. Say $C$ is a path that starts at a point $p$ and ends at a point $q$. If I define $C^*$ to be the same path but starting at $q$ and ending at $p$, is it generally true that
$$ \int_{C} \mathbf{F} \cdot d\mathbf{r} = - \int_{C^*} \mathbf{F} \cdot d\mathbf{r} \quad ? $$
where $\mathbf{r}:[t_0, t_f]\subset \mathbb{R} \to C$, with $\mathbf{r}(t_0) = p$ and $\mathbf{r}(t_f) = q$, is a bijective parametrization of our path.
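To make $C^*$ concrete, the reparametrization I have in mind (my own choice of convention; I would expect any orientation-reversing reparametrization to behave the same way) is

$$ \mathbf{r}^*(t) = \mathbf{r}(t_0 + t_f - t), \qquad t \in [t_0, t_f], $$

so that $\mathbf{r}^*(t_0) = \mathbf{r}(t_f) = q$ and $\mathbf{r}^*(t_f) = \mathbf{r}(t_0) = p$.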
I know that I can show this to be true if $\mathbf{F}$ happens to be a conservative field, using the gradient theorem, much as the 1D case follows from the fundamental theorem of calculus; but since $\mathbf{F}$ need not be conservative, I don't know whether this holds in general.
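As a sanity check, I also tried the claim numerically on a deliberately non-conservative field. This is just my own sketch: the field $\mathbf{F}(x,y) = (-y, x)$, the quarter-circle path, and the helper `line_integral` are all choices I made for illustration, approximating $\int_C \mathbf{F} \cdot d\mathbf{r}$ with the trapezoid rule.

```python
import numpy as np

def line_integral(F, r, dr, t0, tf, n=10_001):
    """Trapezoid-rule approximation of the line integral
    of F along the curve r(t), t in [t0, tf]:
    integral of F(r(t)) . r'(t) dt."""
    t = np.linspace(t0, tf, n)
    Fx, Fy = F(*r(t))
    dx, dy = dr(t)
    g = Fx * dx + Fy * dy                      # scalar integrand F(r(t)) . r'(t)
    return np.sum((g[1:] + g[:-1]) / 2 * np.diff(t))

# A non-conservative field: F(x, y) = (-y, x)
F = lambda x, y: (-y, x)

# C: quarter circle from p = (1, 0) to q = (0, 1)
r_fwd  = lambda t: (np.cos(t), np.sin(t))
dr_fwd = lambda t: (-np.sin(t), np.cos(t))

# C*: the same curve traversed from q to p, via t -> pi/2 - t
r_rev  = lambda t: (np.cos(np.pi / 2 - t), np.sin(np.pi / 2 - t))
dr_rev = lambda t: (np.sin(np.pi / 2 - t), -np.cos(np.pi / 2 - t))

I_fwd = line_integral(F, r_fwd, dr_fwd, 0.0, np.pi / 2)
I_rev = line_integral(F, r_rev, dr_rev, 0.0, np.pi / 2)
print(I_fwd, I_rev)   # approximately pi/2 and -pi/2
```

For this particular field and path the two results do come out as negatives of each other, which is consistent with the property, but of course one example is not a proof.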
Is there a way to show that this always holds? Or, alternatively, is there a counterexample where it fails? Thank you!