
Let $L$ be a differential operator.


Let $\phi: \displaystyle{\bigwedge_{j=1}^n L_j x=f_j}$. We claim that $\phi$ can be written as $Lx=f \land \psi$, where $\psi$ doesn't contain any $x$.

We prove this by induction on the number of equations, $n$.

Base case: For $n=1$ we have a single equation, which is already of the form $Lx=f$.

Inductive hypothesis: We suppose that it holds for $n=k$, i.e., if $\phi$ contains $k$ equations, then we can reduce it into the form $$Lx=f \land \psi \ \ \text{ where } \psi \text{ doesn't contain any } x.$$

Inductive step: We will show that it holds for $n=k+1$, i.e., if we have $k+1$ equations we can reduce this system into the form $$Lx=f \land \psi \ \ \text{ where } \psi \text{ doesn't contain any } x.$$ By the inductive hypothesis the first $k$ equations reduce to this form, say $L'x=f' \land \psi$. Together with the $(k+1)$-st equation we are left with two equations containing $x$ and its derivatives, $L'x=f'$ and $L_{k+1}x=f_{k+1}$. Adding them gives, by linearity, a single differential equation $(L'+L_{k+1})x=f'+f_{k+1}$, which we write as $Lx=f$. So the initial system is reduced to the form $Lx=f \land \psi$.
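The additive step relies only on linearity of the operators: a common solution of $L_1x=f_1$ and $L_2x=f_2$ also solves $(L_1+L_2)x=f_1+f_2$. A minimal numeric sketch (the operators $L_1 x = x'+x$, $L_2 x = x''+2x'$ and the witness function are hypothetical choices, just for checking):

```python
import math

# Hypothetical witness function x = sin(t) and its derivatives.
def x(t):   return math.sin(t)
def dx(t):  return math.cos(t)
def d2x(t): return -math.sin(t)

# L1 x = x' + x and L2 x = x'' + 2x' define right-hand sides f and g.
def f(t): return dx(t) + x(t)
def g(t): return d2x(t) + 2 * dx(t)

# Linearity: (L1 + L2) x = x'' + 3x' + x must equal f + g pointwise.
for t in (0.0, 0.5, 1.3, 2.7):
    summed = d2x(t) + 3 * dx(t) + x(t)
    assert abs(summed - (f(t) + g(t))) < 1e-12
```

Note that this only checks the forward direction: a solution of the system solves the summed equation, not conversely.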


Let $\phi: \displaystyle{\bigwedge_{j=1}^n L_j x \neq g_j}$. We claim that $\phi$ can be written as $Lx \neq g \land \psi$, where $\psi$ doesn't contain any $x$.

We prove this by induction on the number of inequations, $n$.

Base case: For $n=1$ we have one inequation, so it is of the form $Lx \neq g$.

Inductive hypothesis: We suppose that it holds for $n=k$, i.e., if $\phi$ contains $k$ inequations, then we can reduce it into the form $$Lx \neq g \land \psi \ \ \text{ where } \psi \text{ doesn't contain any } x.$$

Inductive step: We will show that it holds for $n=k+1$, i.e., if we have $k+1$ inequations we can reduce this system into the form $$Lx \neq g \land \psi \ \ \text{ where } \psi \text{ doesn't contain any } x.$$ By the inductive hypothesis the first $k$ inequations reduce to the above form, so we are left with two inequations containing $x$ and its derivatives, $Lx \neq g \land L_{k+1}x \neq g_{k+1} \land \psi$. These inequations are equivalent to $L_{k+1}x = g_{k+1}+a$ and $Lx = g+b$ for some $a, b \neq 0$. Adding these two equations gives $\tilde{L}x=L_{k+1}x+Lx=g_{k+1}+g+a+b$ with $a, b \neq 0$. Then we have $\tilde{L}x \neq g_{k+1}+g$.

Is the last part correct? What if $a=-b$? Do we perhaps have to suppose that $a,b>0$?
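The worry about $a=-b$ can be made concrete with plain numbers (a sketch with hypothetical values): two valid inequations, rewritten as equations with nonzero offsets, whose sum fails as an inequation because the offsets cancel.

```python
# L1 x != g1 and L2 x != g2 rewritten as L1 x = g1 + a, L2 x = g2 + b
# with nonzero offsets a, b. When a = -b the offsets cancel in the sum.
a, b = 2, -2                  # both nonzero, but a + b = 0
g1, g2 = 1, 5                 # hypothetical right-hand sides
v1, v2 = g1 + a, g2 + b       # the values of L1 x and L2 x

assert v1 != g1 and v2 != g2  # both original inequations hold
assert v1 + v2 == g1 + g2     # yet (L1+L2) x != g1 + g2 is FALSE
```

So without an extra assumption on the offsets (e.g. a sign condition), the summed inequation need not follow.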


Is everything correct?

Is there anything I could improve in the formulation?

Mary Star
  • the original one looks ok in general, but I don't understand where this $\psi$ comes from. The second one looks wrong: $$-1\ne -2$$ $$2\ne 3$$ but $$-1+2=-2+3$$ For differential equations try to find a better counterexample than the following one: $$-u' \ne 0$$ $$u'\ne 0$$ but the sum is zero. – Michael Medvinsky Nov 11 '15 at 15:20
  • So, we don't need $\psi$? I thought that we need $\psi$ because in my notes I found the following examples: $$\begin{pmatrix} x'+x=f\\ x''+x'=g \end{pmatrix} \Leftrightarrow \begin{pmatrix} x'+x=f\\ f'=g \end{pmatrix}$$ $$\begin{pmatrix} x'+x=f\\ x''+2x'=g \end{pmatrix} \overset{\text{Gauss elimination}}{\underset{\text{Linearity of differentiation}}{\Leftrightarrow}} \begin{pmatrix} x'+x=f\\ f'=g \end{pmatrix} \Leftrightarrow \begin{pmatrix} x=f-g+f'\\ x'=g-f' \end{pmatrix} \Leftrightarrow \begin{pmatrix} x=f-g+f'\\ f-g'+f''=g-f' \end{pmatrix}$$ @MichaelMedvinsky – Mary Star Nov 11 '15 at 15:49
  • What could we do at the inductive step of the second induction when we have two inequations? @MichaelMedvinsky – Mary Star Nov 11 '15 at 15:53
  • the unknown is $x$; $f$ and $g$ are input data, i.e. known functions, so you don't have to solve $f'=g$. Regarding the inequations, first you have to see if you can find an appropriate counterexample; the one I gave isn't enough in general. After this you will need to make some (hopefully not too strong) assumptions that prevent such situations. Are you sure that it should work for inequations? – Michael Medvinsky Nov 11 '15 at 16:16
  • I'm not sure if this works for inequations... What I am trying to do is the following: Let $L=\{+, {}' , 0, 1, T, C\}$ be the language where ${}'$ is $\frac{d}{dz}$, $T(x) \Leftrightarrow x \notin \mathbb{C}$ (so $x$ is not constant), $C(x) \Leftrightarrow x \in \mathbb{C}$ (so $x$ is constant), and $x, f, g \in \mathbb{C}[z, e^{\lambda z } \mid \lambda \in \mathbb{C}]$.

    (In this ring the differential equations have always a solution.)

    – Mary Star Nov 11 '15 at 17:08
  • We have the following formula $$\exists x \phi (x , \overline{y}) \text{ where } \phi: \displaystyle{\bigwedge_{j=1}^n \phi_j} \text{ with } \phi_j : \left\{\begin{matrix} Lx=f \\ Lx \neq g \end{matrix}\right.$$ I want to show that there is an equivalent quantifier-free formula. – Mary Star Nov 11 '15 at 17:08
  • To prove that there is a quantifier elimination I thought to do the following:

    First by induction on the number of equations $n$ we show that a formula $\phi: \displaystyle{\bigwedge_{j=1}^n L_j x=f_j}$ can be reduced to the formula $Lx=f \land \psi_1$ where $\psi_1$ doesn't contain any $x$.

    Then in the same way we show that a formula $\phi: \displaystyle{\bigwedge_{j=1}^n L_j x\neq g_j}$ can be reduced to the formula $Lx\neq g \land \psi_2$ where $\psi_2$ doesn't contain any $x$.

    – Mary Star Nov 11 '15 at 17:09
  • And then $\phi: \displaystyle{\bigwedge_{j=1}^n \phi_j}$ with $\phi_j : \left\{\begin{matrix} Lx=f \\ Lx \neq g \end{matrix}\right.$ can be reduced to the formula $L_1 x=f \land L_2 x \neq g \land \psi$, where $\psi$ doesn't contain any $x$.

    Is my idea correct? @MichaelMedvinsky

    – Mary Star Nov 11 '15 at 17:09
  • $\bigwedge$ and $\land$ are both quantifiers, the right-hand side should be considered known; the inequations are probably ok if the system is consistent. At least I didn't succeed in finding a more appropriate counterexample, so maybe you are fine with this part. – Michael Medvinsky Nov 11 '15 at 18:07
  • Aren't the quantifiers the following? $$\exists \ , \ \ \forall$$ So we can add two differential inequations and get a differential inequation? @MichaelMedvinsky – Mary Star Nov 11 '15 at 18:16
  • according to https://en.wikipedia.org/wiki/Quantifier_%28logic%29#Notation $\bigwedge$ is a universal quantifier. If so, so is $\land$. If you do not consider them quantifiers, what do you want to get rid of? The $\exists$? What does $\exists x \phi (x , \overline{y})$ mean? There is a solution to the system of equations with RHS $=f$ which isn't a solution to the same system of equations with a different RHS $=g$. – Michael Medvinsky Nov 11 '15 at 18:28
  • Isn't the universal quantifier $\forall$? I want to eliminate the quantifier $\exists x$ from the formula $\exists x \phi (x, \overline{y})$. That means that I want to be able to write the following $$\exists x \phi (x, \overline{y}) \leftrightarrow \psi$$ where $\psi$ is quantifier-free. The formula $\exists x \phi (x, \overline{y})$ means that there is an $x$ such that $\phi$ is satisfied, and $\phi$ is of the form $$\phi: \displaystyle{\bigwedge_{j=1}^n \phi_j}$$ where $$\phi_j : \left\{\begin{matrix} Lx=f \\ Lx \neq g \end{matrix}\right.$$ @MichaelMedvinsky – Mary Star Nov 11 '15 at 18:55
  • How does converting the system of equations to one equation remove the $\exists x$? It is true that the solution, if it exists, doesn't change regardless of the form of the equation, but what does it say about the existence of the solution? Sorry, but I don't understand it. – Michael Medvinsky Nov 11 '15 at 20:03
  • I have shown that in the ring we are working in, $\mathbb{C}[z, e^{\lambda z}\mid \lambda \in \mathbb{C}]$, differential equations always have a solution. So, in the last step when we have one differential equation left, we could have for example something like $0=0$. @MichaelMedvinsky – Mary Star Nov 11 '15 at 20:38
  • I edited the inductive step of the second induction. Could you take a look at it and tell if it is correct? @MichaelMedvinsky – Mary Star Nov 11 '15 at 22:49
  • I thought about something somewhat similar.

    If you have an existence theorem for $Lx=f$ it should come with uniqueness. You should be able to show it by proving that $Lx=0$ has only the trivial solution. In general this would say you have a consistent system. In general you should state: let $g_i=f_i+a_i$ such that not all $a_i$ are zero, i.e. there is a $j$ such that $f_j\ne g_j$. Then you have definitely different RHS and the same $x$ cannot solve both $Lx=f$ and $Lx=f+a$, due to uniqueness. In such a case you just use the result from $Lx=f$ straightforwardly, without another induction etc.

    – Michael Medvinsky Nov 12 '15 at 08:52
  • I haven't understood why we suppose that not all $a_i$ are zero; why do we not require that no $a_i$ is zero? @MichaelMedvinsky – Mary Star Nov 12 '15 at 10:47
  • Now we are looking at a formula of the form $$\bigwedge_j L_j x =f_j \land \bigwedge_i L_i x \neq g_i$$ which we have shown is equivalent to $$Lx=f \land \bigwedge_i L_i x \neq g_i$$ or not? @MichaelMedvinsky – Mary Star Nov 12 '15 at 10:49
  • because it is enough that one $f_j\ne g_j$ to get that the solution of $L_ix=f_i$ isn't a solution to $L_ix=g_i$ for all $i$'s – Michael Medvinsky Nov 12 '15 at 10:49
  • So, we have to be sure that the same differential operator appears in the equation and in the inequation, right? @MichaelMedvinsky – Mary Star Nov 12 '15 at 10:53
  • your definition of $\phi_j$ is missing the index in $L$, $f$ and $g$, but they all have to carry the same index as $\phi$, as far as I can understand. – Michael Medvinsky Nov 12 '15 at 10:55
  • do you have a uniqueness theorem for one equation? Because it looks like you want to prove the existence and uniqueness of a system given the existence and uniqueness for a single equation. – Michael Medvinsky Nov 12 '15 at 10:56
  • I understood it as follows: $$\phi: \displaystyle{\bigwedge_{j=1}^n \phi_j}$$ where $$\phi_j : \left\{\begin{matrix} L_j x=f_j \\ L_j x \neq g_j \end{matrix}\right.$$ So $$\phi: \phi_1 \land \phi_2 \land \phi_3 \land \dots \land \phi_{n-1} \land \phi_n$$ For example $$\phi: L_1 x=f_1 \land L_2x =f_2 \land L_3 x \neq g_3 \land \dots \land L_{n-1}x \neq g_{n-1} \land L_n x=f_n$$ Or can this not be true? @MichaelMedvinsky – Mary Star Nov 12 '15 at 11:12
  • You said that to prove the uniqueness we have to prove that $Lx=0$ has only trivial solutions. Does this mean that the homogeneous part should have only the trivial solutions? So $x=0$ ? @MichaelMedvinsky – Mary Star Nov 12 '15 at 11:12
  • not the homogeneous part, but the homogeneous equation or system of equations - yes. The above $\phi$ is how I understand it. – Michael Medvinsky Nov 12 '15 at 11:17
  • So it is not necessary that the differential operators are the same, is it? It can be that every differential operator is different, or not? @MichaelMedvinsky – Mary Star Nov 12 '15 at 11:48
  • to promise consistency of the system the $L_i$'s should be linearly independent across different $\phi_i$'s, but inside a $\phi_i$ it is the same operator – Michael Medvinsky Nov 12 '15 at 11:51
  • But inside the $\phi_i$ isn't there just one differential operator, a differential equation or a differential inequation and not both? @MichaelMedvinsky – Mary Star Nov 12 '15 at 12:00
  • I think you are trying to show that $\phi: \displaystyle{\bigwedge_{j=1}^n \phi_j}$ can be represented as $$\left\{\begin{matrix} L x=f \\ L x \neq g \end{matrix}\right.$$ (there is only one $L$) – Michael Medvinsky Nov 12 '15 at 12:00
  • I understand it that in each case one of them holds, that $\phi_j$ is either $Lx=f$ or $L x \neq g$. @MichaelMedvinsky – Mary Star Nov 12 '15 at 12:34
  • What happens if $x$ solves $Lx=g$? Neither will hold, right? – Michael Medvinsky Nov 12 '15 at 12:39
  • If $x$ solves $Lx=g$ then it would solve $Lx=f$ if $f=g$ but it doesn't solve $Lx \neq g$, right? @MichaelMedvinsky – Mary Star Nov 12 '15 at 12:45
  • But what do we get from that? I got stuck right now... @MichaelMedvinsky – Mary Star Nov 12 '15 at 13:19
  • I don't understand why $\left\{\begin{matrix} L x=f \\ L x \neq g \end{matrix}\right.$ is $ L x=f \lor L x \neq g$ and not $ L x=f \land L x \neq g$; the former is always true as long as $Lx=f$ has a solution, which you already proved, while the latter would mean uniqueness for the system of equations – Michael Medvinsky Nov 12 '15 at 14:11
  • I found now in my notes the following formulation of the reduction $$\exists x \begin{pmatrix} \bigwedge_i W_i (x) =f_i \\ \bigwedge_j U_j (x) \ne g_j \end{pmatrix} \Leftrightarrow \begin{pmatrix} \text{ the same form }\\ \text{ without quantifiers } \end{pmatrix}$$ where $W_i$ and $U_j$ are linear differential equations with integer coefficients. @MichaelMedvinsky – Mary Star Nov 12 '15 at 14:45
  • If so, what is the relationship between $U_i$ and $W_i$, $f_j$ and $g_j$, and what should these brackets $()$ say about their relationship? What does this mean: $\exists x {x=x\choose x\ne y}$? – Michael Medvinsky Nov 12 '15 at 16:29
  • $U_i$ and $W_i$ are differential equations. $f_j$ and $g_j$ are functions in $z$; they don't contain the unknown $x$. We use the brackets to symbolize that this is a system: both have to be satisfied. In your example $$\exists x {x=x\choose x\ne y}$$ both relations should be satisfied, i.e., it must hold that $x=x$ and $x\neq y$. @MichaelMedvinsky – Mary Star Nov 12 '15 at 18:18
  • then from the discussion above we do have $Lx=f\land Lx\ne g$, not $\lor$, which is what I wanted. But now you say that this should be $Lx=f\land \tilde Lx\ne g$, or is it still the same $L$? – Michael Medvinsky Nov 12 '15 at 18:46
  • No, it is not the same $L$. It should be $$Lx=f\land \tilde Lx\ne g$$ @MichaelMedvinsky – Mary Star Nov 12 '15 at 19:27
  • can you find an $x$ that satisfies the following: $x'=\cos t \land x'' \ne -\sin t$? – Michael Medvinsky Nov 12 '15 at 19:59
  • We can't find such an $x$, can we? @MichaelMedvinsky – Mary Star Nov 12 '15 at 21:05
  • I am trying to say that originally, when you asked the $L$ to be the same, it was well defined; with different $L$'s it is not. For the case of the same $L$ you pretty much have a solution here along all these infinite comments.... – Michael Medvinsky Nov 12 '15 at 21:08
  • In your example there is a quantifier-free formula that is equivalent to the initial system: $$\exists x (x' = \cos t \land x'' \neq -\sin t) \leftrightarrow 1=0$$ That means it is never true. But can we generalize it? @MichaelMedvinsky – Mary Star Nov 12 '15 at 21:13
  • one example is enough to disprove something, so what generalization are you asking for? – Michael Medvinsky Nov 12 '15 at 21:31
  • When we don't have specific differential (in)equations. When we have the formula $$Lx=f \land \tilde{L}x \neq g$$ since we know that $Lx=f$ has a solution, can we maybe find the solution and check if $\tilde{L}x \neq g$ is satisfied? Or is there also another way to check if the system is solvable, other than solving it? @MichaelMedvinsky – Mary Star Nov 12 '15 at 21:42
  • this doesn't make sense to me. The fact that one equation has a solution says nothing about the solution of another equation – Michael Medvinsky Nov 12 '15 at 22:28
  • So can we not do anything if we have a formula of the form $$Lx=f \land \tilde{L}x \neq g$$ ? @MichaelMedvinsky – Mary Star Nov 12 '15 at 22:31
  • at least I don't know what to do with it. – Michael Medvinsky Nov 12 '15 at 22:58
  • Ok... What about the case $$L_1x \neq g_1 \land L_2 x \neq g_2$$ Can we reduce this to the form $Lx \neq g$ ? Or is this only possible when $L_1$ and $L_2$ are the same? @MichaelMedvinsky – Mary Star Nov 12 '15 at 23:05
  • this one should be fine – Michael Medvinsky Nov 12 '15 at 23:39
  • What do you mean? @MichaelMedvinsky – Mary Star Nov 13 '15 at 00:19
  • given that $L_1x=f_1\land L_2 x=f_2$ is consistent, $L_1x\ne g_1\land L_2 x\ne g_2$ can be reduced to $Lx\ne g$ – Michael Medvinsky Nov 13 '15 at 00:31
  • By consistent do you mean that we know that it can be reduced to $Lx=f$ and therefore that it has a solution? @MichaelMedvinsky – Mary Star Nov 13 '15 at 01:04
  • yes, for all input – Michael Medvinsky Nov 13 '15 at 07:15
  • But how exactly does it follow that $L_1x\ne g_1\land L_2 x\ne g_2$ can be reduced to $Lx\ne g$ ? @MichaelMedvinsky – Mary Star Nov 13 '15 at 16:35
  • Do you mean that we write the inequations as equations adding a nonzero constant? @MichaelMedvinsky – Mary Star Nov 13 '15 at 20:23
  • I am saying that your problem is equivalent to a uniqueness theorem for systems. The same $x$ cannot solve both, for $f$ and for $g$; so as long as there is some $f_i\ne g_i$ and the system is consistent / non-singular / has a solution for every RHS / has only one solution to the homogeneous problem, then the inequation is satisfied as an implication of the satisfaction of the equation. – Michael Medvinsky Nov 14 '15 at 08:31
  • I understand!! Thanks for your help!! :-) @MichaelMedvinsky – Mary Star Nov 16 '15 at 00:55
  • You said that to show the uniqueness of the solution we have to show that $Lx=0 \Rightarrow x=0$. To solve the homogeneous equation $$\sum_{k=0}^m \alpha_k x^{(k)}(z)=0$$ we find the characteristic equation and its eigenvalues $\lambda_i$.
    • If the $\lambda_i$ are eigenvalues of multiplicity $1$, then the solution of $Lx(z)=0$ is $$x_{H}(z)=\sum_{i=1}^m c_i e^{\lambda_i z}.$$
    • If $\lambda_i$ is an eigenvalue of multiplicity $M>1$, then $$e^{\lambda_i z}, ze^{\lambda_i z}, z^2e^{\lambda_i z}, \dots , z^{M-1}e^{\lambda_i z}$$ are $M$ linearly independent solutions of $Lx(z)=0$.
    – Mary Star Nov 16 '15 at 01:03
  • So aren't there also solutions other than $x=0$ ? Does this mean that the solution is not unique? Or have I understood it wrong? @MichaelMedvinsky – Mary Star Nov 16 '15 at 01:03
  • when the multiplicity of the eigenvalues is $1$, there is only the trivial solution. When an eigenvalue has multiplicity $>1$ this may have the effect of a degree of freedom, which means infinitely many solutions for some RHSs and none for some others. http://math.stackexchange.com/questions/1499876/why-is-the-transformation-not-unique-if-eigen-values-are-repeated-or-zero/1509188#1509188 – Michael Medvinsky Nov 16 '15 at 01:48
  • I have to think about it... I also have another question... What are the domain and the range of a differential operator? @MichaelMedvinsky – Mary Star Nov 18 '15 at 14:33
  • this depends on the original problem, but the transformation from a system to a single equation doesn't change it. Usually an ODE is considered time dependent, so you are talking about $t>0$ and an initial condition at $t=0$. – Michael Medvinsky Nov 18 '15 at 18:55
  • Is the image of a differential operator the set of functions? I was wondering if we can say the following: Since the image of the differential operator, i.e. the set of functions, is infinite and the number of inequations is finite, we can always find a function $x$ such that $\bigwedge_i L_i x \neq g_i $. So $$\bigwedge_i L_i x \neq g_i \leftrightarrow 0=0$$ @MichaelMedvinsky – Mary Star Nov 18 '15 at 19:35
  • I think yes – Michael Medvinsky Nov 18 '15 at 20:57
  • So we have $$\exists x \bigwedge_j L_j x=f_j \leftrightarrow \exists x Lx=f$$ and $$\exists x \bigwedge_j \tilde{L}_j x\neq g_j \leftrightarrow 0=0$$ Can we now use these results to reduce the general case $$\exists x \left (\bigwedge_j L_j x=f_j \land \bigwedge_j \tilde{L}_j x\neq g_j \right )$$ into the case $$Lx=F$$ ? @MichaelMedvinsky – Mary Star Nov 18 '15 at 23:30
  • nope, we already discussed this case - try to find the counterexample in the comments. – Michael Medvinsky Nov 19 '15 at 06:41
  • How can we justify/prove that the image of a differential operator is the set of functions, which is infinite? @MichaelMedvinsky – Mary Star Nov 19 '15 at 10:15
  • i would say it is so by definition? – Michael Medvinsky Nov 19 '15 at 10:45
  • Ok... I thought about it again... Is the following correct? If $\phi$ is of the form $\displaystyle{\bigwedge_{i=1}^n \mathcal{L}_i x=f_i \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x \neq g_j}$, then $\phi$ can be reduced to the form $\displaystyle{L x=f \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x \neq g_j}$.

    We consider the system $\displaystyle{L x=f \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x = g_j}$.

    This system can be reduced to the form $\hat{\mathcal{L}}x=s$.

    @MichaelMedvinsky

    – Mary Star Nov 24 '15 at 11:18
  • If this equation has no solution then we have that there is no $x$ that satisfies the equation $L x=f$ and the equations $\bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x = g_j$.

    So $$\exists x \left (L x=f \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x \neq g_j\right ) \leftrightarrow \exists x \ L x=f$$

    @MichaelMedvinsky

    – Mary Star Nov 24 '15 at 11:19
  • If the equation $\hat{\mathcal{L}}x=s$ has a solution, let $\hat{x}$, then we have that $\hat{x}$ is a solution of the equation $L x=f$ and the equations $\bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x = g_j$.

    We set $x=y+\hat{x}$. Then we have $$L (y+\hat{x})=f \overset{L \hat{x}=f}{\Longrightarrow} Ly=0$$ and $$\bigwedge_{j=1}^m \tilde{\mathcal{L}}_j (y+\hat{x}) \neq g_j \text{ with } \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j \hat{x} = g_j$$

    @MichaelMedvinsky

    – Mary Star Nov 24 '15 at 11:21
  • So we have that $$\exists x \left (L x=f \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x \neq g_j\right ) \leftrightarrow \exists \hat{x} \exists y \left (L \hat{x}=f \land Ly=0 \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j (y+\hat{x}) \neq g_j \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j \hat{x} = g_j\right )$$ @MichaelMedvinsky – Mary Star Nov 24 '15 at 11:21
  • Since a system of equations is equivalent to one equation we have that $$L \hat{x}=f \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j \hat{x} = g_j \leftrightarrow \overline{\mathcal{L}} \hat{x}=K$$

    @MichaelMedvinsky

    – Mary Star Nov 24 '15 at 11:39
  • So $$\exists x \left (L x=f \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j x \neq g_j\right ) \leftrightarrow \exists \hat{x} \exists y \left (L \hat{x}=f \land Ly=0 \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j (y+\hat{x}) \neq g_j \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j \hat{x} = g_j\right ) \\ \leftrightarrow \exists \hat{x} \exists y \left ( \overline{\mathcal{L}} \hat{x}=K \land Ly=0 \land \bigwedge_{j=1}^m \tilde{\mathcal{L}}_j (y+\hat{x}) \neq g_j \right )$$ @MichaelMedvinsky – Mary Star Nov 24 '15 at 11:40
  • I have a problem with $Ly=0$: for a linear operator there is always a solution to the homogeneous equation; maybe you mean $L0=0\land\nexists\, y\ne0\ (Ly=0)$, i.e. only the trivial solution to the homogeneous equation. Also, I still can't see how one solution to one equation can give information about another equation when there is no knowledge about the relationship between these equations. The same $x$ can solve two different equations, I've shown it above. Now you take two distinct, hopefully consistent systems of equations and combine them (using "+"?); why do you think the new equation will still be consistent? – Michael Medvinsky Nov 24 '15 at 12:14
  • In short, I am not following you. – Michael Medvinsky Nov 24 '15 at 12:14
  • When $Ly=0$ has only the trivial solution $y=0$ then we get $x=\hat{x}$, which is the solution of $Lx=f$ and of $\bigwedge_{j=1}^m \tilde{L}_j x=g_j$. Therefore in that case we would have $$\exists x \left (Lx=f \land \bigwedge_{j=1}^m \tilde{L}_j x=g_j \right ) \leftrightarrow 1=0$$ or not? Does it not hold that $$\exists x \left (L_1 x =f_1 \land L_2 x=f_2 \right ) \Leftrightarrow \exists x \left ((L_1+L_2)x=f_1+f_2\right )$$ ? @MichaelMedvinsky – Mary Star Nov 24 '15 at 12:27
  • for the case of $L_j=-L$ you will lose the consistency of the system even if initially it was consistent. – Michael Medvinsky Nov 24 '15 at 13:14
  • I see... So, when we have two equations and they have a common variable and we substitute it from the one into the other, we get one equation, right? Do we lose the consistency of the system then? @MichaelMedvinsky – Mary Star Nov 24 '15 at 15:32
  • Or is it impossible to reduce a system of two equations to one equation without losing the consistency of the system? @MichaelMedvinsky – Mary Star Nov 24 '15 at 15:34
  • If you have a consistent system then you can reduce it to one equation; however, when you have two distinct systems their sum may not be consistent. Similarly for a system + one extra equation, i.e. two systems of different sizes. – Michael Medvinsky Nov 24 '15 at 22:43
  • Can we do the following: $$\left.\begin{matrix} L_1 x=f_1\\ L_2 x=f_2 \end{matrix}\right\} \Leftrightarrow (L_1 x-f_1)^2+(L_2 x -f_2)^2=0$$ ? Is the differential equation $(L_1 x-f_1)^2+(L_2 x -f_2)^2=0$ linear? @MichaelMedvinsky – Mary Star Nov 25 '15 at 00:27
  • the answer - they aren't linear. Can you explain why you are still trying to prove the impossible? – Michael Medvinsky Nov 25 '15 at 00:32
  • Since I saw one of the examples in my notes where they refer to Gaussian elimination, I thought that when we have a system of differential equations we could do the same as in Gaussian elimination, where we end up with only one equation. @MichaelMedvinsky – Mary Star Nov 25 '15 at 15:32
  • Gaussian elimination of a consistent system leads to the reduced system; in the best case it will be $Ix=b$, not one equation. In a differential equation we have one unknown, a function (everything else is the derivatives of this function), so you can sum all rows up to get one equation. When you add one extra equation you may lose consistency, either in a system of linear equations or in a system of differential equations. – Michael Medvinsky Nov 25 '15 at 16:30
  • I got stuck right now... The inductive step of the induction at my original post is correct but we lose consistency? Or have I understood it wrong? @MichaelMedvinsky – Mary Star Nov 25 '15 at 16:44
  • you can't add apples to oranges, that is all. – Michael Medvinsky Nov 25 '15 at 16:46
  • I thought about it again... Can we do the following? We suppose that we have a system of differential equations $$\left\{\begin{matrix} L_1=0\\ L_2=0 \end{matrix}\right. \tag 1$$ where the order of $L_1$ is $n$, the order of $L_2$ is $m$, and the coefficient of the highest-order term is $1$.

    Let $n<m$.

    Let $d_i$ denote taking the $i$th derivative.

    We define the differential equation $d_{m-n}L_1-\frac{m!}{(m-n)!}L_2=0$ the order of which is smaller than the order of $L_2$.

    @MichaelMedvinsky

    – Mary Star Nov 27 '15 at 14:02
  • So the system $(1)$ is equivalent to the system $$\left\{\begin{matrix} L_1=0\\ d_{m-n}L_1-\frac{m!}{(m-n)!}L_2=0 \end{matrix}\right.$$

    So the initial system is equivalent to a system of smaller order.

    We do the same procedure until we get a system of the form $$\left\{\begin{matrix} L=0\\ \alpha \end{matrix}\right.$$ where the order of $L$ is $0$ and $\alpha$ doesn't contain $x$, i.e., it is a relation between the parameters.

    So we conclude with one differential equation. That means that any system can be replaced by one differential equation. Is this correct? @MichaelMedvinsky

    – Mary Star Nov 27 '15 at 14:03
  • I am not following you. In addition, the page now loads very slowly because of all these comments. Please open a new question and write your question in detail and with examples (you may reference this page if necessary). – Michael Medvinsky Nov 27 '15 at 14:57
  • Here is the new question: http://math.stackexchange.com/questions/1549092/system-of-differential-equations @MichaelMedvinsky – Mary Star Nov 27 '15 at 20:12
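The unsatisfiable system $x'=\cos t \land x'' \neq -\sin t$ from the comments can also be checked directly: the equation $x'=\cos t$ already forces $x''$, leaving the inequation no room. A small numeric sketch (the step size and sample points are arbitrary choices):

```python
import math

def xprime(t):           # the constraint: x' = cos t
    return math.cos(t)

def xsecond(t, h=1e-5):  # x'' obtained by differentiating x' numerically
    return (xprime(t + h) - xprime(t - h)) / (2 * h)

# x'' is forced to equal -sin t for every solution of x' = cos t, so
# the inequation x'' != -sin t can never be satisfied alongside it.
for t in (0.0, 0.6, 1.4, 2.2):
    assert abs(xsecond(t) - (-math.sin(t))) < 1e-8
```

This is the phenomenon behind the quantifier-free equivalent $1=0$ discussed above: an equation can entail the negation of an inequation built from a different operator.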

1 Answer


You can continue as follows: take the first $k$ equations and create $$Lx=f \land \psi \ \ \text{ where } \psi \text{ doesn't contain any } x.$$ Now you can form a new $\phi$ which contains exactly two equations, the new one and the extra one from the previous $\phi$. The size of the new $\phi$ is less than $k$, so you are fine.

Alternatively you split it into two parts of sizes $\lceil k/2\rceil$ and $\lfloor k/2\rfloor$ and proceed similarly.
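The recursion described above can be sketched with a toy representation (entirely hypothetical, not part of the original argument): a constant-coefficient operator is a coefficient list indexed by derivative order, and reduction sums the operators and the right-hand sides.

```python
def reduce_system(eqs):
    """eqs: list of (coeffs, rhs) pairs, coeffs[i] multiplying x^(i).
    Recursively reduce the tail, then add the head equation to it."""
    if len(eqs) == 1:
        return eqs[0]
    c1, r1 = eqs[0]
    c2, r2 = reduce_system(eqs[1:])       # the reduced k-equation tail
    n = max(len(c1), len(c2))
    c1 = c1 + [0] * (n - len(c1))         # pad to a common order
    c2 = c2 + [0] * (n - len(c2))
    return [a + b for a, b in zip(c1, c2)], r1 + '+' + r2

# Example: x' + x = f and x'' + x' = g reduce to x'' + 2x' + x = f+g
print(reduce_system([([1, 1], 'f'), ([0, 1, 1], 'g')]))
# → ([1, 2, 1], 'f+g')
```

As the comment thread points out, this only goes one way: a solution of the system solves the summed equation, but the summed equation alone does not guarantee the system is consistent.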

  • Why do we have to take the cases $k=1$ and $k=2$ at the base case and not only the case $k=1$ ? – Mary Star Nov 05 '15 at 18:01
  • well, finally it should be fine with $k=1$, sorry for misleading you – Michael Medvinsky Nov 05 '15 at 19:46
  • At the inductive step where we have two equations left, do we have to show how to get from 2 equations to one? – Mary Star Nov 05 '15 at 20:37
  • the hypothesis is for $k\ge1$, so - yes – Michael Medvinsky Nov 05 '15 at 20:55
  • So, we have the following $$\phi : L_1 x=f_1 \land L_2 x=f_2 \land \psi$$ We want to solve $L_2 x=f_2$ for one derivative of $x$ and substitute this in the first equation; then we would have only one equation containing $x$ and its derivatives, right? But how can we be sure that $L_1 x$ and $L_2 x$ share a common derivative of $x$ so that we can make this substitution? – Mary Star Nov 05 '15 at 21:02
  • If your system is assumed consistent, i.e. well posed / has a solution etc, then one of two things should happen: 1) they have a common variable, so you can substitute the solution of one into the other, or 2) they are distinct and can be solved separately, and so you can substitute the solutions into $\psi$. Does it make sense? – Michael Medvinsky Nov 05 '15 at 21:11
  • But it is a system; can we solve the two equations separately? When the two equations don't have a common variable to substitute the solution of one into the other, do we maybe differentiate one of them until we get a common variable? – Mary Star Nov 09 '15 at 22:52
  • consider a multivariable polynomial $P(x_1,..., x_n)$ evaluated with differential operators, e.g. $x_j=\frac{d}{dx_j}$; now you apply this operator to a vector function $u=(u_1({\bf x}),\dots ,u_m({\bf x}))$, i.e. your system reads $P(\frac{d}{dx_1},..., \frac{d}{dx_n})u=(f_1,\dots,f_n)$ – Michael Medvinsky Nov 10 '15 at 00:28
  • But in our case we have only one variable. – Mary Star Nov 10 '15 at 01:18
  • Since we have two equations we solve one of them and then we have one differential equation and one equation which does not contain $x$, i.e., we substitute the solution into the first equation. I mean for example the following: $$\begin{pmatrix} x'''+x''=f\\ x'+x=g \end{pmatrix} \Leftrightarrow \begin{pmatrix} x'''+x''=f\\ x=ce^{-t}+g \end{pmatrix} \Leftrightarrow \begin{pmatrix} x'''+x''=f\\ (ce^{-t}+g)''+(ce^{-t}+g)''=f \end{pmatrix}$$ So now we have the desired form $Lx=f \land \psi$. Or can we not do that? – Mary Star Nov 10 '15 at 04:01
  • You have the second derivative twice in the last system; one of them should be the third derivative. Note that if you differentiate you get a condition on $g$ (i.e. $g'''+g''=f$) which has no $x$, but $g$ is known. If you differentiate the second equation twice you will get $x'''+x''=g''$; on one side this one has 3 solutions while the original one has only one solution, but on the other side, in this specific case this simply means that $g''=f$. I'm not sure this was a good example. – Michael Medvinsky Nov 10 '15 at 08:28
  • maybe all you need is to sum the two equations in such a case? a.k.a. $x'''+x''+x'+x=f+g$, which is what you need I guess. – Michael Medvinsky Nov 10 '15 at 08:30
  • I see... I edited my initial post... I completed the induction and I also added another induction... Could you take a look at it and tell me if everything is correct? – Mary Star Nov 11 '15 at 13:46
  • When we add two differential equations do we get $$L_1 x =f_1 \land L_2 x=f_2 \Leftrightarrow (L_1+L_2)x=f_1+f_2$$ or $$L_1 x=f_1 \land L_2 x=f_2 \Rightarrow (L_1+L_2)x=f_1+f_2$$ ? – Mary Star Nov 16 '15 at 01:00
  • the first one of course – Michael Medvinsky Nov 16 '15 at 01:04
  • Ok... Thanks again!! :-) – Mary Star Nov 16 '15 at 01:07