Let's say we are given a primal linear programming problem:
$\begin{array}{lrcl}
\text{minimize } & c^{T}x & &\\
\text{subject to: } & Ax & \ge & b \\
& x & \ge & 0
\end{array}$
The dual problem is defined as:
$\begin{array}{lrcl}
\text{maximize } & b^{T}y & &\\
\text{subject to: } & A^{T}y & \le & c \\
& y & \ge & 0
\end{array}$
By weak duality, $c^{T}x \ge b^{T}y$ for every pair of feasible solutions $x$ and $y$; moreover, by strong duality, if $x$ and $y$ are optimal solutions to the primal and the dual problem respectively, then $c^{T}x = b^{T}y$.
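As a concrete numerical check of this (with made-up data, using SciPy's `linprog`, which is not part of the question itself):

```python
# Verify strong duality numerically on a small illustrative LP.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0], [1.0, 3.0]])
b = np.array([2.0, 3.0])
c = np.array([1.0, 2.0])

# Primal: min c^T x  s.t.  Ax >= b, x >= 0.
# linprog expects A_ub x <= b_ub, so the >= constraints are negated.
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 2, method="highs")

# Dual: max b^T y  s.t.  A^T y <= c, y >= 0.
# linprog minimizes, so we minimize -b^T y and flip the sign back.
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(0, None)] * 2, method="highs")

print(primal.fun)   # c^T x* = 2.5 for this data
print(-dual.fun)    # b^T y* = 2.5 — equal, as strong duality predicts
```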
So if we define a linear programming problem with the following constraints:
$\begin{array}{ccc}
Ax & \ge & b \\
x & \ge & 0 \\
A^{T}y & \le & c \\
y & \ge & 0 \\
b^{T}y & \ge & c^{T}x
\end{array}$
then any feasible solution to this problem should be an optimal solution to both the primal and the dual problem: weak duality already forces $c^{T}x \ge b^{T}y$, so the last constraint can only be satisfied with equality, which happens exactly when $x$ and $y$ are optimal.
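A sketch of this combined system as a single feasibility LP (zero objective), reusing the same illustrative data and SciPy's `linprog`:

```python
# Stack the primal constraints, the dual constraints, and c^T x <= b^T y
# into one system over the joint variable z = (x, y), then ask the solver
# for any feasible point.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0], [1.0, 3.0]])
b = np.array([2.0, 3.0])
c = np.array([1.0, 2.0])
m, n = A.shape  # m dual variables, n primal variables

# All constraints written in linprog's A_ub z <= b_ub form:
A_ub = np.block([
    [-A,               np.zeros((m, m))],  # Ax >= b
    [np.zeros((n, n)), A.T             ],  # A^T y <= c
    [c[None, :],       -b[None, :]     ],  # c^T x <= b^T y
])
b_ub = np.concatenate([-b, c, [0.0]])

# Zero objective: we only want feasibility.
res = linprog(np.zeros(n + m), A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * (n + m), method="highs")
x, y = res.x[:n], res.x[n:]
print(c @ x, b @ y)  # both 2.5 here — x and y are optimal for their problems
```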
My question is: why is this approach not used?
I see three potential reasons:
1) I've made a mistake somewhere and this doesn't make any sense.
2) It is often the case that the primal or the dual problem is infeasible. I've seen such examples, but in all of them the other problem was unbounded. Is unboundedness the only situation in which exactly one of the primal and dual problems is infeasible?
3) Finding any feasible solution might be hard. The so-called Phase 1 of the simplex method can be used to find a feasible solution. I couldn't find the complexity of this phase; is it exponential in the worst case, just like the simplex algorithm itself? A related question: what is the fastest method to determine whether any feasible solution exists at all? The solution itself doesn't have to be produced.
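For reference, the Phase-1 idea can itself be phrased as an auxiliary LP: to test feasibility of $Ax \ge b$, $x \ge 0$, minimize the total violation captured by artificial variables $s$; the system is feasible iff the optimum is $0$. A minimal sketch (the `is_feasible` helper and the data are my own, for illustration):

```python
# Phase-1-style feasibility test: min 1^T s  s.t.  Ax + s >= b, x >= 0, s >= 0.
# The auxiliary LP is always feasible (take s large) and bounded below by 0,
# so the original system is feasible exactly when the optimum is 0.
import numpy as np
from scipy.optimize import linprog

def is_feasible(A, b):
    m, n = A.shape
    # Variables (x, s); constraint Ax + s >= b becomes -Ax - s <= -b.
    A_ub = np.hstack([-A, -np.eye(m)])
    cost = np.concatenate([np.zeros(n), np.ones(m)])
    res = linprog(cost, A_ub=A_ub, b_ub=-b,
                  bounds=[(0, None)] * (n + m), method="highs")
    return res.status == 0 and res.fun < 1e-9

A = np.array([[1.0, 1.0], [1.0, 3.0]])
b = np.array([2.0, 3.0])
print(is_feasible(A, b))    # True
print(is_feasible(-A, b))   # False: -x1 - x2 >= 2 is impossible with x >= 0
```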