I recently came across Dan Piponi's blog post "An End to Coding Theory", and it left me very confused. The relevant portion is:
But in the sixties Robert Gallager looked at generating random sparse syndrome matrices and found that the resulting codes, called Low Density Parity Check (LDPC) codes, were good in the sense that they allowed messages to be transmitted at rates near the optimum rate found by Shannon - the so-called Shannon limit. Unfortunately the computers of the day weren't up to the task of finding the most likely element of M from a given element of C. But now they are. We now have near-optimal error correcting codes and the design of these codes is ridiculously simple. There was no need to use exotic mathematics, random matrices are as good as almost anything else. The past forty years of coding theory has been, more or less, a useless excursion. Any further research in the area can only yield tiny improvements.
In summary, he states that LDPC codes achieve rates so close to channel capacity that any further improvement would not be worthwhile.
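To check my own understanding of what he is describing, here is a toy sketch I put together. It is my own illustration, not Gallager's actual construction: the sizes n, k, the row weight of 3, and the brute-force enumeration are all made-up choices for a tiny example. The point is just that a random sparse parity-check ("syndrome") matrix H defines the code as the set of words with zero syndrome, and that decoding means recovering the most likely codeword from a corrupted received word.

```python
import numpy as np

# Toy illustration only: the sizes, the row weight of 3, and the brute-force
# enumeration below are my own choices, not Gallager's actual construction.
rng = np.random.default_rng(0)
n, k = 12, 6          # block length n, message length k (tiny, for illustration)
m = n - k             # number of parity checks

# Sparse (m x n) parity-check ("syndrome") matrix: each check involves few bits.
H = np.zeros((m, n), dtype=np.uint8)
for row in H:
    row[rng.choice(n, size=3, replace=False)] = 1   # low row weight = "low density"

def syndrome(x):
    """Syndrome of a word x over GF(2); codewords are exactly the words with zero syndrome."""
    return (H @ x) % 2

# Brute-force the code at this toy size (hopeless for realistic block lengths).
words = (np.array(list(np.binary_repr(i, n)), dtype=np.uint8) for i in range(2 ** n))
codewords = [x for x in words if not syndrome(x).any()]
print(len(codewords), "codewords")   # about 2**k when the rows of H are independent

# Flip one bit of a codeword: the syndrome becomes nonzero, and the decoder must
# recover the most likely transmitted codeword from the corrupted word, which is
# the step the quote says 1960s computers couldn't handle at useful block lengths.
received = codewords[1].copy()
received[4] ^= 1
print("syndrome after a bit flip:", syndrome(received))
```

Even in this toy example the code itself is trivial to write down; as far as I can tell, the difficulty Piponi alludes to lies entirely in the decoding step at realistic block lengths.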
So, my question is: what does modern research in error-correcting codes entail? I noticed that Dan does not say which channel's capacity the rate of LDPC codes approaches, so perhaps there are channels on which LDPC codes do not perform well? What other directions does modern research in the field explore?