However, now I am wondering what the roots of propositional logic are. I mean, we don't know propositional logic innately; we have to learn it first. So what are the informational prerequisites?
You know English, and therefore you know the rough meaning of the words "if", "not", "and", and "or". Propositional logic is merely a system that assigns symbols to the precise semantic notions represented by these words, nothing more and nothing less. For example, we write "$A \lor B$" to denote the assertion "Either $A$ is true or $B$ is true (or both)". In other words, it's just notation.
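To pin down the "(or both)", here is the truth table that fixes the meaning of $\lor$:

$$\begin{array}{cc|c}
A & B & A \lor B \\
\hline
T & T & T \\
T & F & T \\
F & T & T \\
F & F & F
\end{array}$$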
Same for predicate symbols, function symbols and quantifiers in first-order logic. See Introduction to Logic by Suppes, an old book but one that clearly explains the intuitions behind logic.
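To give one flavour of the first-order case: "$\forall x\,(H(x) \to M(x))$" is nothing more than precise notation for the assertion "every $H$ is an $M$".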
To sidetrack a little, does information ever bottom out? Do we ever reach the bedrock of knowledge, where everything's just a given, without question?
You will have to stop somewhere, of course. Notice that you cannot define anything from nothing. To define logic, you need to already know about strings of symbols and conditionals; without a prior understanding of those concepts, you simply cannot define logic at all. But we do understand them, and in fact we use them to bootstrap all the way up to formal systems, in the sense that we define formal systems in terms of rules ("if you have derived these strings, then you can derive that string"), as the sketch below illustrates. To go further and reason about formal systems, rather than merely follow them, we need the prerequisite notion of worlds or models. See my comments and answer to this question for more about this issue.
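To make "following the rules" concrete, here is a minimal sketch in Python (the toy rule, the example strings, and the function names are my own illustration, not part of any standard system): a formal system is just a set of rules for producing new strings from already-derived strings, and applying them requires no grasp of what the strings mean.

```python
# A formal system, stripped to its bones: rules that say
# "if you have derived these strings, then you can derive that string".

def modus_ponens(derived: set[str]) -> set[str]:
    """Toy rule: from the strings S and "S->T", derive the string T."""
    new = set()
    for s in derived:
        if "->" in s:
            premise, conclusion = s.split("->", 1)
            if premise in derived:
                new.add(conclusion)
    return new

def derive(axioms: set[str], rules, steps: int = 5) -> set[str]:
    """Mechanically apply the rules: pure string matching, no meaning involved."""
    derived = set(axioms)
    for _ in range(steps):
        for rule in rules:
            derived |= rule(derived)
    return derived

print(derive({"A", "A->B", "B->C"}, [modus_ponens]))
# prints (in some order): {'A', 'B', 'C', 'A->B', 'B->C'}
```

Note that the program never interprets the strings; it only matches and rewrites them. That is exactly the sense in which a formal system can be followed without understanding, whereas reasoning about what it proves requires the further notion of models.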