It has since been clarified in the comments that the OP is specifically interested in proving the relevant properties for addition and multiplication.
Let me show how to prove that addition is commutative; hopefully you'll get the idea of how to do the other ones.
First, we need to specify our axioms. Proofs don't take place in a vacuum, and specifying one's axioms is especially important when (a) one is proving something somewhat arcane (e.g. something that might depend on the axiom of choice), or (b) one is proving something so simple that it is generally taken for granted (as here). The axioms I'll be using are those of (first-order) Peano arithmetic.
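For concreteness, the only facts about addition we'll draw on below are the two defining equations $$x+0=x \qquad\text{and}\qquad x+S(y)=S(x+y)$$ (where $S$ denotes the successor function), together with the first-order induction scheme: if $\theta(0)$ holds, and $\theta(n)$ implies $\theta(S(n))$ for every $n$, then $\theta(n)$ holds for every $n$.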
Our proof will consist of three proofs by induction: the first on its own, and the third inside the second. This is the price of doing everything from scratch!
First, we establish that $0+x=x$ for all $x$. (That $x+0=x$ for all $x$ is one of the Peano axioms.) To do this, let $\psi(x)$ be the formula "$0+x=x$". We have that $\psi(0)$ holds, that is, $0+0=0$, since $x+0=x$ for all $x$ (including $x=0$). Now suppose $\psi(n)$ holds; we want to calculate $0+S(n)$. Well, one of our axioms is $$a+S(b)=S(a+b),$$ so applying that here gives $$0+S(n)=S(0+n);$$ but our induction hypothesis means that $S(0+n)=S(n)$. So we've shown that $\psi(n)\implies \psi(S(n))$; by the induction scheme, then, this gives $\forall n\psi(n)$.
Now we'll use that to prove the commutativity of addition. Let $\varphi(x)$ be the formula "For all $y$, $x+y=y+x$". By the above result, for all $y$ we have $y+0=0+y=y$, so $\varphi(0)$ holds. Now suppose $\varphi(n)$ holds; we want to compare $S(n)+y$ and $y+S(n)$ for arbitrary $y$.
Well, a previously-mentioned axiom allows us to compute the latter: $y+S(n)=S(y+n)$, and by our induction hypothesis this is just $S(n+y)$. So now we just need to compute the former. This will need one last proof by induction, this time with $n$ as a parameter:
Let $\chi(x)$ be the formula "$S(n)+x=S(n+x)$". Clearly we have $\chi(0)$, since $S(n)+0=S(n)=S(n+0)$ (since $a+0=a$ is an axiom). Now suppose $\chi(m)$ holds, and let's look at $S(n)+S(m)$. By the previously-mentioned axiom relating successor and addition, this is $S(S(n)+m)$; by the induction hypothesis, this is just $S(S(n+m))$. But $S(n+m)=n+S(m)$, so this is just $S(n+S(m))$. So $\chi(S(m))$ holds. By the induction scheme, this shows $\forall x\chi(x)$.
In particular, $S(n)+y=S(n+y)$, and we already showed that $y+S(n)=S(n+y)$; since $y$ was arbitrary, this shows that $\varphi(S(n))$ holds. So by the induction scheme we get $\forall x\varphi(x)$; that is, addition is commutative!
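If it helps to see the argument machine-checked, here is a minimal sketch in Lean 4 with no library imports; `MyNat`, `add`, and the three theorem names are just labels I've chosen to mirror the three inductions above, not standard library names.

```lean
-- Natural numbers built from zero and successor, as in the Peano axioms.
inductive MyNat where
  | zero : MyNat
  | succ : MyNat → MyNat

open MyNat

-- Addition defined by recursion on the second argument, so that
-- `add a zero = a` and `add a (succ b) = succ (add a b)` hold by definition,
-- mirroring the two addition axioms used above.
def add : MyNat → MyNat → MyNat
  | a, zero   => a
  | a, succ b => succ (add a b)

-- First induction: 0 + x = x  (the formula ψ above).
theorem zero_add (x : MyNat) : add zero x = x := by
  induction x with
  | zero => rfl
  | succ n ih => simp [add, ih]

-- Inner induction: S(n) + x = S(n + x)  (the formula χ above).
theorem succ_add (n x : MyNat) : add (succ n) x = succ (add n x) := by
  induction x with
  | zero => rfl
  | succ m ih => simp [add, ih]

-- Outer induction: addition is commutative  (the formula φ above).
theorem add_comm (x y : MyNat) : add x y = add y x := by
  induction x with
  | zero => simp [add, zero_add]
  | succ n ih => simp [add, succ_add, ih]
```

Defining `add` by recursion on its second argument is what makes $a+0=a$ and $a+S(b)=S(a+b)$ hold by definition, which is why several of the base cases close by `rfl`.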
Hopefully you can see how similar arguments can be used to prove the other basic properties about addition and multiplication.
Associativity, commutativity, distributivity, and the like are not automatic properties of functions in general, or even of functions on the integers (or even of "nice" functions!). Rather, each is a property that a given function may or may not have. Although many functions do have these properties, the general picture is more complicated.
I think considering some natural examples of non-commutative and non-associative operations will be helpful:
The most basic example is exponentiation! This is neither commutative nor associative, and does not distribute nicely over multiplication or addition:
$2^3=8\not=9=3^2$.
$(2^2)^3=4^3=64\not=256=2^8=2^{2^3}$.
$2^{3\times 3}=2^9=512\not=64=8\times 8=2^3\times 2^3$.
Etc. (Note that exponentiation does satisfy some laws that look vaguely like distributivity: namely right distributivity $(xy)^z=x^zy^z$, as well as the laws $x^{y+z}=x^yx^z$ and $(x^y)^z=x^{yz}$. And these laws, of course, are extremely useful.) The three inequalities above are also checked mechanically in the short Lean snippet after this list.
Composition of linear transformations (or matrix multiplication) is not commutative. A linear transformation $T$ is a map from a vector space to itself (actually, linear maps can go from one vector space to another, but then "swapping the order of composition" may not even make sense: given $f: U\rightarrow V$ and $g: V\rightarrow W$, what should $f\circ g$ be?), with certain nice properties. Think rotation, reflection, dilation, etc. (Actually, affine transformations will more closely match your intuition - linear transformations have to fix the origin - but whatever.) It's a good exercise to come up with an example of this.
- In general, if I have two functions $f, g$ from a set to itself, $f\circ g$ need not equal $g\circ f$.
Multiplication of quaternions is not commutative.
- And multiplication of octonions isn't even associative! But octonions are less useful than quaternions.
Although Boolean algebras satisfy a distributive law, general lattices need not; and there are lots of important non-distributive lattices (e.g. the lattice of normal subgroups of a group need not be distributive, although it will always satisfy the weaker modularity property).
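As a footnote to the exponentiation item above: those three inequalities can be checked mechanically. Here is a quick Lean 4 check using the built-in natural-number arithmetic; `decide` just evaluates both sides and compares.

```lean
-- Machine checks of the exponentiation counterexamples from the list above.
example : 2 ^ 3 ≠ 3 ^ 2 := by decide               -- not commutative
example : (2 ^ 2) ^ 3 ≠ 2 ^ (2 ^ 3) := by decide   -- not associative
example : 2 ^ (3 * 3) ≠ 2 ^ 3 * 2 ^ 3 := by decide -- doesn't distribute over ×
```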
You also ask how you prove that something is commutative or associative. Well, you prove that (assuming it's true!) the same way you prove anything else: using the definitions, together with axioms and things you've already proved.
For example, consider the binary operation $*$ on $\mathbb{R}^2$ defined as $$(a, b)*(c, d)=(ac-bd, bc+ad)$$ (you might recognize this as complex multiplication in disguise). Then we can show that $*$ is commutative as follows: $$(a, b)*(c, d)=(ac-bd, bc+ad)=(ca-db, da+cb)=(c, d)*(a, b).$$ The first and third equalities are just the definition of $*$ (applied to $(a,b)*(c,d)$ and $(c,d)*(a,b)$ respectively), and the middle equality uses the commutativity of ordinary multiplication and addition of real numbers (which you either take for granted, or prove first).
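For fun, here is the same little calculation written out in Lean 4. It's only a sketch: I've stated it over the integers rather than the reals to avoid needing any extra libraries, and `cmul`/`cmul_comm` are names I made up. The proof reduces everything to commutativity of $\cdot$ and $+$ on the integers, exactly as in the chain of equalities above.

```lean
-- The operation (a, b) * (c, d) = (ac - bd, bc + ad), stated over Int.
def cmul : Int × Int → Int × Int → Int × Int
  | (a, b), (c, d) => (a * c - b * d, b * c + a * d)

-- Commutativity, reduced to commutativity of * and + on the integers.
theorem cmul_comm (a b c d : Int) :
    cmul (a, b) (c, d) = cmul (c, d) (a, b) := by
  simp only [cmul]  -- unfold the definition on both sides
  rw [Int.mul_comm a c, Int.mul_comm b d, Int.mul_comm b c, Int.mul_comm a d,
      Int.add_comm (c * b) (d * a)]
```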