(This is a response to a question OP asks in the comments which is too long to keep in the comments, but is still relevant as an answer to the original question.)
Model Theory and/or Universal Algebra provide a very general way of approaching the question you're asking about what 'structure' is and what it means for 'isomorphisms' to preserve 'properties' of that structure. The other answers have provided more concrete explanations of this for the specific case of vector spaces, so I'll give a more abstract answer. I will also be approaching $\mathbb{F}$-vector spaces as a single-sorted structure, in contrast to user21820's approach.
A (first-order) language is a triple $\langle \mathcal{F}, \mathcal{R}, \operatorname{ar}\rangle$ consisting of disjoint sets $\mathcal{F}$ and $\mathcal{R}$ and a function $\operatorname{ar} \colon \mathcal{F} \sqcup \mathcal{R} \to \mathbb{N}$. Our interpretation of this triple is that $\mathcal{F}$ denotes a set of function symbols, $\mathcal{R}$ denotes a set of relation symbols, and $\operatorname{ar}$ associates to each function and relation symbol an arity.
In our particular example of $\mathbb{F}$-vector spaces (where $\mathbb{F}$ is any field, e.g., $\mathbb{R}$ or $\mathbb{C}$), we can take $\mathcal{F} = \{+\} \cup \{s_\alpha \mid \alpha \in \mathbb{F}\}$, $\mathcal{R} = \emptyset$, and $\operatorname{ar}(+) = 2$ and $\operatorname{ar}(s_\alpha) = 1$ for each $\alpha \in \mathbb{F}$. $+$ has its usual interpretation as vector addition, while the function symbols $s_\alpha$ correspond to scalar multiplication by $\alpha$ (this is how we get around the 'two-sorted' nature of vector spaces after fixing a field). Optionally you can include a $0$-ary function (i.e., constant) symbol $0$ for the zero vector and/or a unary function symbol $-$ for vector negation, but these are 'definable' in the sense I'll indicate below.
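To make this concrete, here is a minimal Python sketch of a language as plain data. The names `Language` and `vec_lang` are my own, and since the language of $\mathbb{F}$-vector spaces has one symbol $s_\alpha$ per scalar, I fix the finite field $\mathbb{F} = \mathrm{GF}(3)$ so the symbol set stays finite:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Language:
    functions: dict  # function symbol -> arity
    relations: dict  # relation symbol -> arity

GF3 = (0, 1, 2)  # the field GF(3); addition and multiplication are mod 3

# The language of GF(3)-vector spaces: '+' is binary, each 's_a' is unary,
# and there are no relation symbols.
vec_lang = Language(
    functions={'+': 2, **{f's_{a}': 1 for a in GF3}},
    relations={},
)
```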
Given a language $\langle \mathcal{F}, \mathcal{R}, \operatorname{ar}\rangle$, a structure over that language is a triple $\mathbf{A} = \langle A, \langle f^\mathbf{A}\rangle_{f \in \mathcal{F}}, \langle R^\mathbf{A} \rangle_{R \in \mathcal{R}}\rangle$, where $A$ is a set (the carrier or underlying set), such that for each $f \in \mathcal{F}$, $f^\mathbf{A}$ is a function $A^{\operatorname{ar} f} \to A$ and for each $R \in \mathcal{R}$, $R^\mathbf{A}$ is a $(\operatorname{ar} R)$-ary relation over $A$. In other words, a structure over a language is a set and instantiations of all the function and relation symbols (such that the arities line up appropriately). Note that at this point we haven't introduced any 'axioms' or the like.
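Continuing the sketch above (again, the names are my own), a structure supplies a carrier set plus one Python function or relation per symbol. As a running example I take $\mathrm{GF}(3)$ acting on itself as a one-dimensional vector space:

```python
@dataclass(frozen=True)
class Structure:
    carrier: frozenset
    functions: dict  # function symbol -> Python function of matching arity
    relations: dict  # relation symbol -> set of tuples of carrier elements

# GF(3) as a vector space over itself: '+' and each 's_a' are mod-3 arithmetic.
V = Structure(
    carrier=frozenset(GF3),
    functions={'+': lambda v, w: (v + w) % 3,
               # bind a immediately to avoid Python's late-binding in the comprehension
               **{f's_{a}': (lambda a: lambda v: (a * v) % 3)(a) for a in GF3}},
    relations={},
)
```

Note that nothing here checks any axioms yet; a structure is just raw data with the right arities.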
If $\mathbf{A}$ and $\mathbf{B}$ are two structures over a language $\langle \mathcal{F}, \mathcal{R}, \operatorname{ar}\rangle$, an isomorphism between $\mathbf{A}$ and $\mathbf{B}$ is a bijection $\varphi \colon A \to B$ such that for each function symbol $f \in \mathcal{F}$ and each relation symbol $R \in \mathcal{R}$ (say, where $\operatorname{ar} f = \operatorname{ar} R = n$) and all $a_1,a_2,\ldots,a_n \in A$,
$$\varphi(f^\mathbf{A}(a_1,a_2,\ldots,a_n)) = f^\mathbf{B}(\varphi(a_1),\varphi(a_2),\ldots,\varphi(a_n))$$
and
$$\langle a_1,a_2,\ldots,a_n\rangle \in R^\mathbf{A} \iff \langle \varphi(a_1),\varphi(a_2),\ldots,\varphi(a_n)\rangle \in R^\mathbf{B}.$$
In other words, an isomorphism is a bijection which is compatible with the operations (or commutes with the operations) and both preserves and reflects each relation.
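For finite structures, these two conditions can be checked exhaustively. Below is a hedged sketch (the function name `is_isomorphism` is mine, and it assumes the `Language`/`Structure` representation above with finite carriers):

```python
from itertools import product

def is_isomorphism(phi, A, B, lang):
    # phi must be a bijection from A's carrier onto B's carrier
    if len(A.carrier) != len(B.carrier) or set(map(phi, A.carrier)) != B.carrier:
        return False
    # phi must commute with every operation
    for f, n in lang.functions.items():
        for args in product(A.carrier, repeat=n):
            if phi(A.functions[f](*args)) != B.functions[f](*map(phi, args)):
                return False
    # phi must both preserve and reflect every relation
    for R, n in lang.relations.items():
        for args in product(A.carrier, repeat=n):
            if (args in A.relations[R]) != (tuple(map(phi, args)) in B.relations[R]):
                return False
    return True
```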
In our particular example of $\mathbb{F}$-vector spaces with the language I gave above, a structure over the language of $\mathbb{F}$-vector spaces is just a set $V$, a binary operation $+\colon V^2 \to V$, and unary operations $s_\alpha \colon V \to V$ for each $\alpha \in \mathbb{F}$. An isomorphism between such structures is a bijection $\varphi$ satisfying $\varphi(v+w) = \varphi(v)+\varphi(w)$ and $\varphi(s_\alpha(v)) = s_\alpha(\varphi(v))$ for all $v,w$ in the domain of our first structure and all $\alpha \in \mathbb{F}$. So far, this doesn't look quite like the typical definition of an isomorphism of $\mathbb{F}$-vector spaces, which would ask that $\varphi$ satisfy $\varphi(s_\alpha(v)+w) = s_\alpha(\varphi(v)) + \varphi(w)$ for all $v,w \in V$ and $\alpha \in \mathbb{F}$. We'll address that now.
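On the toy structure $V$ from before, $v \mapsto 2v$ is a bijection that commutes with $+$ and every $s_\alpha$ (it's scaling by a nonzero field element), while $v \mapsto v+1$ is a bijection that doesn't:

```python
print(is_isomorphism(lambda v: (2 * v) % 3, V, V, vec_lang))  # True
print(is_isomorphism(lambda v: (v + 1) % 3, V, V, vec_lang))  # False: sends 0+0 to 1, but phi(0)+phi(0) = 2
```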
We like to deal with structures that exhibit particular properties, like groups, rings, posets, etc. One way to realize this in the framework above is to define a first-order theory $\Phi$ over a language $\sigma = \langle \mathcal{F},\mathcal{R},\operatorname{ar}\rangle$ to be a set of well-formed formulas in that language. To give the shortest definition I can of what is meant by 'well-formed formula', fix a countably infinite set of variables $X$ and consider strings over the set $\Sigma = X \cup \mathcal{F} \cup \mathcal{R} \cup \{ \text{`('},\text{`)'},\text{`,'},=,\vee,\wedge,\to,\neg,\leftrightarrow,\forall,\exists\}$, with the usual interpretation of the logical symbols and assuming all of these symbols are distinct. $\mathrm{Term}_\sigma$ is the smallest subset of $\Sigma^\ast$ (the set of strings built up over the alphabet $\Sigma$) such that:
- $X \subseteq \mathrm{Term}_\sigma$,
- if $f \in \mathcal{F}$ is $n$-ary and $t_1,t_2,\ldots,t_n \in \mathrm{Term}_\sigma$, then $f(t_1,t_2,\ldots,t_n) \in \mathrm{Term}_\sigma$ (where if $n=0$ we interpret this as just '$f$').
In other words, $\mathrm{Term}_\sigma$ is the set of expressions built up from the variables and function symbols. Then define $\mathrm{Form}_\sigma$ to be the smallest subset of $\Sigma^\ast$ such that (both grammars are sketched in code after this list):
- For any terms $t,s \in \mathrm{Term}_\sigma$, $(t = s) \in \mathrm{Form}_\sigma$.
- If $\varphi,\psi \in \mathrm{Form}_\sigma$, then $(\varphi \wedge \psi)$, $(\varphi \vee \psi)$, $(\varphi \to \psi)$, $\neg \varphi$, and $(\varphi \leftrightarrow \psi)$ are in $\mathrm{Form}_\sigma$.
- If $\varphi \in \mathrm{Form}_\sigma$ and $x \in X$, then $\exists x \varphi$ and $\forall x \varphi$ are in $\mathrm{Form}_\sigma$.
- If $R \in \mathcal{R}$ is $n$-ary and $t_1,t_2,\ldots,t_n \in \mathrm{Term}_\sigma$, then $R(t_1,t_2,\ldots,t_n) \in \mathrm{Form}_\sigma$.
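As promised, here are both grammars as recursive Python data, continuing the earlier sketch. The class names are my own choices, and I omit $\vee$, $\to$, $\leftrightarrow$, and $\exists$ since they look just like their listed counterparts:

```python
@dataclass(frozen=True)
class Var:        # a variable x in X
    name: str

@dataclass(frozen=True)
class App:        # f(t1, ..., tn); n = 0 gives a constant symbol
    f: str
    args: tuple

@dataclass(frozen=True)
class Eq:         # (t = s)
    lhs: object
    rhs: object

@dataclass(frozen=True)
class Not:        # the negation of a formula
    body: object

@dataclass(frozen=True)
class And:        # conjunction; Or, Implies, Iff would look the same
    lhs: object
    rhs: object

@dataclass(frozen=True)
class ForAll:     # universal quantification; Exists would look the same
    var: str
    body: object

@dataclass(frozen=True)
class Rel:        # R(t1, ..., tn)
    R: str
    args: tuple
```

Terms are whatever you can build from `Var` and `App`; formulas are built from the rest, mirroring the two 'smallest subset' definitions.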
I'll call elements of $\mathrm{Form}_\sigma$ wffs (well-formed formulas). A sentence is a wff $\varphi$ such that every occurrence of a variable $x$ in $\varphi$ lies within a subformula of $\varphi$ of the form $\exists x \psi$ or $\forall x \psi$ -- in other words, sentences are wffs without free variables, or equivalently wffs all of whose variable occurrences are bound. The point of sentences is that they correspond to wffs whose 'truth' in any particular structure shouldn't depend on providing an additional assignment of values to free variables.
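The bound/free distinction is easy to compute over the datatypes above; a wff is a sentence exactly when the following (again, my own sketch) returns the empty set:

```python
def free_vars(phi):
    """Variables with a free occurrence in a term or formula."""
    if isinstance(phi, Var):
        return {phi.name}
    if isinstance(phi, (App, Rel)):
        return set().union(*(free_vars(t) for t in phi.args))
    if isinstance(phi, (Eq, And)):
        return free_vars(phi.lhs) | free_vars(phi.rhs)
    if isinstance(phi, Not):
        return free_vars(phi.body)
    if isinstance(phi, ForAll):
        return free_vars(phi.body) - {phi.var}  # the quantifier binds its variable
    raise TypeError(f'not a term or formula: {phi!r}')

def is_sentence(phi):
    return not free_vars(phi)
```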
Finally we can say that a first-order theory is nothing more than a set $\Phi$ of sentences; in other words, we're picking out sentences which will be our 'axioms'.
In our particular example of $\mathbb{F}$-vector spaces, the theory we take consists of the following sentences/axioms (diverging a bit in notation by using infix notation; one of them is rebuilt as a formula object in the sketch after the list):
- $\forall u \forall v \forall w ((u+v)+w = u+(v+w))$ (associativity of vector addition)
- $\forall v \forall w (v+w = w+v)$ (commutativity of vector addition)
- $\exists z \forall v ((v+z = v) \wedge \exists w (v+w = z))$ (existence of a zero vector and additive inverses; I write $z$ rather than $0$ since we didn't include a constant symbol $0$ in the language)
- For each $\alpha \in \mathbb{F}$, $\forall v \forall w (s_\alpha(v+w) = s_\alpha(v) + s_\alpha(w))$ (scalar multiplication distributes over vector addition)
- For each $\alpha,\beta \in \mathbb{F}$, $\forall v ( s_\alpha(s_\beta(v)) = s_{\alpha \cdot \beta}(v))$ (compatibility of scalar multiplication with field multiplication)
- For each $\alpha,\beta \in \mathbb{F}$, $\forall v (s_{\alpha+\beta}(v) = s_\alpha(v)+s_\beta(v))$ (compatibility of scalar multiplication with field addition)
- $\forall v (s_1(v) = v)$ (identity element of scalar multiplication; here $1$ is the multiplicative identity of the field)
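As promised above, here is the commutativity axiom built from the term/formula datatypes, just to show the pieces fit together (nothing here is standard notation; it's my running sketch):

```python
v, w = Var('v'), Var('w')
commutativity = ForAll('v', ForAll('w', Eq(App('+', (v, w)), App('+', (w, v)))))
print(is_sentence(commutativity))  # True: both v and w are bound
```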
An $\mathbb{F}$-vector space is then a structure over the aforementioned language which satisfies the above first-order theory. Giving a precise definition of satisfaction would be too long, but an intuitive idea probably suffices. It can then be easily shown that for any map $\varphi$ between vector spaces, having $\varphi(\alpha v + w) = \alpha \varphi(v)+\varphi(w)$ for all $v,w$ and $\alpha$ is equivalent to asking for $\varphi(\alpha v) = \alpha \varphi(v)$ and $\varphi(v+w) = \varphi(v)+\varphi(w)$.
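To spell out the less obvious direction: taking $v = w = 0$ and $\alpha = 1$ in $\varphi(\alpha v + w) = \alpha\varphi(v) + \varphi(w)$ gives $\varphi(0) = \varphi(0) + \varphi(0)$, so $\varphi(0) = 0$, and then
$$\varphi(\alpha v) = \varphi(\alpha v + 0) = \alpha\varphi(v) + \varphi(0) = \alpha\varphi(v), \qquad \varphi(v+w) = \varphi(1 \cdot v + w) = \varphi(v) + \varphi(w).$$
The converse direction is immediate: $\varphi(\alpha v + w) = \varphi(\alpha v) + \varphi(w) = \alpha\varphi(v) + \varphi(w)$.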
Last but not least, we double back to what all this means in terms of isomorphisms. The general result is the following:
Theorem. Given a language $\sigma = \langle \mathcal{F},\mathcal{R},\operatorname{ar}\rangle$ and two structures $\mathbf{A}$ and $\mathbf{B}$ over that language, if there exists an isomorphism from $\mathbf{A}$ to $\mathbf{B}$, then for every sentence $\varphi$ in the language, $\mathbf{A}$ satisfies $\varphi$ if and only if $\mathbf{B}$ satisfies $\varphi$.
So isomorphic structures agree with each other on any property/identity/etc that can be written down as a first-order sentence.
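For finite structures, 'satisfies' can even be made executable by brute force, which lets you test the theorem empirically on toy examples. Here's a sketch over the datatypes above (it handles only the connectives I defined; `holds` and `eval_term` are my own names):

```python
def eval_term(t, M, env):
    # env assigns carrier elements to the free variables
    if isinstance(t, Var):
        return env[t.name]
    return M.functions[t.f](*(eval_term(s, M, env) for s in t.args))

def holds(phi, M, env):
    if isinstance(phi, Eq):
        return eval_term(phi.lhs, M, env) == eval_term(phi.rhs, M, env)
    if isinstance(phi, Rel):
        return tuple(eval_term(t, M, env) for t in phi.args) in M.relations[phi.R]
    if isinstance(phi, Not):
        return not holds(phi.body, M, env)
    if isinstance(phi, And):
        return holds(phi.lhs, M, env) and holds(phi.rhs, M, env)
    if isinstance(phi, ForAll):  # quantify over the (finite) carrier
        return all(holds(phi.body, M, {**env, phi.var: a}) for a in M.carrier)
    raise TypeError(f'not a formula: {phi!r}')

print(holds(commutativity, V, {}))  # True: addition in V is commutative
```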
Disclaimer: The specific way I defined what a wff is isn't entirely universal; some approaches use Polish notation, some move $=$ from being a logical symbol to being a relation symbol and then add appropriate axioms characterizing it, some use a single variable symbol and include a way to generate infinitely many variables out of that (e.g., given $x$, we add another logical unary operation symbol $'$ such that $x'$, $x''$, $x'''$, etc. are all independent logical variables), etc. The specific way I wrote the axioms for $\mathbb{F}$-vector spaces isn't canonical either (though it's pretty typical), nor is the language I used (as mentioned above, a constant ($0$-ary function) symbol $0$ and/or a unary function symbol $-$ might be used). Some folks like to distinguish constant symbols from function symbols. Etc. etc.