Let $(V, ||\cdot ||)$ be a finite-dimensional normed vector space. Let $S(r) = \{ v \in V: ||v|| = r \}$ be the sphere of radius $r$ in $V$. Assume, if necessary*, that the length/measure of $S(1)$ is finite -- in other words, that the circumference of the unit sphere is finite.
Question: Then do we have, for any choice of norm $|| \cdot ||$ on $V$, that for all $r > 0$: $$\operatorname{length}(S(r)) = r \cdot \operatorname{length}(S(1)) \,\,? $$ (As sets, $S(r) = r \cdot S(1)$ is immediate from homogeneity; the question is about the lengths.)
In other words, do the circumferences of spheres in any finite-dimensional normed space scale linearly with respect to their radii?
A pointer to a reference, or a yes/no, will suffice for an answer -- I just want to know whether the result is correct before spending time actually trying to prove it.
Note: In order for the notion of the circumference of a sphere to be well-defined, we presumably need the notion of length to be well-defined first. However, I am not sure whether every finite-dimensional normed vector space can be construed as a length space -- completeness is necessary for being a geodesic space, but not for being a length space. If necessary, assume that the ground field is $\mathbb{R}$.
My gut instinct is that the answer to this question is yes, and that this follows from the absolute homogeneity of any norm (i.e. $||rv|| = |r| \, ||v||$). However, I am not sure how to set up the integral expressing the circumference of the unit sphere in an arbitrary normed vector space. With such an expression in hand, it would be easier to see whether absolute homogeneity plus linearity of integrals automatically implies the result.
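One way to sidestep the integral entirely (a sketch, not a full proof) is to define length via polygonal approximation -- the standard rectifiable-curve definition -- under which the scaling claim reduces exactly to absolute homogeneity:

```latex
% Length of a curve \gamma: [0,1] \to V via polygonal approximation:
\[
  L(\gamma) \;=\; \sup_{0 = t_0 < \cdots < t_n = 1}
    \;\sum_{i=0}^{n-1} \big\| \gamma(t_{i+1}) - \gamma(t_i) \big\|.
\]
% If \gamma traces S(1), then \gamma_r := r\gamma traces S(r), and by
% absolute homogeneity every summand scales by the factor r:
\[
  \big\| r\gamma(t_{i+1}) - r\gamma(t_i) \big\|
    \;=\; r \,\big\| \gamma(t_{i+1}) - \gamma(t_i) \big\|
  \quad\Longrightarrow\quad
  L(\gamma_r) \;=\; r\, L(\gamma).
\]
```

Taking the supremum over partitions commutes with multiplication by the constant $r$, so this definition needs no differentiability of $\gamma$ at all.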
In particular, I am not sure how to find a continuous map $\gamma: [0,1] \to V$ whose trace/image is the unit sphere of a given norm $|| \cdot ||$ (in order to set up an integral computing the length of the sphere, i.e. its circumference). Presumably it would then be easy to modify such a map so that its trace is $S(r)$ for any $r > 0$, and this, together with linearity of integrals and absolute homogeneity of norms, would tell me immediately whether the conjecture is correct.
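In two dimensions (where the sphere is a closed curve, so a parametrization by $[0,1]$ makes sense), one concrete choice is the radial projection of the Euclidean unit circle onto the unit sphere of the given norm. The sketch below (hypothetical helper names, assuming $V = \mathbb{R}^2$) uses this parametrization to approximate circumferences by inscribed polygons and checks the scaling numerically; the test case is the $1$-norm, whose unit sphere is the square with vertices $(\pm 1, 0), (0, \pm 1)$, of circumference $8$ as measured in the $1$-norm itself.

```python
import numpy as np

def sphere_param(t, norm):
    """Radial projection of the Euclidean unit circle onto the unit sphere
    of `norm`: gamma(t) = u(t) / norm(u(t)) with u(t) = (cos 2*pi*t, sin 2*pi*t)."""
    u = np.array([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
    return u / norm(u)

def circumference(norm, r=1.0, n=20000):
    """Inscribed-polygon approximation to the length of S(r), measured in `norm`."""
    ts = np.linspace(0.0, 1.0, n + 1)
    pts = np.array([r * sphere_param(t, norm) for t in ts])
    return sum(norm(pts[i + 1] - pts[i]) for i in range(n))

# The 1-norm on R^2: its unit sphere is a square whose four edges each
# have 1-norm length 2, so the circumference should be close to 8.
one_norm = lambda v: abs(v[0]) + abs(v[1])
c1 = circumference(one_norm, r=1.0)
c2 = circumference(one_norm, r=2.0)
print(c1)        # close to 8
print(c2 / c1)   # close to 2, consistent with linear scaling
```

Each polygonal segment scales by the factor $r$ exactly (by absolute homogeneity), so the ratio `c2 / c1` is $2$ up to floating-point error regardless of the discretization.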
*I suspect this assumption is unnecessary: all norms on a finite-dimensional vector space are topologically equivalent, so if the unit sphere of one norm has finite circumference, perhaps the same holds for every norm. On the other hand, boundedness of a set is not in general preserved under homeomorphisms (i.e. "boundedness is not a topological property"), so it seems I would need additional assumptions, and I am not certain that the axioms of a normed vector space suffice. Then again, this probably all follows from the fact that the unit sphere/ball in any finite-dimensional normed vector space is compact: compactness is a topological property, hence preserved under the equivalence of norms, and compact sets in a normed space are bounded, so the unit sphere is bounded with respect to every norm. Still, this is somewhat conjectural on my part.