Definition : A curve $\omega : [0,1]\to X$ into a metric space $(X,d)$ is said to be absolutely continuous whenever there exists $g\in L^1([0,1])$ such that $d(\omega(t_0),\omega(t_1))\le\int_{t_0}^{t_1}g(s)\,ds$ for every $0\le t_0<t_1\le 1$.
I would like to show that, according to the above definition, the graph of the Cantor function is not absolutely continuous. I think this is true since I read that for absolutely continuous curves $\omega:\mathbb{R}\to\mathbb{R}^d$ one can take $g(t)=\|\dot{\omega}(t)\|$ in the definition. However, if $\omega(t)=(t,c(t))$ is the graph of the Cantor function $c$, we have $d(\omega(0),\omega(1))=d((0,0),(1,1))=\sqrt{2}$, while $c'(t)=0$ a.e., so $\int_0^1 \|\dot{\omega}(t)\|\,dt=\int_0^1 \sqrt{1^2+0^2}\,dt=1$ and therefore $d(\omega(0),\omega(1))\not\leq\int_0^1 \|\dot{\omega}(t)\|\,dt$.
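For what it's worth, the endpoint computation can be checked numerically. Below is a minimal sketch; the helper `cantor` and the recursion depth are my own choices, using the standard self-similar recursion for the Cantor function:

```python
import math

def cantor(x, depth=40):
    """Approximate the Cantor function c on [0, 1] via its
    self-similar recursion; `depth` caps the recursion, so the
    result is exact up to roughly 2**-depth."""
    if depth == 0:
        return x  # crude base case; its error is scaled down by 2**-depth
    if x <= 1/3:
        return 0.5 * cantor(3 * x, depth - 1)
    if x < 2/3:
        return 0.5
    return 0.5 + 0.5 * cantor(3 * x - 2, depth - 1)

# omega(t) = (t, c(t)) is the graph curve.
# Endpoint displacement d(omega(0), omega(1)):
chord = math.hypot(1.0 - 0.0, cantor(1.0) - cantor(0.0))
print(chord)  # sqrt(2), since c(0) = 0 and c(1) = 1
# The a.e. speed of omega is sqrt(1 + 0) = 1, so the candidate
# integral of ||omega'|| over [0, 1] is only 1 < sqrt(2).
```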
I was wondering if there is a more direct way to show that the Cantor curve is not absolutely continuous (according to the above definition), without using the result that $g=\|\dot{\omega}\|$. I was thinking of arguing by contradiction : "Suppose there exists $g$ with this property...", but I am not sure how to carry this out.
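One way the contradiction might be run directly, using only the absolute continuity of the integral of $g$ (this is my own sketch; write $\omega(t)=(t,c(t))$ for the Cantor curve and $C$ for the Cantor set): suppose $g\in L^1([0,1])$ satisfies the definition. Since $|C|=0$, absolute continuity of the integral gives an open set $U\supseteq C$ with $\int_U g(s)\,ds<1$. Write $U\cap[0,1]=\bigcup_i(a_i,b_i)$ as a disjoint union of intervals. Each connected component of $[0,1]\setminus U$ lies in the open set $[0,1]\setminus C$, on whose components $c$ is constant, so all the increase of $c$ happens inside $U$ and $\sum_i\bigl(c(b_i)-c(a_i)\bigr)=c(1)-c(0)=1$. But then
$$1=\sum_i\bigl(c(b_i)-c(a_i)\bigr)\le\sum_i d\bigl(\omega(a_i),\omega(b_i)\bigr)\le\sum_i\int_{a_i}^{b_i}g(s)\,ds\le\int_U g(s)\,ds<1,$$
a contradiction.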
Maybe it helps to know that the length of the Cantor curve is $2$ from this discussion, so in particular the length of the curve differs from the integral of the pointwise speed.
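As a sanity check on this, one can compute the inscribed polygonal length at the ternary partition $t_k=k/3^n$. The sketch below uses my own helper names; `cantor_ternary` evaluates $c(k/3^n)$ exactly from the ternary digits of $k$, and the polygonal lengths increase toward $2$:

```python
import math

def cantor_ternary(k, n):
    """Exact value of the Cantor function at k / 3**n, read off the
    ternary digits of k: halve digits 0/2 until the first digit 1,
    which contributes a final binary digit 1."""
    if k == 3**n:
        return 1.0
    val, bit = 0.0, 0.5
    for i in range(n - 1, -1, -1):
        d = (k // 3**i) % 3
        if d == 1:
            return val + bit
        val += (d // 2) * bit
        bit /= 2
    return val

def polygonal_length(n):
    """Length of the polygon through the points (k/3^n, c(k/3^n))."""
    pts = [cantor_ternary(k, n) for k in range(3**n + 1)]
    return sum(math.hypot(3.0**-n, pts[k + 1] - pts[k])
               for k in range(3**n))

for n in (2, 6, 10):
    print(n, polygonal_length(n))
# The values increase toward 2, the length of the Cantor curve,
# while the integral of the a.e. speed is only 1.
```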
Edit
Also, I was wondering whether the definition of absolutely continuous curve is equivalent to the (for me) more natural condition that the metric derivative exists almost everywhere and that for every $a,b$ with $0\le a \le b\le 1$ we have $$\operatorname{length}_{a,b}(\omega)=\int_a^b|\dot{\omega}|(t)\,dt,$$ where $$\operatorname{length}_{a,b}(\omega):=\sup\left\{\sum\limits_{k=0}^{n-1}d(\omega(t_k),\omega(t_{k+1})) : n\ge 1,\ a=t_0<t_1<\ldots<t_n=b \right\}$$ and $|\dot{\omega}|(t)$ is the metric derivative, defined by $$|\dot{\omega}|(t):=\lim\limits_{h\to0}\frac{d(\omega(t+h),\omega(t))}{|h|}.$$
NB : the definition of absolute continuity uses an inequality because, although for an absolutely continuous function $f:I\to\mathbb{R}$ we have $f(y)-f(x)=\int_x^y f'(t)\,dt$, the analogous identity fails for curves $f:[0,2]\to\mathbb{R}^n$. Take $f(t)=(t,0)$ for $t\in[0,1]$ and $f(t)=(1,t-1)$ for $t\in[1,2]$. Then $d(f(0),f(2))=\sqrt{2}<2=d(f(0),f(1))+d(f(1),f(2))=\int_0^2 |f'|(t)\,dt$.
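The gap in this L-shaped example is easy to check numerically (a minimal sketch; the name `f` mirrors the curve above):

```python
import math

def f(t):
    """L-shaped curve: right along the x-axis on [0,1], then up on [1,2]."""
    return (t, 0.0) if t <= 1.0 else (1.0, t - 1.0)

x0, y0 = f(0.0)
x2, y2 = f(2.0)
chord = math.hypot(x2 - x0, y2 - y0)  # d(f(0), f(2)) = sqrt(2)
arc = 2.0                             # |f'| = 1 a.e. on [0, 2]
print(chord, arc)  # strict inequality: sqrt(2) < 2
```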