Often with new mathematical objects, it's not that the object has no meaning, but that we're asking the wrong question, or asking it in the wrong context. It makes sense to double my apples, but it makes no sense to multiply my apples by i. That's asking the wrong question. One way of understanding multiplication by a complex number is that multiplying by a real number scales and multiplying by an imaginary number rotates, so if we want a physical example where multiplying by imaginary numbers makes sense, we need a context that allows for rotation. A count of apples doesn't work, but perhaps modifying the velocity vector of my car would. We aren't accustomed to saying, "Double our velocity" or "i-ble" our velocity (turn 90 degrees), but it would make some sense, if we're asking the right question. We definitely could talk about multiplying by imaginary numbers whenever we rotate ourselves, even though we commonly don't.
What question, what context, could make sense for a half derivative? We have one famous context that I know of, namely the Riemann Hypothesis and the functions that surround it. There are probably other contexts.
In the Riemann Hypothesis, we have essentially a Fourier analysis of the prime numbers, and the frequency-space function $F(s)$ is related somehow to the Riemann zeta function. (Question for the audience, for my benefit: is the frequency-space function equivalent to the $Re(z) = 1/2$ slice?) Hopefully my ignorance won't hinder the discussion here too much.
The Riemann zeta function is an interesting one, which, for $Re(z) > 1$, can be defined simply like this:
$$
\zeta(z) = \sum_{k=1}^\infty \frac{1}{k^z}
$$
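As a concrete sketch of that definition (Python is my choice here, not something from the original discussion), the series can be summed directly for any $z$ with $Re(z) > 1$, real or complex, and checked against the known value $\zeta(2) = \pi^2/6$:

```python
import numpy as np

def zeta_series(z, terms=200_000):
    """Partial sum of the Dirichlet series for zeta(z); converges only for Re(z) > 1."""
    k = np.arange(1, terms + 1, dtype=float)
    return np.sum(k ** (-z))

# Sanity check: zeta(2) should be close to pi^2 / 6 (convergence is slow, so only roughly).
print(zeta_series(2.0), np.pi ** 2 / 6)

# The same series works for complex z as long as Re(z) > 1.
print(zeta_series(2.0 + 3.0j))
```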
Since this definition combines both exponentiation and addition, it makes things tricky when we want to move around the graph of $\zeta(z)$. How would we shift the argument from $z$ to $z+2$, for example? For integer amounts of shift we might be able to rely on a binomial expansion or something similar, but the number of terms becomes unwieldy quickly.
Differentiation and integration of a polynomial provide a convenient alternative, since they are linear operators and they modify the exponents. If we could differentiate the terms of $\zeta(z)$ with respect to $k$, the exponent would change, giving every term a multiplicative factor, but we can deal with that.
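To spell out that multiplicative factor (my own small worked step, treating $k$ formally as a continuous variable and then summing over the integers again): differentiating a single term with respect to $k$ shifts the exponent by one and pulls down a factor of $-z$,
$$
\frac{d}{dk} k^{-z} = -z\, k^{-(z+1)},
\qquad
\sum_{k=1}^\infty \left(-z\, k^{-(z+1)}\right) = -z\, \zeta(z+1),
$$
so shifting the argument by one costs only a single factor of $-z$, which is far easier to track than a binomial expansion.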
Confirmation that differentiation is somehow related to the values of $\zeta(z)$ can be found in the equations that link the Riemann zeta function and the polygamma function $\psi_n(z)$, where different values of $z$ for the Riemann zeta function correspond to different orders of derivative in the polygamma function:
$$
\psi_n(1) = (-1)^{n+1} n! \zeta(n+1)
$$
where $\psi_n(z)$ is the $n$th derivative of the digamma function. For fractional derivatives, and with the gamma function $\Gamma(n+1)$ in place of the factorial $n!$, this can be made valid for non-integer $n$.
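As a quick numerical sanity check of this identity at integer orders (a sketch using SciPy's standard special functions; checking non-integer $n$ would need an actual fractional-derivative implementation, which I'm not attempting here):

```python
from scipy.special import gamma, polygamma, zeta

# Check psi_n(1) = (-1)^(n+1) * n! * zeta(n+1), writing n! as Gamma(n+1).
for n in range(1, 6):
    lhs = polygamma(n, 1.0)
    rhs = (-1) ** (n + 1) * gamma(n + 1) * zeta(n + 1)
    print(n, lhs, rhs)
```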
But that identity only reaches the Riemann zeta function at integer values of $z$, whereas the points in question all lie on the $Re(z) = 1/2$ line. We could probably use half derivatives of $\psi_n(z)$ to get there, and maybe we could use imaginary orders of derivative to traverse imaginary-wise along the critical line. I would wager that this impetus is what led Riemann and Cauchy to develop the beginnings of fractional calculus in the first place.
Other contexts might deal with exponential functions, which are eigenfunctions of the derivative operator. Repeated differentiation raises the eigenvalue to successively higher integer powers. If the order of differentiation can be real-valued instead of just integer-valued, and if the resulting eigenvalues have physical meaning, then fractional calculus will also find meaning there.
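A small worked illustration of that idea (under the Liouville/Weyl convention for the fractional derivative of an exponential, and assuming $a > 0$): the integer-order pattern
$$
\frac{d^n}{dx^n} e^{ax} = a^n e^{ax}
$$
extends directly to non-integer orders,
$$
\frac{d^\nu}{dx^\nu} e^{ax} = a^\nu e^{ax}, \qquad \nu \in \mathbb{R},
$$
so any system whose states are built from exponentials, and in which a non-integer power of the rate constant $a$ means something physically, is a candidate context for fractional calculus.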