I am going through the TensorFlow tutorial and noticed that they use one-hot encoding for regression in TensorFlow. I don't fully understand how it works. Let us take an oversimplified case of ordinary least squares regression. Assume we have y = [1,2,3] and x = [cat, dog, mouse]. Converting to one-hot vectors we get
cat = [0,0,1]
dog = [0,1,0]
mouse = [1,0,0]
How does the regression equation look now? Is it a multivariate regression now?
y = alpha + beta_1*x_1 + beta_2*x_2 + beta_3*x_3,
where x_1, x_2, x_3 are the coordinates of the one-hot vector?
P.S. I am interested more in the mechanics of this setup, not so much the meaning.
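To make the mechanics concrete, here is a minimal sketch (my own, not from the tutorial) that stacks the three one-hot vectors into a design matrix, adds an intercept column for alpha, and solves the least-squares problem with NumPy. Note that an intercept plus all three dummy columns makes the matrix rank-deficient (the "dummy variable trap"), which is why `lstsq` is used rather than a direct inverse:

```python
import numpy as np

# One-hot encodings from the question (cat, dog, mouse).
X = np.array([[0, 0, 1],   # cat
              [0, 1, 0],   # dog
              [1, 0, 0]])  # mouse
y = np.array([1.0, 2.0, 3.0])

# Design matrix for y = alpha + beta_1*x_1 + beta_2*x_2 + beta_3*x_3:
# a column of ones for the intercept, then the one-hot columns.
A = np.column_stack([np.ones(len(y)), X])

# With an intercept plus every dummy column, A is rank-deficient,
# so lstsq returns the minimum-norm least-squares solution.
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
alpha, betas = coef[0], coef[1:]

print(A @ coef)  # the fit reproduces y exactly: [1. 2. 3.]
```

So yes, mechanically it becomes a multiple regression with one coefficient per one-hot coordinate; each beta just shifts the prediction for its category relative to the intercept.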