My question is simple: I would like to compute the time complexity of an algorithm related to image processing.
I have simplified the algorithm so that we can focus only on the problematic part.
Algorithm
You have an image/matrix of size $N = n \times m$ and a block of size $b \times b$ (the block is just a window or rectangle that slides over the picture of pixels, i.e. the matrix of intensities).
The algorithm visits most pixels in the image as centers of the block (so a block may extend past the beginning or end of a row), moving with a step of size $s$. The step is smaller than the block size, $s < b$, so the blocks overlap (some pixels are considered in multiple blocks). For each block it performs some operations whose cost depends on the block size $b$ and is $O(b^2)$.
Concretely, the algorithm starts on row $1$ and processes blocks centered at columns $1, s+1, 2s+1, \ldots$; then it moves to row $s+1$ and does the same as on row $1$, then to row $2s+1$, and so on. At every iteration the algorithm performs some operations related to the size of the block (which can be $3 \times 3$, $5 \times 5$, ..., and $s$ can be $b/8$, for instance).
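To make the traversal concrete, here is a minimal Python sketch of the loop structure I have in mind; `process_block` and the per-block work are placeholders standing in for the actual $O(b^2)$ operations, not my real code:

```python
import numpy as np

def process_block(block):
    # Hypothetical O(b^2) operation, e.g. summing all intensities in the block.
    return block.sum()

def sliding_block_pass(image, b, s):
    """Visit block centers on a grid with step s and do O(b^2) work per block.

    image : 2D array of shape (n, m)
    b     : block (window) side length, assumed odd here
    s     : step between consecutive block centers, with s < b
    """
    n, m = image.shape
    half = b // 2
    for i in range(0, n, s):          # rows 1, s+1, 2s+1, ... (0-based here)
        for j in range(0, m, s):      # columns 1, s+1, 2s+1, ...
            # Clip the block at the image borders; blocks overlap since s < b.
            block = image[max(0, i - half): i + half + 1,
                          max(0, j - half): j + half + 1]
            process_block(block)      # placeholder for the O(b^2) work

# Example: a 512x512 image, 9x9 blocks, step s = 1 (roughly b/8)
img = np.random.rand(512, 512)
sliding_block_pass(img, b=9, s=1)
```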
Any help or reference related to this topic is welcome.