First, you can always assume GL2, I think, as you won't find much pre-2.0 hardware around. Next, it is mostly a question of the major version, as major versions mainly depend on the hardware, whereas minor versions are often more of a driver question. For example, hardware supporting GL 3.1 is likely to support GL 3.2 (and 3.3), too.
Next, it depends on the features you need. These are, IMHO, the major features of the newer versions (forgive me if I forgot your favourite feature):
OpenGL 3:
- geometry shaders
- texture buffers
- instanced rendering
- integer textures and attributes
- integer operations in shaders
- transform feedback
- FBOs
- floating point textures
The first six of these are real hardware features of GL3/DX10 hardware, whereas the last two (important ones, I think) have been core since 3.0, but are also supported on most newer GL2 hardware via extensions (since the GeForce 6, I think).
OpenGL 4:
- tessellation shaders
- double precision attributes and shader operations (and textures?)
- improvements in shader management
- image load/store operations
In my opinion GL3 brings some really nice features, but there is nothing that prevents you from writing modern and future-ready GL applications (without fixed-function) with only OpenGL 2.0/2.1, which can later easily be improved by features of newer hardware. Maybe nowadays GL3 is also a valid requirement, although I still sit on 2.1 hardware. But honestly, it just depends on the hardware features you need, and GL 2.0/2.1 might suffice as a solid baseline. Since you say you last used the old, now-deprecated way of OpenGL, though, you might first need to acquaint yourself with the modern shader-only approach, as this is the way to do hardware-accelerated real-time graphics today (and tomorrow).
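As a rough sketch of what "shader-only" means, here is a minimal GLSL 1.10 (GL 2.0-level) shader pair replacing fixed-function transform and texturing; the `u_`/`a_`/`v_` names are just my own convention:

```glsl
// --- vertex shader: replaces the fixed-function transform ---
uniform mat4 u_mvp;          // model-view-projection, set by the app
attribute vec3 a_position;   // per-vertex data, bound by the app
attribute vec2 a_texcoord;
varying vec2 v_texcoord;     // passed on to the fragment shader

void main()
{
    v_texcoord  = a_texcoord;
    gl_Position = u_mvp * vec4(a_position, 1.0);
}

// --- fragment shader: replaces fixed-function texturing ---
uniform sampler2D u_texture;
varying vec2 v_texcoord;

void main()
{
    gl_FragColor = texture2D(u_texture, v_texcoord);
}
```

Everything the fixed-function pipeline used to do for you (matrix transforms, lighting, texture combining) you now express explicitly like this, which also makes the later step to GL3+ core profiles much smaller.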
Finally, if you're more familiar with DirectX (and that seems to be what people classify hardware by nowadays), the rough correspondences GL2 ~ DX9, GL3 ~ DX10 and GL4 ~ DX11 may also guide you.