Well, the hardware needs to be properly aligned, so that the point where it senses your touch matches up with what you see yourself touching.
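To make that concrete, here's a minimal sketch (in TypeScript, with made-up numbers) of the kind of mapping that turns raw digitizer coordinates into screen pixels. Real calibration constants come from the factory or a calibration step; the values below are purely illustrative:

```typescript
interface Point { x: number; y: number; }

// Hypothetical per-axis scale and offset; a real driver derives these
// from calibration data, not hard-coded constants.
const CALIBRATION = { scaleX: 0.978, offsetX: 4.2, scaleY: 1.013, offsetY: -2.7 };

// Map a raw sensor reading onto screen coordinates. If these constants are
// off, every touch lands slightly away from where you think you touched.
function rawToScreen(raw: Point): Point {
  return {
    x: raw.x * CALIBRATION.scaleX + CALIBRATION.offsetX,
    y: raw.y * CALIBRATION.scaleY + CALIBRATION.offsetY,
  };
}
```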
There are also general algorithms the system applies to decide where the center of your touch is, since your finger covers an area rather than a single point.
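A rough sketch of that idea, not any particular vendor's algorithm: treat the contact patch as a handful of sensor readings and report the signal-weighted centroid as the single touch point.

```typescript
// Each reading is one sensor cell: its position plus how strongly it was activated.
function touchCentroid(readings: { x: number; y: number; signal: number }[]) {
  const total = readings.reduce((sum, r) => sum + r.signal, 0);
  if (total === 0) return null; // no contact detected

  // Weight each cell's position by its signal strength.
  return {
    x: readings.reduce((sum, r) => sum + r.x * r.signal, 0) / total,
    y: readings.reduce((sum, r) => sum + r.y * r.signal, 0) / total,
  };
}
```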
Finally, an app needs to decide what to do with a touch. When an app uses simple widgets like buttons, mapping the touch position to a widget is fairly straightforward. In the browser, however, it's much more difficult: determining the position of a control rendered dynamically via HTML (and JavaScript, etc.) is a fairly complex task. You'll find that most touch browsers, even on other platforms, have varying levels of difficulty with touch accuracy.
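A simplified illustration of the browser's side of the problem: `document.elementFromPoint` is a real DOM API, but the "slop" radius and the probing pattern here are just assumptions, meant to show why hit-testing a dynamically laid-out page is harder than hitting a fixed grid of native buttons.

```typescript
// Try to find something tappable at or near the reported touch point.
function hitTest(touchX: number, touchY: number, slop = 8): Element | null {
  // Probe the exact point first, then a few nearby offsets to compensate
  // for the finger covering an area rather than a single pixel.
  const offsets = [[0, 0], [slop, 0], [-slop, 0], [0, slop], [0, -slop]];
  for (const [dx, dy] of offsets) {
    const el = document.elementFromPoint(touchX + dx, touchY + dy);
    const target = el?.closest('a, button, input, [onclick]');
    if (target) return target; // found an interactive element near the touch
  }
  // Nothing interactive nearby; fall back to whatever is directly under the touch.
  return document.elementFromPoint(touchX, touchY);
}
```

The point of the sketch is that the browser has to resolve the touch against whatever the page happens to have laid out at that instant, which is why results vary so much between browsers and pages.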
So it depends on both hardware and software, but the variance you're seeing between apps is probably all due to the apps.