As mentioned by others, it's a matter of trade-offs and of having the right knowledge.
One pitfall you might want to consider is this: you mention in your question that you see "cross-platform" as an advantage of the web. But does it really hold? Think of it this way: if you develop something for the desktop, you need to define the list of platforms you support and their requirements.
Make no mistake, it's the same for the web. And even though it's already tremendously simpler than it used to be, if you build an app for the general public, you're going to have to deal with every possible version of every web browser out there. And if it's more of an enterprise app, then brace yourself and prepare to draft your supported-browser requirements very precisely.
Don't think you'll avoid having platform-specific hacks here and there if you build anything significant.
And then the fun part: what is best? Browsers that update themselves almost transparently and very regularly, like Chrome? Or the ones that roll out security updates only monthly and major features once every stone age (like IE)? The answer is not as obvious as you might think, because some of those frequent "transparent" updates might break your code, and you'll need to track them and react promptly, or keep an eye on beta and dev pre-releases while developing and testing. For every one of the browsers you foolishly said you wanted to support (good luck).
Oh, and let's not forget UI considerations. You also face the joy of deciding whether you want a consistent UI ACROSS all your target platforms, or a consistent UI WITH each host platform. See all those little buttons on web pages? Do you want them to look exactly the same everywhere, or to integrate with the environment your user is in? Of course this problem isn't new and existed in other development models, but it seems more pressing here, and it depends on the type of users you target and what they expect. Public end users will tend to want you to integrate with their platform - but will still want you to "wow!" them with fancy stuff - while enterprise users will want something that looks like a desktop app. And mobile platforms add a new dimension to all this.
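To make the distinction concrete, here's a minimal, purely illustrative sketch of the "consistent WITH the host" option - detecting the host platform and applying a matching theme (the class names and the crude platform sniffing are my own assumptions, not a recommendation):

```javascript
// Hypothetical sketch: apply a platform-flavored theme so the app's widgets
// follow the host's conventions. Class names are made up for illustration.
var platform = navigator.platform || "";
var theme = /Mac/.test(platform) ? "theme-macos"
          : /Win/.test(platform) ? "theme-windows"
          : "theme-default";
document.documentElement.className += " " + theme;
```

The "consistent ACROSS platforms" option is simply the opposite: one branded stylesheet everywhere, deliberately ignoring the host's conventions.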
Regarding the last two points, a common approach is to package a pre-configured web browser with your installer, which then connects to your web app (hosted locally or on the web). It's great because you control the update frequency, you can "freeze" the state, and you know exactly what to support and test on. Plus you can add cool stuff like dedicated user extensions. For instance, packaging a "frozen" Chromium with small Chrome extensions you've developed to make your web app easier to use for different types of users can be extremely nice. On the other hand... you're now responsible if a security breach occurs because you froze the release cycle, and your app won't benefit from speed improvements (if any).
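To give a rough idea of what such a "dedicated extension" might look like (the names are entirely made up, and a real extension would also need a manifest declaring where this script runs):

```javascript
// Hypothetical content script for the packaged ("frozen") Chromium build:
// inject a small role-specific toolbar into the bundled web app.
(function () {
  var toolbar = document.createElement("div");
  toolbar.id = "power-user-toolbar";          // illustrative id
  toolbar.textContent = "Power-user shortcuts";
  toolbar.style.cssText =
    "position:fixed;top:0;right:0;padding:4px 8px;background:#eee;z-index:9999;";
  document.body.appendChild(toolbar);
})();
```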
Like many things, it's a double-edged sword.
Note: I have a strong bias against the web for being basically a big pile of half-baked technologies (and I'm being polite here), down to the OSI layers, on which we keep adding layers of crap that hide the problems underneath without ever really fixing them.
That being said, I am in favor of the web for its ubiquitous nature as a platform. I think your company's move is (probably) the right one. It depends on your target market and the platforms you aim for, obviously. If you want to expose something as a service, then you're probably good to go (though the web isn't strictly necessary for that either). If you don't, then maybe there aren't that many reasons to make the switch.
Hmm, and expect some fun developments in the future, now that lighter-weight variants of existing operating systems keep spawning for mobile environments (netbooks, smartphones, PDAs, tablets, eBooks...), with more emphasis on lightweight embedded browsers... each bringing its own share of UI rendering glitches.
Regarding plugin-based technologies... I'd say steer away from them. They will enhance your app's power, but will limit its market penetration. In some cases they will look like a plus for cross-platform support, right up until a new platform suddenly refuses to support them. Web standards are here for a reason (but be careful not to get too excited about everything in HTML5 either, or it might blow up in your face).
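The standards-friendly alternative to betting on a plugin is plain feature detection; here's a minimal sketch (the localStorage probe is just one illustrative capability):

```javascript
// Hedged sketch: check for an HTML5 capability before relying on it,
// instead of assuming a plugin or a specific browser is present.
function hasLocalStorage() {
  try {
    var key = "__probe__";
    window.localStorage.setItem(key, key);
    window.localStorage.removeItem(key);
    return true;
  } catch (e) {
    return false; // access can throw when storage is disabled or full
  }
}

if (hasLocalStorage()) {
  localStorage.setItem("lastVisit", new Date().toISOString());
} else {
  // degrade gracefully rather than requiring a plugin
}
```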
EDIT: other things to consider...
Recruitment
It's extremely hard to find knowledgeable web developers. You'd think there's a herd of them, but they're lost in a huge pool of, well, fairly incompetent people who think that having managed to write 700 lines of JavaScript/ECMAScript to implement some form validation is the be-all and end-all of high-level skills.
I'm not kidding: lately my first question in every web-development interview is how to declare a variable, and then (depending on how they answer) whether there's a difference between using `var` or not. It's depressing. I find it a lot harder to find an average or advanced web developer than an average or advanced desktop developer.
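For context, here is roughly the distinction a decent answer should cover (a minimal sketch, assuming non-strict mode):

```javascript
function demo() {
  var local = 1;   // declared with var: scoped to this function
  implied = 2;     // no var (non-strict mode): silently creates a global
}
demo();

console.log(typeof local); // "undefined" - the var-declared one stayed local
console.log(implied);      // 2 - the undeclared one leaked onto the global object
```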
Perception
No one will ever take you seriously when you say "I'm a web developer". Web development is for a sub-class of programmers, isn't it? The ones you ignore and mock from afar, and don't join when they go get coffee. :)
This is obviously untrue, but it comes down to the fact that you develop for an environment that is mostly managed for you. Browsers correct your screwed-up markup and your screwed-up styles, and some of them will even correct your screwed-up scripting and optimize it for you if you please. And since you're a web developer, people won't assume you have a clue about lower-level programming, so you must be a complete idiot, right?
And then they realize how crazily complex ECMAScript can be, but they'll refuse to revise their opinion. Because it's the web. We don't like it intrinsically; we just like what it enables us to do.
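A tiny example of the kind of subtlety I mean (a sketch, assuming non-strict code running as a plain script):

```javascript
// "this" depends on how a function is called, not on where it was defined.
var counter = {
  count: 0,
  increment: function () { this.count += 1; }
};

counter.increment();          // called on counter: counter.count becomes 1
var inc = counter.increment;
inc();                        // called "bare": in non-strict code "this" is the
                              // global object, so this increments a global
                              // "count" instead of counter.count
console.log(counter.count);   // 1 - the second call never touched counter
```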