104

If an engine supports Windows, OS X, and Linux, why do we sometimes see games using these engines, like Space Hulk: Deathwing, restricted to Windows only?

Gnemlock
Nathan

5 Answers

158

Technical Reasons:

  • Game code made platform-specific: While making their games, developers can sometimes rely on platform-specific functions. Even if the game engine can build the game for multiple platforms, the game-specific code might make a Windows-specific call that either doesn't exist on other platforms or would require remaking a difficult part of the game (licensing services, file handling systems, etc.).
  • Lack of Capable Machines: For a long time, most Apple computers did not ship with enough graphics power to run most games. So why release somewhere the users will most likely have a bad experience? This is slowly changing thanks to better integrated graphics, but it might still be a reason why some go Windows-only.
  • Plugin/Library compatibility: Game developers might use third-party libraries to speed up development or to use industry-standard/validated code (SSL, serialisation libraries, etc.). If these don't support a platform, the game most likely won't run reliably on it, so certain platforms get excluded unless the developer is willing and able to spend time supporting this code.
  • Increased QA: During game development, a small subsection of the team ensures that there are no bugs and that the game meets performance standards. Once you add a platform, the game must essentially be tested twice! The generic parts of the game could be left alone, but there is still much more testing to do before release. This also increases cost, not only through the additional time needed but also through specialised hardware, depending on the platform (Apple, Xbox, PlayStation, phones, etc.).
  • Increased Support: Games will ship with bugs (some games more than others). As you add more platforms, the amount of post-release support the developer has to do increases. Platform-specific bugs have to be fixed in a way that repairs the broken platform without affecting the platforms that already work. If a platform changes, say from Windows 7 to 8 or between OS X iterations, some level of QA is required to ensure there are no bugs on the newer version; and if there are, that is yet another platform that must be supported alongside the older one. This can have a huge effect on cost, especially in the 3-6 months after launch, since most revenue is made directly after release.
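As a minimal sketch of the first point: even under a cross-platform engine, a single direct OS call can tie a build to Windows. The helper below is hypothetical (not from any game mentioned here); it shows how a save-path function quietly becomes per-platform code that each target must implement and test.

```cpp
#include <cstdlib>
#include <string>

// Hypothetical helper: returns a per-user save directory.
// The Windows branch depends on a Windows-only environment variable,
// so porting means writing (and QA-testing) a branch per platform.
std::string saveDirectory() {
#ifdef _WIN32
    // Windows-specific: %APPDATA% only exists on Windows.
    const char* base = std::getenv("APPDATA");
    return std::string(base ? base : ".") + "\\MyGame";
#else
    // POSIX-style fallback for Linux/macOS.
    const char* home = std::getenv("HOME");
    return std::string(home ? home : ".") + "/.mygame";
#endif
}
```

Multiply this pattern across file handling, licensing, networking, and so on, and "the engine is cross-platform" stops being the whole story.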

Non-Technical Reasons:

  • Publisher Agreement: Some developers will have agreements with the platform holder to release specifically on their platform. While this happens more with consoles, this could also be the case for PC platforms (e.g., Windows).
  • First party Developer: Some developers are owned by the platform holder and are not allowed to release their games on specific platforms. You likely won't see Halo on PS4 or Forza on Mac.
  • Lack of an audience: Developers have lots of statistics about consumer trends on specific platforms, especially if they have large publishers with lots of data available. If that information says 90% of their target audience is on Windows, they might not bother releasing on other platforms, both to reduce potential bugs and to keep the marketing material focussed.
  • Does not meet Platform Requirements: Some platforms such as Apple's App Store have strict requirements on layout and design that need to be followed in order to be published. If a game doesn't meet these requirements for a particular distribution platform, it might not be worth the engineering time to adapt the game and release it if there are not enough predicted sales.
  • Lack of experience with a platform: If the developer has solely worked with Windows (and there are no/few people with experience of other systems on the team), it might be a lot of work to learn the small differences that can cause issues late in development or there might not be enough budget to hire new staff to be in charge of a Linux/OSX build.
  • Marketing Cost: If two platforms have significantly different audiences or age groups, the marketing material for one may not reach the other, meaning more money must be spent on marketing. If the two groups require different marketing, it must all be re-created with the new target audience in mind. Marketing can get extremely expensive, especially if you need to reach large audiences; the more platforms that need marketing, the faster the cost increases.

I'm sure there are more. These are just some off the top of my head. Hope this helps.

user3797758
  • 27
    This is a good answer but I think you miss one other important element: the cost of testing / QA. Cross platform engines never do a perfect job of hiding platform differences so it's essential to test on all target platforms. There may be bugs that only show up on one platform or significant performance differences relating to different implementations on different platforms. There can also be behaviour differences in functionality that is supposed to be cross platform. You need increased QA time and to provide testers with extra hardware. Some developers will need extra hardware and software. – mattnewport Jul 09 '17 at 18:35
  • 2
    I thought I covered that in the lack of experience section.... but I can add it :) always better to be thorough – user3797758 Jul 09 '17 at 20:16
  • 5
    I'll agree with @mattnewport that cost is understated here - it can be quite labor intensive to develop, build, test, debug, rinse, repeat for multiple platforms. It's hard enough to properly test against various hardware configurations for a single platform! – A C Jul 09 '17 at 20:47
  • At least there are only 1-2 configurations for console and you can always make automated test sequences for badly performing sections to get an understanding of how the performance is from build to build – user3797758 Jul 09 '17 at 21:13
  • 3
    Also, "support" is sometimes greatly exaggerated. Many engines "support" Windows, but actually you would have to install Visual Studio and Python and Perl and Cygwin and Msys and 5 flavors of MinGW and gigabytes of libraries to actually use it. – AndreKR Jul 10 '17 at 02:23
  • 1
    It should also be noted that game engines can be used for a wide variety of games, some of which wouldn't work well on, say, different controller schemes. The most obvious example being RTS games, which either target mouse+keyboard which makes them unsuitable for the bulk audience of console gamers, or they target gamepad-like controls, which largely makes it suck for PC gamers (Mac/Win). Additionally, different hardware still has different capabilities - you may need to make many assets multiple times to make it work really well. Multi-platform is still hard and expensive. – Luaan Jul 10 '17 at 07:35
  • 2
    "Developers have lots of statistics about consumer trends on specific platforms" - of course, using that to determine what platforms to technically support is a bit of a self-perpetuating prophecy. – O. R. Mapper Jul 10 '17 at 08:20
  • I just performed a grammar edit (mostly comma issues and extra words leftover from previous edits). But I was unsure if the last paragraph should be removed as being fluff. – trlkly Jul 10 '17 at 14:20
  • Do you mean the bit about there being more reasons than mentioned? – user3797758 Jul 10 '17 at 14:23
  • @O.R.Mapper True, but that's what the market is for. When someone actually notices a good opportunity, they will reap massive benefits, and the others will quickly jump on the wagon. This has happened many times in the past. Companies like iD or Electronic Arts started that way, back in the "IBM PC sucks for games" days :) – Luaan Jul 12 '17 at 08:45
  • 1
    You add one little WinAPI function, and bam! Windows-only. – htmlcoderexe Jul 13 '17 at 19:55
31

Because being available doesn't mean being free & instant.

Supporting one more operating system, in its most simplistic form, means one more platform to provide technical support for.

The more platforms you support = the more platforms you need to provide support for = more time spent on support = less work time that could have been spent improving your game.

Supporting a platform comes down to how confident you are that your game can draw enough of a player base on that platform to make up for the time you spend supporting it.

starikcetin
  • 2
    Also more work is lost to market place development ( App store, Steam, Google play) and platform specific integrations particularly in the social department. (friends, sharing, authentication) – CostelloNicho Jul 09 '17 at 15:04
19

There are good answers so far, but let's get to the bottom line.

According to Steam's June 2017 Hardware survey, 96.24% of users sampled used Windows. Of Windows users, 87.37% are either Windows 10 or 7, 64 or 32 bit. OSX variants represent 2.95% of users, and Linux variants total 0.72%.

Time is money. Unless your market is niche and targets OSX or Linux specifically, you'd have to sell a lot of games before <4% of the market is worth your time, especially as games developers are usually stretched for time to make their product feature complete.
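To put rough numbers on "worth your time": a back-of-the-envelope sketch. The platform shares are the survey figures quoted above; the sales volume and per-copy revenue are made-up placeholders.

```cpp
// Back-of-the-envelope revenue estimate per platform.
// Shares are from the June 2017 Steam survey quoted above;
// unit counts and per-copy revenue are hypothetical placeholders.
double platformRevenue(double totalUnits, double platformShare,
                       double revenuePerUnit) {
    return totalUnits * platformShare * revenuePerUnit;
}
```

With a hypothetical 100,000 lifetime sales at $10 net per copy, the 0.72% Linux share comes to roughly $7,200, which can easily be less than the cost of the port plus its ongoing QA and support.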

  • 3
    ... and this also assumes all of those users pay comparable money for games on average. I remember the stats for Android vs. Windows Phone, where while Android had a much bigger device share, the direct (non-ad) revenue was pretty much the same - and both were tiny compared to iOS. If you're selling a game, you need to know how much people are willing to pay. Making a game for a 0.7% segment of the market that rarely buys games is outright bonkers :D It probably wouldn't be worth it even if the engine was perfectly multi-platform (no weird gotchas on the different platforms). – Luaan Jul 12 '17 at 08:54
  • 1
    @Luaan According to Humble Bundle the revenue for Linux is just slightly less than Mac even though Mac is three times more popular. Maybe explained by the fact that high-end Linux computers tend to have much better graphics cards than Macs of the same price because the trade-offs of these machines lie elsewhere. – jobukkit Jul 12 '17 at 10:14
  • 1
    @JopV. Humble bundle is a bit tricky because of the bundle factor. If I only care about one game in the bundle, I'm not inclined to pay more just because I get extra games I won't play (or games I already paid for before, sometimes multiple times; yes, I'm a bit crazy :P). There might also be an effect from the fact that Linux has been largely ignored for gaming until relatively recently, so someone with a Mac or Win computer might already have some of the games, while it could be the first release for Linux. Maybe. Do they also have statistics from the Humble Store? – Luaan Jul 12 '17 at 13:14
  • 2
    @Luaan Every time there's a Humble Indie Bundle there are some statistics near the bottom of the page, including total and average amount of money contributed per OS. They stay fairly consistent every bundle. – jobukkit Jul 12 '17 at 14:21
14

The other answers here are good, but here is one that wasn't mentioned.

I'm having this problem right now - my team is about to release a game made in Unity for Windows/Mac. We've gotten lots of questions as to why our game isn't on mobile. There are 2 main answers:

1) Phones simply aren't powerful enough to keep up with the game. Maybe we can reduce the fidelity of the art (fewer polygons in models, fewer pixels in textures, etc), but that only goes so far. Most of the game would need to be rewritten in order to be optimized enough for a phone to run it. We did try it, but we only got about 0.5 frames per second. Obviously, not playable.

2) Input. The entire user interface was designed around using a mouse, and part of the game depends on knowing where the mouse is at any given time. Our entire input mechanism would have to be rewritten for the game to work without knowing where the mouse is, translating various "touch" actions on the screen into "mouse" actions to get the same functionality. Like user3797758 mentioned in their answer, this requires rewriting things so that "mouse" and "touch" inputs can be fed into the same system, where a bug in one won't affect the other and a fix for that bug won't break the other either. This requires more resources than my team has available at the moment.
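The rewrite described above usually means introducing an input-abstraction layer so mouse and touch backends feed the same game logic. A minimal sketch of the idea (all names are illustrative, not from the answer's actual codebase):

```cpp
#include <functional>
#include <utility>
#include <vector>

// Normalised pointer event: both mouse and touch reduce to this.
struct PointerEvent {
    float x, y;    // position in screen coordinates
    bool pressed;  // button down / finger on screen
};

// Game code subscribes once; each backend translates its native events.
// Fixing a bug in one backend's translation can't break the other.
class InputRouter {
public:
    using Handler = std::function<void(const PointerEvent&)>;
    void subscribe(Handler h) { handlers_.push_back(std::move(h)); }

    // Called by the mouse backend.
    void onMouse(float x, float y, bool down) { dispatch({x, y, down}); }

    // Called by the touch backend; a finger on screen acts as a press.
    void onTouch(float x, float y, bool fingerDown) {
        dispatch({x, y, fingerDown});
    }

private:
    void dispatch(const PointerEvent& e) {
        for (auto& h : handlers_) h(e);
    }
    std::vector<Handler> handlers_;
};
```

The hard part in practice is everything this sketch omits: hover (a mouse has a position when no button is down; a finger does not), multi-touch, and UI layouts sized for fingers rather than cursors.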

Also mentioned in user3797758's answer, we don't even have Linux support because one of our packages crashes on Linux machines, but works on Windows/Mac. Just because the engine is cross platform doesn't mean everything using it is.

Cody
  • 1
    My phone can play GTA Vice City with no trouble. Sounds like your rendering engine is inefficient! :) – Lightness Races in Orbit Jul 11 '17 at 10:31
  • 2
    Anyway this answer is more about porting games to different hardware platforms/paradigms, and less about OSs on a PC – Lightness Races in Orbit Jul 11 '17 at 10:31
  • 2
    @LightnessRacesinOrbit well, it is Unity... Really though, we have one particular effect that is central to the game, and really inefficient. The guy who wrote it is gone and we haven't been able to find a way to make it better. Such is life... – Cody Jul 11 '17 at 15:39
  • @LightnessRacesinOrbit Well, it isn't a question of raw power, really. It's just that some things happen to be a lot more expensive on some platforms than other. Even today, there's massive differences between otherwise comparable AMD and Intel CPUs on certain specific workloads - GPUs are even worse. Even the "GPU SDK's" have massive differences in unexpected places :D Even different OS on the same PC can have significant differences - e.g. poor drivers, or even just things that are inherently slower (e.g threads/processes on Linux vs. Windows - both approaches have their pros and cons). – Luaan Jul 12 '17 at 08:50
  • @Luaan: Understood but 0.5 frames per second in something that is (presumably) expected to do more like 25+ is pretty extreme inefficiency! – Lightness Races in Orbit Jul 12 '17 at 09:33
  • @LightnessRacesinOrbit 30 FPS is the minimum for anything interactive. Below that you can't sync input with video anymore. – jobukkit Jul 12 '17 at 10:17
  • @JopV.: Fine, 25, 30, whatever. Although, you don't need to sync input with video so interactivity with under 30 FPS is certainly possible (it just won't look very good) – Lightness Races in Orbit Jul 12 '17 at 10:23
  • @LightnessRacesinOrbit Believe it or not, I've seen worse, boiling down to a single seemingly innocuous check somewhere in the OpenGL port... that was one of the things that made me stop attempts at serious game development, or at least anything multi-platform (it's still fun to make things that work on my computer :P). I assume primarily OpenGL-programmers (or 3Dfx, or Vulcan, or...) have similar anecdotes when trying to port their code to DirectX. Well, if they ever tried that, of course. 3D development is getting better, but it still has plenty of hair-pulling horribleness. – Luaan Jul 12 '17 at 13:08
  • For the record, the 0.5 fps was when the game was at alpha, tested on an older device, and most of the code was not optimized. It just served as a good example of the massive performance differences between desktop computers vs other devices since the desktop build at the time was running about 30 fps. I honestly have no idea how well it would perform now, but definitely not the 50 fps the game currently gets on Windows. – Cody Jul 12 '17 at 15:41
  • 1
    @Luaan Thing is, DirectX was solely created as a trick to lock developers into Windows and hamper cross-platform development. There is no reason to use DirectX for anything (other than nativity) since OpenGL and Vulkan work on Windows (and tend to perform better than DirectX). – jobukkit Jul 13 '17 at 13:39
  • 3
    @JopV. Umm, no. The history is a tiny bit more complicated than that. But there's really no point in discussing this here. I'll just note that OpenGL is a 3D rendering API; DirectX is a lot broader than that. If you need to engage in pointless debate, at least compare OpenGL to Direct3D :) OpenGL was designed for professional 3D work ("opening" the proprietary IrisGL). What other platform supported OpenGL at that time? There was no "hampering cross-platform development" - Windows had native OpenGL support. OpenGL just never was intended for anything but high-end professional work. – Luaan Jul 13 '17 at 17:58
  • @Luaan Actually that's not true, Microsoft forced DirectX on people in the beginning, for example by bribing GPU manufacturers not to support OpenGL, they only ended these tactics after DirectX became popular so they weren't necessary anymore. And OpenGL was not used for games in the beginning because it's older. I just wanted to respond to your allegation that cross-platform 3D development sucks because of API fragmentation, that doesn't actually exist anymore because there has been no reason not to use OpenGL/Vulkan (even for a game planned to be Windows-exclusive) for a very long time. – jobukkit Jul 13 '17 at 19:26
  • @JopV. When was this gap when "GPU manufacturers needed to be bribed not to support OpenGL"? OpenGL simply wasn't relevant for consumers back then. And it's not like SGI was being altruistic when they made it "open" (or when they crushed the previous open standard pushed by big players like IBM). And again, DirectX is more than OpenGL. Many OpenGL games on Windows still use DirectX (e.g. for sound). And finally, I stopped game development ten years ago. OpenGL was at its worst, and Vulkan didn't exist :) My last (cross-platform) attempt was in OpenGL, and it was a waste of effort. – Luaan Jul 14 '17 at 05:54
  • @Luaan "Since most PC GPUs at the time only implemented a small subset of OpenGL in hardware, Microsoft wrote a full software OpenGL implementation and then offered it to GPU companies, so those companies could just replace the parts that their GPU implemented in hardware and still have a full OpenGL driver. Once they had all spent a good deal of time doing this, Microsoft actually refused to license any of their OpenGL code for release, effectively guaranteeing that smaller GPU companies would only have support for DirectX." – jobukkit Jul 14 '17 at 09:31
  • @LightnessRacesinOrbit Maybe he wants to run a game that is not 15 years old. – marsh Jul 14 '17 at 14:49
  • @marsh: Full 3D, textures, large environment.. the fundamentals are all there are work seamlessly. If you dropped to 0.5FPS on modern phone h/w then something is very wrong or you are attempting photorealism to a degree that would stretch even a gaming PC. – Lightness Races in Orbit Jul 14 '17 at 15:05
  • 2
    phone gpus are much less powerful than even intel graphics since they have thermal and size limits. also many people have phones which do not support opengl es 3.0 so only few people will be able to use your app unless you rewrite your game for primitive opengl es 2.0 – Suici Doga Jul 15 '17 at 12:07
-2

As someone already mentioned above, it can come down to the cost of testing and quality control. More platforms always mean more testing that a separate QA team must complete before the dev team can publish the game, especially for smaller projects such as indie games. Keep in mind that every detail of the user interface also has to be tested by QA, even before the game reaches its first open beta test with players.

DMGregory
  • 4
    This late answer does not really seem to add much which wasn't already mentioned in other answers. – Philipp Aug 03 '20 at 10:48