
There are a lot of version control systems available, including open-source ones such as Subversion, Git, and Mercurial, plus commercial ones such as Perforce.

How well do they support the process of game development? What issues come up when using a VCS with non-text (binary) files, large projects, et cetera? What are the solutions to these problems, if any?

To keep the answers organized, let's go on a per-package basis: update each package's answer with your results.

Also, please list some brief details in your answer about whether your VCS is free or commercial, distributed versus centralized, etc.

Update: Found a nice article comparing two of the VCS below - apparently, Git is MacGyver and Mercurial is Bond. Well, I'm glad that's settled... And the author has a nice quote at the end:

It’s OK to proselytize to those who have not switched to a distributed VCS yet, but trying to convert a Git user to Mercurial (or vice-versa) is a waste of everyone’s time and energy.

Especially since Git and Mercurial's real enemy is Subversion. Dang, it's a code-eat-code world out there in FOSS-land...

Cyclops
  • Note - this is intended to replace Question http://gamedev.stackexchange.com/questions/245/what-version-control-systems-work-best-with-games. In 72 hours I am going to delete that question - I leave it up for now, to give people time to copy over their Answers/comments to this Question. :) – Cyclops Jul 15 '10 at 17:37
  • This will probably end up being one of my favorite questions, if people follow the editing instructions. Working on a Game Dev specific solution for this now :D – Jesse Dorsey Jul 15 '10 at 17:39
  • @Noctrine, you had to add the "if" disclaimer, didn't you? :) Of course people will follow instructions... – Cyclops Jul 15 '10 at 17:41
  • Don't delete the other question. If it's really off topic (I don't think it is) it can be closed and archived. It can also be closed as a duplicate of this one. – Firas Assaad Jul 15 '10 at 17:42
  • @Firas, it was closed - and re-opened :) Regardless, I think this formulation is better (and yes, it's a wiki) - and if people copy over their answers, this one should cover all the information in the other question. – Cyclops Jul 15 '10 at 17:44
  • @Cyclops: I disagree. I voted to close it as a duplicate of this question. You can't be sure that everyone will copy their answers here. – Firas Assaad Jul 15 '10 at 17:48
  • @Firas, I don't care either way, but if it keeps getting downvoted, I'm deleting it. :) I'm only leaving it undeleted to give people time to copy over their answers. – Cyclops Jul 15 '10 at 18:12
  • @Cyclops: I think you can edit it to be community wiki and then downvotes won't have an effect. Perhaps a moderator can merge the answers of that question with this one? – Firas Assaad Jul 15 '10 at 18:27
  • @Firas, good point, thanks. It's now wiki'ed, so once people vote/close it, I won't delete it. – Cyclops Jul 15 '10 at 18:40
  • Someone add Bazaar please. :) – Noldorin Jul 26 '10 at 10:09
  • Wow, I've earned the Great Question badge, for a question that's been closed for years. :) – Cyclops Oct 06 '17 at 17:09

6 Answers


Git

Recently I have been on the Git bandwagon (I've used SVN and Mercurial). So far I really like what I get with Git. It is far from a pain to set up, and more development tools are starting to adopt it.

It's a distributed version control system. This allows each of us to have our own independent, trunk-like area. I can work in my own area and invite you over to view changesets very easily. I can roll back in my own space without mucking up the central repo. I can commit, branch, and do everything you can do with SVN, locally. I really like having this control.
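
To make the "everything is local" point concrete, here's a rough sketch of that workflow. The branch and file names are made up, and the commands are ordinary Git invoked from Python purely for illustration; none of it talks to a server:

    import subprocess

    def git(*args):
        # Run a git command in the current working copy; fail loudly on error.
        subprocess.run(["git", *args], check=True)

    git("checkout", "-b", "experiment-pathfinding")  # private local branch (made-up name)
    git("add", "src/ai.cpp")                         # illustrative path
    git("commit", "-m", "Try a cheaper pathfinding heuristic")

    # Changed your mind? Roll back locally without touching the central repo.
    git("reset", "--hard", "HEAD~1")

    # Nothing is shared until you explicitly push a branch somewhere, e.g.:
    # git("push", "origin", "experiment-pathfinding")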

With SVN, you need access to your repo in order to commit. What if you're on the road or at a cafe with no internet? Not good.

Sure, SVN is much simpler to learn, but I think the advantages of distributed source control largely outweigh its somewhat steeper learning curve.

I also like that it is smarter about merging.

A major downside of Git is that it stores the entire history locally. (Yes, you can perform surgery to cut that down, but it's the default behavior.) It's not a problem at all for source files, but if you have a large project with gigabytes of asset data, it becomes a problem quickly. In my experience so far, I'd recommend Git only for smaller or source-only repos.
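
One flavour of that "surgery" (my reading of the hint above, not an official recommendation) is a shallow clone, which fetches only the most recent history; the URL below is a placeholder:

    import subprocess

    # Fetch just the latest commit instead of the full history. This keeps the
    # local clone small, but the large assets in the current tree still have
    # to come down in full, and you give up having the history offline.
    subprocess.run(
        ["git", "clone", "--depth", "1", "https://example.org/big-game.git"],
        check=True,
    )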

If you're still curious about Git, check out http://thkoch2001.github.io/whygitisbetter/ for some good information/metrics. Also see https://git.wiki.kernel.org/index.php/GitSvnComparsion

congusbongus
David McGraw
  • I love the idea of a hierarchy of branches allowing QA to test changes before they make it to mainline, or easy creation and validation of demos. – tenpn Jul 15 '10 at 18:58
  • @tenpn, is that a feature of distributed VCS in general, or just Git? – Cyclops Jul 15 '10 at 19:08
  • No, it's a feature of distributed VCS in general – tenpn Jul 16 '10 at 07:39
  • I've heard that Git and Mercurial both suffer when shoving very large files into them. Any truth to that rumor from people with more experience than me? – drhayes Jul 17 '10 at 00:57
  • @D. Hayes: I believe there are initial costs (checkouts and what not), but after that it is faster than other solutions. – David McGraw Jul 17 '10 at 01:12
  • Git is good and all, but when it comes to handling big graphics or music assets, say files larger than hundreds of MB, it becomes noticeably slow at commits and check-outs. At the moment Mercurial, the contender to Git, has a "big files" extension that addresses this specific issue. If you have a game project that doesn't have a lot of assets you could give Git a try. – Spoike Jul 17 '10 at 20:49
  • Why not use Git's submodules to manage the binary files? That way you could create separate repositories as needed and then tie them together using submodules. Any future changes in the main repository should not affect the submodules. – Alex Schearer Jul 26 '10 at 20:43
  • My big vote is for Git (or hg if you prefer): once you have more than 3 maintenance branches, SVN gets absolutely impossible to manage. – coderanger Aug 06 '10 at 17:07
  • @AlexSchearer: Submodules are a good solution, but then you have one git submodule with every version of assets where one file may be 1 GB (and they don't compress well since their deltas are the whole file). This doesn't impact your main repo, but it may grow absurdly over the course of a project. You also have no ability to lock a binary file (so people don't start editing files that can't be merged). It seems like it would be ideal if you could have a svn submodule that was an svn repo. (I guess you could use git-svn so you can lock, but you'd still have all of that history.) – idbrii Dec 21 '11 at 19:50
  • For an equivalent of Mercurial's bigfiles extension, there is http://git-annex.branchable.com/ – JackLeo Feb 13 '13 at 14:05
  • Git doesn't support locking, so your artists will go insane ;) – Andreas Oct 29 '13 at 13:37
  • Git submodules are pinned to a specific commit sha and not to HEAD, so your (junior) coders will drive you insane. – Andreas Oct 29 '13 at 13:39
  • The main problem I have with using git for games is the large files. I have worked on several projects recently where the repository took up ~12 GB on disk, and ~9 GB of that was the .git directory. I have seen scenarios where art assets are in a SVN repository that sits below the git repo, and is ignored by git. This works reasonably, because it means you can "embed" a particular branch of your art assets in your repository, and have a robust process for putting new assets in the hands of developers. It needs a bit more automation though. – guysherman Nov 18 '13 at 22:53
  • I use git at work (not game dev), but for game development, it is not a good choice. Git is designed for handling source code and that's it. Using submodules for binaries is iffy at best. If you do make the mistake of putting many binaries over 10MB into git, the git repository quickly explodes in size and using git becomes horribly slow. In game development you have many different assets and you need a reliable way to package everything at a particular revision of code and assets; git cannot do this, nor was it designed to. As for git lfs, only seasoned DevOps people should mess with that option. – AhiyaHiya Mar 19 '16 at 13:51
  • Yeah, sorry, git just isn't fit for purpose here. Even with the LFS (large file) extension, it rapidly bloats to the point where a clone can take hours, and some operations are painfully slow. – Basic Apr 28 '19 at 22:43

Mercurial

Key features:

  • Distributed VCS
  • Free, open source
  • Plugin scripts are easy to write; they can be written in Python or as shell scripts
  • There are many plugin scripts already freely available
  • Lots of documentation available, including this book (highly recommended)

With regard to the use of non-text files, recent versions of Mercurial (>= 2.0) provide the largefiles extension by default:

largefiles solves this problem by adding a centralized client-server layer on top of Mercurial: largefiles live in a central store out on the network somewhere, and you only fetch the ones that you need when you need them.

There are other extensions providing similar solutions, such as the bigfiles extension, which lets you store your assets in the same Mercurial repo but only fetch the binaries you need when you need them.

I am not aware of any issues with regard to large projects beyond those related to having large binary files. The Python project is a large project and uses Mercurial.
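
Since Mercurial plugin scripts and hooks can be plain Python (see the key features above), one common belt-and-braces measure alongside largefiles is a commit hook that refuses oversized files before they ever enter the repository. The sketch below is illustrative only: the file name, size limit, and hgrc snippet are my assumptions, and Mercurial's internal API differs between versions:

    # largecheck.py -- illustrative pretxncommit hook.
    # Enable it in .hg/hgrc (or a site-wide config):
    #   [hooks]
    #   pretxncommit.maxsize = python:/path/to/largecheck.py:check_size

    MAX_BYTES = 10 * 1024 * 1024  # arbitrary 10 MB threshold for this sketch

    def check_size(ui, repo, node=None, **kwargs):
        # Returning True from a pretxn* hook tells Mercurial to veto the commit.
        ctx = repo[node]
        for path in ctx.files():
            if path in ctx and ctx[path].size() > MAX_BYTES:
                ui.warn(b"%s is too large; track it with largefiles instead\n" % path)
                return True
        return False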

Joel Spolsky has written a mini-tutorial on using Mercurial: Subversion Re-education

Klaim
talljosh
  • Interesting note about the bigfiles extension - that addresses one of the problems reported in the original thread, that distributed VCS wouldn't fit well with game productions that had large numbers of binary file assets. – Cyclops Jul 16 '10 at 13:30
  • +1 for Mercurial. It's fast, easy to use, and surprisingly powerful. :) I am using it for everything: web development, game development, private one-person projects and team projects. Thanks for introducing the BigFiles extension! – jacmoe Aug 06 '10 at 12:13
  • With regards to large projects - TortoiseHg appears to go much slower on a large repository with 8 years of revisions than it does on a small repository with less than 20 revisions. I don't yet know whether this is something specific to Tortoise or to Mercurial more generally. – Kylotan Aug 06 '10 at 14:17
  • There doesn't seem to be an option comparable to svn:needs-lock, and since there's also no way to tell who's locally working on what files, you're back to passing a bowl around the team, literally (you aren't allowed to edit without the bowl on your desk). BigFiles extension or not, this VCS is useless for binary files without a practical solution to this. – Sam Harwell Aug 07 '10 at 11:43
  • 280Z28, have you come across any distributed VCS which has a solution to that problem? Having a lock would seem to me to defeat one of the premises of distributed version control---that you don't have a central server and don't necessarily know the location of all check-outs. – talljosh Aug 09 '10 at 00:19
  • @280Z28: this is also an issue we found for any files if you have a policy that any new code needs to be tested against the latest build. Otherwise you can find that when you come to push your tested code, someone else has committed new changes, so you have to pull, merge, and repeat your tests. Obviously this could go on indefinitely. – Kylotan Aug 18 '10 at 10:44
  • Regardless of bigfile support, if two people edit, say, a Maya file at once, one will check in and the other will have to redo the work. With Perforce, at least, you can know that someone else is editing the file (and also automatically have a lock on that file). – dash-tom-bang Sep 10 '10 at 22:06
  • @talljosh From v2.x, Mercurial now provides the largefiles extension by default, so you can update the part about the BigFiles extension. – Klaim Dec 22 '11 at 13:22

Perforce

Perforce (commercial/closed-source, centralized) is the industry standard for a number of reasons.

  1. It's a commercial product, which means it comes with commercial support. Open-source projects may be eligible for a free license (minus the technical support).
  2. It supports workspaces very well, which allows very flexible source and asset directory layouts.
  3. It supports changelists very well.
  4. You can see who is working on what. Games have an abnormally high number of rapidly changing binary files (assets) compared to other development projects. Most of the time these are non-mergeable, so keeping track of who has what/where/when is critical. Subversion and DVCS clients intentionally avoid this technique, but it's quite beneficial in certain applications (see the sketch after this list).
  5. It supports gigantic code/asset bases. It does not store duplicate data on client machines, which is important when your sub-view of the tree is a couple dozen gigs.
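
As an illustration of point 4, the sketch below simply drives the standard p4 command-line client from Python to ask the server which assets are currently opened, and by whom; the depot path is made up, and this is not anything Perforce ships:

    import subprocess

    def who_has_open(depot_path="//depot/game/assets/..."):
        # "p4 opened -a" lists files opened on *any* workspace, so you can see
        # which unmergeable assets somebody else is already editing.
        result = subprocess.run(
            ["p4", "opened", "-a", depot_path],
            capture_output=True, text=True, check=True,
        )
        for line in result.stdout.splitlines():
            # e.g. "//depot/game/assets/hero.psd#12 - edit default change (binary) by alice@ws1"
            print(line)

    if __name__ == "__main__":
        who_has_open()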

That said, it's painfully obvious on an almost daily basis that Perforce doesn't feel its position in the industry is threatened. Their visual tools, including P4V and P4SCC (which integrates with Visual Studio), are slow and buggy, with the latter known to freeze Visual Studio for the sheer enjoyment of it. AnkhSVN is miles ahead of Perforce.

Comment by xan: It is worth noting, however, that their merge tool, P4Merge (used for diffing and merging), is excellent and far superior to the likes of TortoiseMerge. Surprisingly, this component is available for free as part of the P4 Visual Tools package.

Comment by slicedlime: Another drawback with Perforce is that branching tends to be a huge pain, especially if you have large trees. Almost every other VCS is better at branching and merging. This is usually a small price to pay for the above advantages, though.

Comment by roe: Perforce is extremely chatty. There's not much going on without the server involved. Most notably, you need the server in order to do open-for-edit, which means you need to jump through a few hoops if you intend to break the connection to the server.

Comment by jrista: As a daily user of Perforce for over two years now, with an extended development and quality engineering team of well over 100 people, I have become intimately familiar with it. While it is a decent source control system, it does have its drawbacks that those evaluating SCC systems should be aware of:

  • As mentioned by others, branching/integrating is particularly cumbersome and difficult to do. You have an ungodly amount of control, but it comes at the cost of excessive complexity. On the flip side, the visual merge tool is one of a kind, and presents a beautiful three-way, base-aware merge view of your work. Perforce does provide some graphical visualization of branch paths (called the Revision Graph); however, the way it is visualized often makes the tool rather useless. If you only need to see a very small segment of time for one or very few files, it can be useful... anything more, and it is near impossible to navigate the Revision Graph.
  • Perforce is also not a very efficient tool, as almost any file operation requires duplicating files and data: branching, labeling, change lists, etc. No sparse or lightweight tagging or branching here. If you are not afraid to use a tremendous amount of disk space tracking your changes, Perforce will probably serve you well. If not, I would look to another tool.
  • Perforce makes use of workspaces; however, these can be frustrating at times, as Perforce caches all state in your workspace, rather than using the actual files on disk to determine some state. This often results in files not getting synced because your workspace says they are up to date, when, for whatever reason, the physical files on disk are indeed NOT up to date.
  • A final annoyance: Perforce is rather brutal on your network. It is an extremely chatty program, and consumes a considerable amount of bandwidth. Any loss of network connectivity, and you run a high risk of being unable to do any work with your source-controlled files until connectivity is restored. As of yet, I have not discovered an activity that can be performed offline in Perforce.
idbrii
Sam Harwell
  • Another advantage of Perforce is that it's free for the first two users, which is great for small hobby projects, or to evaluate its suitability for a larger project. – Jul 16 '10 at 20:01
  • From my experience, point 5 of your answer is very true. It is extremely scalable. Errors are rare, support is speedy.

    I'm not certain on the cost; Yacine notes that it is free for the first two users. For smaller studios with smaller games it would be worth evaluating other options.

    Visual Studio integration is a little off out of the box; but we are lucky enough to have a complete Perforce enthusiast at our studio who wrote an excellent source provider for VS from scratch.

    – paulecoyote Jul 20 '10 at 21:16
  • Perforce is awesome, if you can afford it. I've not used all of the version control systems out there, but in 10 years of commercial game development, and several version control systems, it's the best I've used by quite some margin. It handles large data files quickly and reliably, and if set up properly, performs well in a multi-site setup over relatively slow internet connections.

    The Visual Studio integration may not be perfect, but it's pretty solid.

    – bluescrn Aug 06 '10 at 21:44
  • If I had to choose one VCS for (large scale) game development, it would be Perforce. If I could choose multiple, I would add a DVCS for text assets (code, scripts, and other miscellaneous data files) but keep Perforce for binary assets. – dash-tom-bang Sep 10 '10 at 22:14
  • re offline work: p4v recently introduced "Reconcile offline work" which will scan for changes to files that have been edited but not checked out, or not added to the depot. Very handy. – tenpn Sep 27 '10 at 09:12
  • Perforce is well worth the money - if you can spend it. I've worked with other version systems like Git and Subversion, and especially the latter feels like it carries the "Sub" in its name for a reason. – CodeSmile Sep 27 '10 at 22:19
  • I worked on a large game project that used Perforce and was completely bewildered that anyone would pay for it. The need to be in sync with the server is obnoxious, even on a LAN. I understand the "reconcile offline work" feature but in practice the workflow is so intrusive that when there are network hiccups you just don't work. If you don't use an IDE that has a P4 plugin, or you just want to edit a file here and there from the command line or another tool, you have to go back to the P4 client and do some bookkeeping. No other VCS puts up so many barriers to doing work. – Suboptimus Dec 22 '11 at 01:30
  • Update to the first comment here: it's apparently free for 20 users now (says their website). – Genesis2001 Sep 05 '12 at 04:06
  • For offline in Perforce, look at "Reconcile Offline Work..." – Almo Nov 12 '13 at 21:01
  • So the only advantage of Perforce is working with large files; in other respects Git is better. Experience with Perforce is required if you want to get a job at Ubisoft. – Apr 18 '14 at 09:19

Subversion

  • Open-source, centralized

  • Blender files - I'm not entirely sure if .blend files are binary (they look like it), but I have had no problems adding them to Subversion. Having done a few experiments, the file size increase for changed files appears nominal, so it's not simply copying in the entire file.

  • Large projects - It works, though it can get quirky. It's definitely able to handle repositories of at least 5.5 GB (total size of repository dir on server; mostly binary assets).

  • Duplicated Data on the Client - Subversion keeps a pristine copy of every file in the user's workspace. The advantage is that you can do a diff or revert without going back to the server. The disadvantage is that your 10 GB of working files takes 20 GB of disk space.

  • The ignore list is a property of a directory (simple with a GUI, annoying on the command line).

  • Subversion allows locking of files/assets, which is really helpful if multiple artists and designers work on the same files (see the sketch after this list).

  • Externals are a great way to handle shared (e.g. library or base) code between projects.
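
A rough sketch of the locking, ignore, and externals points above, driving the svn command-line client from Python; the file names and the externals URL are invented for illustration:

    import subprocess

    def svn(*args):
        subprocess.run(["svn", *args], check=True)

    # Unmergeable assets: mark them needs-lock so they check out read-only
    # and an artist has to take the lock before editing.
    svn("propset", "svn:needs-lock", "*", "art/hero.psd")
    svn("lock", "-m", "editing the hero sprite", "art/hero.psd")

    # The ignore list is a property of the directory.
    svn("propset", "svn:ignore", "build\n*.tmp", ".")

    # Shared engine code pulled in as an external.
    svn("propset", "svn:externals", "engine https://svn.example.org/engine/trunk", ".")

    svn("commit", "-m", "Configure locking, ignores and externals")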

Andreas
Cyclops
  • Large projects - KDE, GNOME. – Dr. Snoopy Jul 15 '10 at 18:10
  • FSFS can be very brittle for recovery - so extra care should be taken with backups.

  • Easy to understand; TortoiseSVN is a great client.

  • Open source bug trackers, continuous integration systems, etc. often have support for Subversion "out of the box".

  • There's some good books available about using Subversion.

    – paulecoyote Jul 20 '10 at 21:21
  • @paulecoyote, this is a wiki post, feel free to edit it with new information, not just comments. – Cyclops Jul 20 '10 at 23:17
  • SVN is great to use with TortoiseSVN (http://tortoisesvn.net/), a client which integrates nicely into the context menu and provides GUIs for all actions. Unfortunately, Tortoise has no Linux/Mac OS ports (at least as of writing this). Protip: if working with multiple people, always update (and merge/resolve conflicts) BEFORE committing. – Exilyth Mar 23 '13 at 19:09