67

Possible Duplicate:
How to manage a Closed Source High-Risk Project?

I'm working at an institution that has a really strong sense of "possession" - every line of software we write should be ours alone. Ironically, I'm the only programmer (at the moment), but we're planning on hiring others.

Since my bosses don't regard the new programmers as people they can trust, they have an issue with copies of the source code. We use Git, so the new hires would have an entire copy of each of the projects they work on as soon as they clone the repository.

We can restrict each developer's access to a single key with Gitolite and bind that key to their PC, but they could copy the key to another computer and have repository access there as well. Also (and most obviously), they could just upload the files somewhere else, add another remote, or copy the files to a USB drive.
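
For reference, the Gitolite side of that restriction looks roughly like this (repo and user names are made up):

```
# conf/gitolite.conf  (hypothetical repo and user names)
repo project-billing
    RW+     =   alice       # alice's single authorized key
    RW+     =   boss

repo project-core
    RW+     =   boss        # alice cannot access this repo at all
```

But again, nothing in that config stops alice from copying her key, or her clone, somewhere else.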

Is there any (perhaps clever) way to prevent events like these?

EDIT: I would like to thank everyone for their insights into this question. It has not only been eye-opening, but also firm support for my arguments against my bosses in the near future (since you basically think like me, and I've been trying to make them understand that).

I am in a difficult situation work-wise, since my coworkers and bosses are like two gangs and I'm basically caught in the middle, so all this input is greatly, greatly appreciated.

It is true that I was looking for a technical solution to a people problem. Both the management and the employees are the problem, so it can't be solved that way (I was thinking about some code obfuscation, perhaps working with separate modules, etc., but that wouldn't work from my developer's POV). The main problem is the culture inside and outside the company: development is not taken seriously in my country (Venezuela), so naivety and paranoia really are an issue here.

The real answer here is an NDA (something that doesn't completely work here in Venezuela), because that's the people solution: no sane developer would work under those conditions. Things will get ugly, but I think I will be able to handle that because of your help. Thank you all a lot! <3

AeroCross
  • 117
    It sounds like you are asking for a technical solution to a people-problem. If we take this a step further, how would you prevent the new programmers from committing the lines of code they saw / wrote to memory and copying them down later? –  Oct 17 '12 at 16:32
  • 152
    Wow... remind me to never apply for a job there. – Walter Oct 17 '12 at 16:35
  • 35
    If your boss doesn't see the difference between intellectual property and real property, then he is likely to be about as competent in business as he is technically. Is he pointy-haired? – deadalnix Oct 17 '12 at 16:50
  • 34
    There is an easy technical solution to this problem. Don't write any code. No code, no leaks, no problems (other than the problems that would have been solved by the code). – emory Oct 17 '12 at 17:08
  • 4
    Just to help clarify this issue both for you and for people responding, whether or not you can trust people won't steal the code only really matters for the (paranoid?) worry that a new person would steal all your code in the first week and then leave. Beyond that, trusting that the code they are writing is any good is the bigger issue. As in, if you are trusting their code enough to rely on it for your business, then it seems small potatoes to trust them with a copy of the codebase. – jhocking Oct 17 '12 at 18:33
  • 2
    @jhocking - the truly paranoid would hire multiple developers (preferably unaware of each other) to do the same work. A voting system would collate their inputs. You would not have to trust any developer, just trust that a majority of them are not making the same screwup. Not that I think it is a good idea. – emory Oct 17 '12 at 18:47
  • 35
    I've worked for people like this. Their companies never grow past a certain point because they are unable to give up control and ultimately fail in some area. I don't think I'd consider this a long-term career although being the first programmer in a company can certainly give you some great experience (Does he mind you taking THAT with you when you leave?) – Bill K Oct 17 '12 at 19:20
  • 7
    As others have already pointed out, you need to consider which risk is bigger: having the code leaked or hiring people who are not passionate. Passionate people don't work where they don't feel trusted, sometimes like to tackle problems from home, and hate having obstacles put in their way by clueless managers. Some of them also prefer their laptops. Obviously, there are some counterexamples where it's strategically important to at least divide codebase between employees, like at Apple or Microsoft. Are you creating iOS, or Windows, or what? – Dan Oct 17 '12 at 19:32
  • 13
    If you're competent and intent on stealing code you will succeed. – MrFox Oct 17 '12 at 20:07
  • 5
    You'd have to prevent camera phones, printers, and everything. Heck, we could easily hide our code in an innocent-looking image and sneak that out. I've worked at a place that locked down USB ports, blocked tons of sites, monitored email, and generally restricted access heavily. And there are still dozens of ways to get code out. Ironically, it also made it hard for me to bring something in if I already had the solution at home! – CaffGeek Oct 17 '12 at 20:30
  • 1
    @emory ISTR reading that sort of setup being tried with 3 parallel teams as an error prevention setup (assuming in most cases only 1 of 3 would get it wrong); but that it was a miserable failure in practice. – Dan Is Fiddling By Firelight Oct 17 '12 at 21:11
  • 4
    @CaffGeek not to mention you'd have to wipe their brains of memories at the end of each work day. The boss is always complaining about devs not remembering what they did yesterday. – emory Oct 17 '12 at 21:15
  • 6
    Unless you work for a government institution where secrecy actually matters, this is a recipe for disaster for the company. If you're so worried about your stuff being stolen, you should go do one of those silly patent and trademark and copyright nightmares that everyone seems to be so into right now. – Linuxios Oct 17 '12 at 21:34
  • 3
    Good luck hiring ... I wouldn't work for a company that didn't trust me with the source code, and I know no good developer who would. – Duncan Bayne Oct 18 '12 at 01:57
  • 3
    Lock all the computers, remove DVD drives, fill all the USB ports with putty, no printers, no internet connection. But what a miserable place to work! – sgud Oct 17 '12 at 18:13
  • 5
    I disagree with most people here. Here's a real-life situation from a big company I worked for: New programmer is hired. New programmer is very enthusiastic. New programmer wants to work over the weekend! New programmer copies entire code base to USB, so he can work from home. This is a MAJOR security breach, for every company! Luckily, IT security did their job - there were tools in place to detect leaking of sensitive data. USBs work, emails work, but you cannot send all types of data. This is not paranoia, but common sense. – Kobi Oct 18 '12 at 07:15
  • Many of these answers (not the comments) are not distinguishing between source code and data. There are established protocols for dealing with sensitive data; e.g. one of the answers mentioned it, about Protected Health Information. Code leaks, which was the specific question, are different, and from my experience, a more difficult matter. Regardless, suggestions such as what @Kobi stated are just good sense, and have nothing to do with paranoia. – Ellie Kesselman Oct 18 '12 at 07:43
  • 4
    Comments are not suitable for extended discussions. Please refrain from commenting if you don't have something valuable to add to the question, if you just want to talk about it we have chat, comments are only meant for clarifications. – yannis Oct 18 '12 at 14:36
  • The very point of a development team is to work together to solve the problem that the program needs to solve. I may know how to do this, but I'm getting a compile error when trying to build my new code and have no idea what's causing it. I shoot an email to a coworker, he comes over when he has the time (after he's finished with the block he's on... say 5-10 minutes). I work on something else for a few minutes, he comes over, we look it over, and bam, he sees I misspelled something. That's only a few minutes vs. banging your head for hours. This simply cuts the idea of a team out of the equation. – Randy E Oct 18 '12 at 14:40
  • I think it's more important to ask how to reduce unnecessary code leakage than it is to ask how to prevent code leakage. It'll happen either way, whether it's on purpose or accidentally. – zzzzBov Oct 18 '12 at 19:20
  • @Kobi Emails work? Okay, then there's a very good chance a zipped tarball of code in an email will also work... Especially if you changed the extension. – Izkata Oct 19 '12 at 11:02
  • @Izkata - Indeed, too clever. My point is not that you cannot bypass the protection by easy means. Not all leaks are caused by malice. For example, RMS can encrypt all documents, stop you from sending them in mail, copying data, or even taking screenshots. This sends a strong message to people: this data is important. So yes, you can use your camera to take a picture of the screen, but it achieves its goal nonetheless. It is naive of most answers here to assume you should do nothing, trust people, and everything would be OK. – Kobi Oct 19 '12 at 12:01

14 Answers

137

This is one of the situations where you are looking for a technical solution to a social problem.

A social problem requires a social solution, which, in this case, takes two complementary forms, plus an additional organizational measure which may help:

  • Trust. If you don't trust developers, don't hire them. Working with people you don't trust is synonymous with failure. Relations based on mistrust require a lot of formalism, which may severely impact not only the productivity of your employees, but also the number of people willing to work with you. Chances are, the best developers will avoid your company at all costs.

  • NDA. Trusting someone doesn't mean you shouldn't take legal precautions. Those precautions can take the form of a contract or an NDA clause with severe consequences for the employee in case of disclosure.

    How severe the consequences are depends on who you are. Government organizations, terrorists, or the mafia can permit themselves some deterrent ones. Ordinary companies may be limited, by law, to financial ones only.

  • Slicing. Trust and contracts are a good start, but we can do better. If the sensitive part of the code base can be sliced so that two or more parts are required for the product to function, make sure that the developer from department 1 never sees the source code developed in department 2, and vice versa.

    People from one department shouldn't be able to meet people from other departments; ideally, they shouldn't even be able to guess what the other departments are doing, nor how many departments there are. Each person knows only a small part, which is not enough to form the entire picture (and reconstruct the entire product outside the organization).

Those were social and organizational measures.

Now, technically speaking, there is nothing you can do.

You may try to:

  • Force the developers to work in a closed room on a machine which is not connected to the internet and doesn't have USB ports.

  • Install cameras which monitor everything which happens in the room, with several security officers constantly observing the developers working.

  • Strip-search every developer each time he leaves the room to be sure he doesn't have any electronic device which could hold the code.

  • Require every developer to have an ankle monitor. The device will listen to what they say, record their position and attempt to detect any electronic device nearby. If the developer was near a device which is not identified and doesn't have your tracking software installed on it, private investigators and hackers may attempt to check whether the developer wasn't using the device to leak information.

  • Forbid developers to leave your buildings, unless being under heavy surveillance, and to interact in any way with the outside world.

Some or all of those measures are illegal in many countries (unless you represent certain government agencies), but the worst part is that even with all those measures in place, developers will still be able to get the code out, for example by discreetly writing it on their skin or on a piece of paper and hiding it in their clothes, or simply memorizing it if they have an eidetic memory.

Or they can just memorize the data structures and the algorithms in broad terms (the only thing that really matters where intellectual property is concerned) and create their own product inspired by those two things.

  • 41
    -1 - Of course there's something you can do. Do you think armored car companies trust all of their employees? Do banks have to trust all their developers? They all have various security measures in place to prevent both tangible theft, and IP theft isn't that much different. How do you think defense contractors handle it? This is a misleading and provably wrong answer. – Scott Whitlock Oct 17 '12 at 17:29
  • 64
    @ScottWhitlock I'm sorry but how do you stop people from memorizing a block of code they're writing and then going home, rewriting it from memory and selling it? Though this is a ridiculous fear, it is 100% possible and 100% unstoppable if an employee so chooses to do it. This is precisely why MainMa is spot on that you must trust your developers. (As well as having contracts and a legal team to enforce trust violations) – Jimmy Hoffa Oct 17 '12 at 17:47
  • 25
    @JimmyHoffa - if you're talking about a code base of any significant size, then the amount you can memorize is insignificant compared to what's at stake. That's like saying that since people can steal my garden gnomes, what's the point of locking my car? People tend to take the easy route, and that's making a digital copy. – Scott Whitlock Oct 17 '12 at 17:49
  • 8
    @ScottWhitlock I'm not speaking about stealing the entire code base, there are of course plenty of security tactics available to make that difficult, I'm just saying that regardless of security measures; you need to trust your developers, because you can't completely stop them from acting against your interests. – Jimmy Hoffa Oct 17 '12 at 17:54
  • 2
    @JimmyHoffa - I'm sure we're on the same page then. I'm suggesting a layered approach - technical barriers to bulk copying, and NDA + good background checks, social engineering, training, threatening, etc., to deter smaller scale stuff. – Scott Whitlock Oct 17 '12 at 17:58
  • 2
    @ScottWhitlock yes, money is very different, so your banking comparison doesn't make any sense. Money does not duplicate the way code does. Obviously you can take security measures, but a level of trust is always required. – deadalnix Oct 17 '12 at 19:02
  • 3
    I agree with @ScottWhitlock - I had training at several places I wrote code (Apple, NASA, FBI) that included common-sense ethics stuff I learned in my BS degree, as well as signing NDAs, reminding folks what the penalties were for violating copyright, etc. At NASA nobody on my team could access the IT resources until they had followed the training on MTCR, which included videos. The machines were locked down with keyboard spies and reminders (every time you logged on) that everything you typed was recorded. – Fuhrmanator Oct 17 '12 at 19:42
  • 25
    @ScottWhitlock, you don't need to take the entire code base, just the important proprietary bit. Any decent dev can recreate what they've done a second time if they need to. The value is in the novel approach to solve the problem. That's what developers do, solve problems. Code is merely the recording of the solution. And the real IP, the solution, the intangible thought that solved the problem, is impossible to protect. If I'm working on the next big thing at Apple, they can't wipe my brain of knowing what it is... and if I know what it is, I can recreate it. – CaffGeek Oct 17 '12 at 20:36
  • 1
    +0 While I want to upvote this answer, there are ways to help the OP accomplish at least a little part of his goal. See some other answers for examples. – Phil Oct 17 '12 at 22:32
  • 1
    @ScottWhitlock you've obviously never heard of Samuel Slater. – electron_avalanche Oct 17 '12 at 23:13
  • 1
  • It is really the best answer. The main problem of IT projects is "how to make this work", not "how to prevent this from leaking". I have worked at many companies (more than 5), and after leaving I removed all the source, because the code is only part of the business, not the whole business. So if it is necessary to prevent the code from leaking, how do you prevent the team, the mathematical ideas, and other uncontrollable things from leaking? The answer is simple:

    Technically, there is nothing you can do.

    – pinocchio964 Oct 18 '12 at 12:17
  • 2
    Because it's possible to pick a lock, should you never lock a door? Just because you can't make something theoretically impossible doesn't mean you can't create obstacles that deter people. Defense contractors DO make it difficult to take their code home. – MetricSystem Oct 18 '12 at 16:01
  • @CaffGeek, That depends on the nature of the business you are working on. For most, there's nothing revolutionary about the code. Sure, you can re-create the code, but there's no way to do that as fast (nor as cheaply) as a thumbdrive transfer can. – Pacerier Oct 04 '15 at 20:35
70
  1. Make them sign a non-disclosure agreement.

  2. Only hire people you trust.

  3. Compartmentalize your code base. Use dependency injection so that you can give them requirements that, when finished, produce classes that fall right into place in the existing architecture, but they will not have access to the "complete picture", only loose pieces. Only senior, trusted people would have clearance to the "architectural glue" that makes it all work as a complete whole.
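
A minimal Python sketch of point 3 (the class and method names here are invented for illustration): the outside developer only ever sees the abstract contract and delivers an implementation against it, while the in-house "glue" wires the pieces together.

```python
from abc import ABC, abstractmethod

# The spec handed to the outside developer: only the contract, no internals.
class TaxCalculator(ABC):
    @abstractmethod
    def tax_for(self, amount: float) -> float:
        """Return the tax owed on the given amount."""

# What the outside developer delivers against that spec.
class FlatRateCalculator(TaxCalculator):
    def __init__(self, rate: float):
        self.rate = rate

    def tax_for(self, amount: float) -> float:
        return amount * self.rate

# The "architectural glue" kept in-house: it depends only on the
# interface, so the finished class drops into place via injection.
class InvoiceService:
    def __init__(self, calculator: TaxCalculator):
        self.calculator = calculator

    def total(self, amount: float) -> float:
        return amount + self.calculator.tax_for(amount)
```

Here `InvoiceService(FlatRateCalculator(0.25)).total(100.0)` returns `125.0`, and the implementer of `FlatRateCalculator` never needed to see `InvoiceService` or anything else behind it.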

  • 18
    I wouldn't recommend the last one, because it would prevent any dev from doing smart stuff. Any competent dev would run away from such a situation. The conclusion is not hard to deduce. – deadalnix Oct 17 '12 at 16:48
  • 20
    @deadalnix: #3, if done properly, makes code much easier to maintain; it forcefully avoids coupling. I'd say this approach is more common on very large projects. That being said, I don't think the OP's company is at the scale needed to justify it. – Brian Oct 17 '12 at 17:05
  • 15
    I have nothing against separating a project into different parts. I have something against hiding the larger picture. This is making your devs blind on purpose. Do you think ANY dev who can choose his/her work will accept to play this stupid game? – deadalnix Oct 17 '12 at 17:33
  • @deadalnix Maybe not all developers will have to work that way, only those who don't have enough seniority or haven't earned enough clearance. – Tulains Córdova Oct 17 '12 at 17:57
  • 13
    @deadalnix given enough money yeah. I worked for a DoD contractor once. Closed environment, two separate machines (one with Internet Access for research one without for coding), and it was obvious that what we were coding wasn't the full product. Everything you could complain about a software project was there. But boy did they pay! It was soul crushing but I looked at my bank account every week and continued on. – Michael Brown Oct 17 '12 at 18:28
  • I have to disagree with #3 - the exorbitant architectural costs of compartmentalizing your codebase to protect against what is a people problem are not worth it. In fact, I would say that even splitting up your codebase based on VCS permissions is too expensive to justify. Look at the additional cost in unnecessary merging, obfuscation for devs who are just trying to be productive, and infrastructure to support this notion. It's why OSS will always win. – deleted_user Oct 17 '12 at 20:33
  • 9
    @stackmonster An example: the developer of an Eclipse plugin doesn't have to have access to the Eclipse code, only to the plugin interface definition. I suspect that in a company like Apple, for instance, very few developers have access to the whole code base of iOS. – Tulains Córdova Oct 17 '12 at 20:37
  • @user1598390: if you needed access to eclipse's source code to write plugins, it would be an appalling plugin architecture. The same with applications on iOS. That isn't solving the same problem, that's solving the problem of it being completely unworkable to develop any other way on systems that large with requirements like that. – Phoshi Oct 17 '12 at 21:36
  • @user1598390 I'm pretty sure every Apple dev has access to several BSD distros and the Mach kernel. – deadalnix Oct 17 '12 at 22:30
  • It's a legal issue indeed, let them sign an agreement. – Carra Oct 18 '12 at 07:11
  • The third option is also a way to easily introduce bugs in other sections of the code by not knowing how it's all going to work together. A dev needs access to the entire code base to be effective in almost any situation. – Randy E Oct 18 '12 at 14:19
  • @randy-e Do you think that applies also to humongous projects like OS X, Linux, etc.? Does the developer of functionality "A" need to study and know the other hundred million lines of code? Or does he/she have to know only the specifications of his/her part of the project? – Tulains Córdova Oct 18 '12 at 14:51
  • "...in almost any situation." The vast majority of programmers aren't working on projects like an entire operating system...common sense should apply to the term almost any situation. – Randy E Oct 18 '12 at 14:53
  • @randy-e I guess in very small projects. – Tulains Córdova Oct 18 '12 at 14:54
  • Even on large projects, developers should still have access to the entire source, because my code could be interacting differently than expected with another person's code, but I wouldn't know that and wouldn't be able to know the best way to fix it without being able to see their code. – Randy E Oct 18 '12 at 14:58
  • 1
    @RandyE That's what object-oriented design is for. You design interfaces and abstract classes other people have to implement or extend in order to work with the rest. You only have to comply with a contract, i.e. method signatures, interfaces, etc. It's the same way automakers work with part-makers: they issue engineering specifications. The part-maker doesn't need to know which exact car model will be using a certain spark plug or clutch; they only have to follow the specifications and quality standards. The automaker doesn't need to leak model designs to part-makers. – Tulains Córdova Oct 18 '12 at 15:05
  • @MikeBrown, Cool =) Btw are they paying way above Google rate? – Pacerier Oct 04 '15 at 20:39
45

I love the idea that there might be a "clever" trick that "we" as developers would be baffled by, given that every developer tool ever written was written by a developer, and all that.

Your boss's biggest problem is naivety with a dash of paranoia. I'm being polite there. Really really polite.

If you really want a shopping list of things to keep your code proprietary, just implement the following:

  1. Disable USB and other I/O on all company computers. This can be done through most enterprise antivirus suites or similar.

  2. All developer machines to be desktops or towers. No laptops.

  3. Do not allow any machine to connect to the internet. No web, ftp, email, IM, no internet. Cut the wires.

  4. No remote working/access (sort of covered by no internet, but some smart spark might suggest a VPN)

  5. No mobile phones or other electronic devices to be taken into the secure "development" room.

  6. Configure all printers to print a large visible watermark on every page, front and back.

  7. Bag searches both in and out. Search for handwritten notes and anything printed on company printers (anything: they might have hidden code in an image with steganography!), plus any electrical or electronic devices. In fact, it's probably best to keep bags out of the secure area entirely and have developers wear clean suits, the sort of thing you'd see in drug dens and chip fab plants.

  8. Servers should be equally isolated; backups should be encrypted, and only your boss should know the password to restore from them.
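
The steganography aside in point 7 is not far-fetched, by the way. Hiding bytes in the least significant bits of pixel data takes a few lines of code. A toy sketch over a raw byte buffer (a real image's pixel array works the same way; function names are made up for illustration):

```python
def embed(pixels: bytearray, secret: bytes) -> bytearray:
    """Hide each bit of `secret` in the least significant bit of a pixel byte."""
    out = bytearray(pixels)
    for i, byte in enumerate(secret):
        for bit in range(8):
            idx = i * 8 + bit
            # Clear the LSB, then set it to the current message bit.
            out[idx] = (out[idx] & 0xFE) | ((byte >> bit) & 1)
    return out

def extract(pixels: bytearray, length: int) -> bytes:
    """Recover `length` hidden bytes from the low bits."""
    secret = bytearray(length)
    for i in range(length):
        for bit in range(8):
            secret[i] |= (pixels[i * 8 + bit] & 1) << bit
    return bytes(secret)
```

Each carrier byte changes by at most 1, which is invisible to the eye in an image, and the message round-trips cleanly through `extract`. Which is to say: the watermarked printouts in point 6 are the least of your worries.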

Ian
  • 5
    And even then, good old memory works great. And without cavity searches... – CaffGeek Oct 17 '12 at 20:44
  • 11
    Hah, you know I started adding a 9) developers might remember something, but couldn't think of a way to fix that besides filling the developers with vodka :) – Ian Oct 17 '12 at 20:51
  • but I'm smarter frunk, or at least I deel that way. ;) – CaffGeek Oct 17 '12 at 21:17
  • 25
  • No outside phone calls allowed from landlines etc (they could tell someone the code over the phone).
  • Developers must work in a windowless room so they can't shine a torch out the window and transmit the code via Morse code.
  • Developers must be neuralized after completing each day of work.
  • – zuallauz Oct 17 '12 at 21:19
  • This is the kind of ridiculous security that only top secret government data needs. Although I think it's reasonable to have a government programmer in the NSA doing that, I can't imagine any civilian company needing to go that far. +1. – Linuxios Oct 17 '12 at 21:37
  • 3
    For number 9, you can use Jason Bourne style behavior modification training. Everyday before employee goes home. – hrishikeshp19 Oct 18 '12 at 01:39
  • 5
    continuing zuallauz's list, #12 developers go to work nude to show they don't tattoo code. #13 developers do not brown-bag lunch lest they arrange their potato chips in code instead of eating them. – zundarz Oct 18 '12 at 04:31
  • you forgot brainwipe at 5pm every day. – Johnno Nolan Oct 18 '12 at 10:17
  • 8
    why send them home? Keep them in the building, food and drink passed in through an airlock system only. – jwenting Oct 18 '12 at 11:28
  • 17
  • Don't ever run software written by developers. It can contain malicious code that steals confidential data and sends it outside the company
  • – Adam Dyga Oct 18 '12 at 14:16
  • I'm baffled by the "clever" asymmetric encryption algorithms used on the web, and I am unable to bypass them despite their being just developer tools written by "us" developers. Care to enlighten me? – Allon Guralnek Oct 18 '12 at 18:35
  • Allon: I don't really understand how the software that grants Root to my Android phone works, but a developer did and I used the software they wrote to bypass security on my phone. For your example, knowing how encryption works (probably) won't allow you to bypass it. That wasn't the point. – Ian Oct 18 '12 at 20:32
  • 4
    I heard a story told of a great thief. He went to major trade shows, carrying nothing. On leaving the first day of one particular trade show, he told a guard that he was indeed a great thief, having stolen untold billions. The guard searched him, but found nothing. Again and again the thief returned, telling tales of the great wealth he had stolen, and again and again the guard found nothing on him. Perplexed, the guard asked the thief how he carried out his great thefts. He replied, "I steal ideas." -- You do not need to copy code. You need only the ideas contained within. – greyfade Oct 20 '12 at 09:07