16

Do experienced programmers actually ever use debuggers, and if so, under what circumstances? Although in my answer to that question I said "months", I probably meant "years" - I really don't use a debugger. So my specific, answerable question is: under which circumstances would you, as an experienced programmer, use a debugger?

user
  • 2,200
Neil Butterworth
  • 4,066
  • 3
  • 23
  • 28
  • 14
    It's like asking if experienced programmers use keyboards... I don't understand what experience has to do with it - do you think they're gods who create perfectly working code, without errors, from the beginning? And even if so, what does it mean to you - will you stop using a debugger when you need one and start saying: "I don't use a debugger, so I'm a real programmer"... :) BTW, I doubt any professional will answer such a question... –  May 21 '11 at 16:53
  • 3
    @Wooble: the basic question "do experienced programmers use debuggers" is a good one. It actually surprised me that it set off a mini holy war. – Kevin May 21 '11 at 17:09
  • Neil - you keep saying you have no desire to use a debugger. Is this because you don't write bugs in the first place, because you instrument your code in ways that make a debugger unnecessary, or because you write your code in a way that makes "debug by reading source" easier/faster than using a debugger? Or is it the real-time situation Luke Graham mentions below? – E.Z. Hart May 21 '11 at 17:15
  • 20
    Real programmers, of course, use butterflies – Rein Henrichs May 21 '11 at 18:49
  • 1
    @Rein Henrichs: Nicely put. @OP: See also: The Story Of Mel (Some of you might have read this before; I know I have ;)) – Piskvor left the building May 21 '11 at 19:46
  • 1
    I don't know why this was closed, but since I'm not 3k I can't vote to re-open. I won't add another me too answer, but I just use the debuggers that are built into my ide's, so I can use breakpoints when needed. –  May 21 '11 at 21:56
  • 4
    Most existing debuggers are old-fashioned, have crappy interfaces, and require the programmer to know and understand concepts and paradigms that are difficult to master and, nowadays, not fair to expect most programmers to use or know. As a result, most modern, experienced programmers go to great lengths to learn the skills necessary to write the kind of code that rarely ever has to be debugged in a debugger, to avoid the pain of the experience. So "yes they use it" and "as little as possible" – blueberryfields May 21 '11 at 22:00
  • 1
    @Rein H. -- special Badge for you ... something about linking to a HILARIOUSLY FUNNY cartoon, while refraining from using that in an "answer", while voting the question down. – Smandoli May 25 '11 at 13:31
  • 1
    I like special badges. You forgot "voting to close". – Rein Henrichs May 25 '11 at 15:41
  • I've been developing since the mid '80s and using a debugger has always been rare for me. The only one I used semi-regularly was SoftICE, and looking back, it helped in many cases, but it was a crutch that delayed the reasoning about the problem that actually led to the solution. I see a developer depending on a debugger as a problem. – janm May 29 '11 at 12:15
  • 7
    Experienced programmers who "don't use debuggers" are probably thinking in terms of gdb/SoftICE, and have never used an actual integrated-debugger (and probably don't use an IDE for that matter). They are so far behind the times it's painful. – BlueRaja - Danny Pflughoeft May 31 '11 at 21:23
  • 3
    Perhaps you should have asked if good programmers use debuggers. The best programmers I've seen spend a fraction of their time thinking about what the nature of the bug is and then jump immediately to the correct piece of code and fix it. They almost never use a debugger. – snakehiss Jun 12 '11 at 21:06
  • The basic problem is a computer never tells you anything unless instructed to. Using a debugger you can learn the things the programmer did not explicitly ask to be told. –  Aug 12 '12 at 17:09
  • @dietbuddha it seems you never worked in code someone else wrote. BTW it seems those "good" programmers also never worked on other code than their own. – Pablo Ariel Jan 07 '13 at 18:25
  • @PabloAriel I've worked on plenty of code that I didn't write. I've coded in many languages, scripting and compiled, and I've found that simply reading the code and understanding the intent usually leads me to the bug - not always, but most of the time. Our automated tests usually tell me everything I need to know to fix a bug. – snakehiss Jan 08 '13 at 22:01
  • @dietbuddha there are things that cannot be easily fixed without a debugger, especially if you have a deadline for a project and you just landed in a company where all the code is a mess and there are not enough automated tests to cover all possibilities. Plus there are things that can't be automated; you can be working on graphics rendering software or even simulations where all your perceptions are required to test everything you can. Of course you can easily guess the fix for many of them but you can't automate, and there are cases where you can automate but it is not easy to find a fix. – Pablo Ariel Jan 10 '13 at 03:35
  • @PabloAriel That's why I said most of the time and not all the time. I do use the debugger sometimes, but not often. To me it is more a question of what your first approach is. Do you immediately jump into the debugger, or do you already know (within a few lines) where the problem is because you've read and understand the code. – snakehiss Jan 10 '13 at 03:49
  • @dietbuddha I understand, but I believe the more you code, the more chances that you will need a debugger. Even if you are good or know the code well enough to look right into the function or code that is failing, the debugger can make finding things faster, as do Subversion, the code analyzer, etc. Of course, depending on the language and the kind of application you make, you may need it (or care) less. Maybe if your application makes use of complex deterministic math functions or if you perform custom memory management, then the debugger can help much more, and it would be smart to use it often. – Pablo Ariel Jan 12 '13 at 19:59
  • 1
    @BlueRaja-DannyPflughoeft That's an incredibly opinionated statement. I used to use "integrated" debuggers all the time and prefer lldb over Eclipse any day of the week. I'm 23; am I "so far behind the times"? – Qix - MONICA WAS MISTREATED Sep 18 '16 at 18:43

15 Answers

46

I would say that not using a debugger is a sign of inexperience. Stepping through code line by line is the best way to trace the flow of execution.
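As an illustration of what "stepping line by line" buys you, the tracing a debugger automates can be sketched programmatically in Python with `sys.settrace` (the `total` function here is a made-up example):

```python
import sys

executed = []  # line numbers hit inside the traced function

def tracer(frame, event, arg):
    # the interpreter calls this on every 'call' and 'line' event;
    # returning it from the 'call' event enables per-line tracing
    if event == "line":
        executed.append(frame.f_lineno)
    return tracer

def total(values):
    acc = 0
    for v in values:
        acc += v
    return acc

sys.settrace(tracer)
result = total([1, 2, 3])
sys.settrace(None)

print(result)              # 6
print(len(executed) >= 4)  # True: several lines were stepped through
```

A real debugger wraps exactly this hook in an interactive UI, which is why stepping reveals the actual control flow rather than the flow you assumed.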

mikerobi
  • 283
  • 30
    strange then that after over 30 years of programming in assembler, Fortran, C, C++ etc. etc. I feel no desire to use one. –  May 21 '11 at 16:56
  • 60
    Doing something for a long time doesn't necessarily make you good at it. – ceejayoz May 21 '11 at 17:03
  • 31
    Not being able to use a debugger is a sign of inexperience. Understanding the flow of a program by just reading code isn't. Of course, experienced programmers will need the debugger once in a while, but if you can read the code, there is no need to, and it won't make the process of debugging any faster either. – GolezTrol May 21 '11 at 17:03
  • 1
    @GolezTrol, I often wonder at the complexity of software from people who claim debuggers don't make debugging faster. Being able to understand the flow doesn't make it fast to determine which of hundreds of potential flows into one function among millions of lines of code is triggering a certain condition you didn't expect. – Karl Bielefeldt May 21 '11 at 20:27
  • 10
    @Karl Bielefeldt: Let me name a couple of famous examples of programmers who don't use debuggers for debugging. Linus Torvalds, author of Linux. Larry Wall, author of Perl. Complex enough software for you? – btilly May 21 '11 at 21:14
  • 9
    @Neil: how much of your time do you spend working on your own code, and how much maintaining code written by other people? And in particular, how much maintaining code written by other people who should never have been allowed anywhere near a programming language? – Carson63000 May 21 '11 at 21:44
  • 1
    @btilly, I'm not arguing that debugging complex software isn't possible without a debugger, I'm arguing that it isn't always as fast. – Karl Bielefeldt May 21 '11 at 21:56
  • 3
    +1 - you need to see the data the code works upon. –  May 21 '11 at 22:55
  • 3
    @Karl Bielefeldt: Obviously debuggers make debugging faster. However they also encourage missing the forest for the trees. Thus keeping people from noticing bigger design mistakes. See http://lkml.indiana.edu/hypermail/linux/kernel/0009.0/1148.html for Linus pointing that out. Also there are often better ways to do what debuggers do. See http://lkml.indiana.edu/hypermail/linux/kernel/0009.1/1307.html for IBM's experience. And debuggers discourage certain programming techniques. If you like closures or macros, you probably don't like debuggers much. – btilly May 22 '11 at 00:38
  • 4
    This is very weird, mikerobi: The longer I program the less I use debuggers. It's reached the stage that for several languages I use I'm not even sure if they have a debugger. – JUST MY correct OPINION May 22 '11 at 07:57
  • I try to write my code so that I don't need a debugger (as they can be tricky to insert into a complex time-dependent remote environment). I keep control flow simple. I add plenty of logging statements which I can switch on and off at runtime. I minimize the amount of work I do on one line. I work in an environment that gives informative stack traces when I do hit a real problem. With that, the lack of a debugger isn't a problem; unexpected failures report what happened and why. – Donal Fellows May 22 '11 at 11:06
  • Sometimes, there is no debugger to be used. Other times, a debugger can cause more problems than it solves. One example of the latter is an embedded program; halting the CPU can cause interrupts to be missed and controlled hardware to fail. At the very least, it can add overhead that causes timing problems. – Mike DeSimone Jun 06 '11 at 04:54
  • @Neil - how do you debug binaries for which you don't have the source? – Barry Kelly Jun 16 '11 at 10:52
  • Lol, I want to see these newbies trying to "understand" the 7 million lines of C++ source code of the project I'm working on. Actually I have a coworker who is trying to fix a memory leak without using the debug build! Hilarious!! – Pablo Ariel Jan 07 '13 at 18:30
  • -1 This is wildly opinionated and baseless. – Shaun Luttin Jul 12 '17 at 15:51
  • @MikeDeSimone you of course have a point, but as with anything, an experienced programmer should know when to use a tool - not just throwing it at every problem they encounter. Obvious example where I've used it a lot is a program that bases its actions on events read from a socket. While the program expects one thing, the bug might lie on the other end of the TCP connection. With a debugger you discover this pretty much immediately. – EvenLisle Feb 22 '18 at 09:14
28

I use the debugger often, because I work on a large system and therefore I suck. http://steve-yegge.blogspot.com/2007/06/rich-programmer-food.html

No matter how short and frequently-read your code is, there is always going to be a possibility that it will have bugs. http://googleresearch.blogspot.com/2006/06/extra-extra-read-all-about-it-nearly.html

To err is human, and one can never prove that a program is correct, so why not use tools such as debuggers and automated testing to aid ourselves in this difficult business?

If the code is short enough, then simple tests will do. Also, if it is short and you know the nature of the bug, then reading the code can be enough. However, once the code base is large, involves several languages mixed together, plus 3 tiers, then you simply must have good test coverage on many levels plus a very good debugger - otherwise you will be wasting a lot of time.

So, when do I not need a debugger?

I am not the smartest coder, nor the most experienced, but still, sometimes I do not need to use the debugger. That is when:

  • The code is mine or well-written AND
  • It is written in a readable language AND
  • The overall project is small.

When do I rely on a debugger heavily?

  • Short Answer: often.
  • When an application crashes. Particularly when it is deployed. Having VS2010 installed on that computer can make a difference between "Unknown Error" and FileNotFoundException.
  • When a 3rd party library crashes or misbehaves.
  • When the code is poorly written. Particularly if the same file was touched by 10 different people in the last 10 years, 7 of which are no longer with the company.
  • When the project is large
  • When the code is rather monolithic.
  • When there are several tiers (GUI, SQL, BL) involved.

Note that "debugger" can refer to more than one tool. I use the Visual Studio debugger, the SQL debugger (mostly for stored procs) and the SQL profiler as well (to figure out which SPs are being called). Would I need tools of this caliber if I were writing a quick sysadmin-ish Python script? No. If I made my own little GUI-based tool? Depends. If it is .Net WinForms - probably not. If it is WPF - yes.

What defines a "real" programmer anyway? One that is quick? Knowledgeable? Good at algorithms? Writes good documentation? Just when exactly does one graduate into this new title? When does one cross the magical line?

I would say that a programmer who has not gotten his/her hands dirty in an existing 100+ man-year effort has not had a chance to be humbled by the complexity and his/her own limitations (as well as frustrated with the code quality).

I personally try to use the best debugger available to me, and I tend to use it often. If a task is simple enough and does not require a debugger - I do not use it then. It does not take too long to figure whether I need one or not.

...

Now, in theory I could read the code base for so long that I would just get it. However, a hands-on approach works best, plus I often want to re-write the stupid code that I am seeing. Unfortunately it would take me 10+ years to clean up the code base that I am in. So, using a debugger is an obvious first step. Only when I find out just which one of 5 million lines of code is acting up do I scan the file up and down to try to figure out what that class is doing.

Job
  • 6,459
  • +1, excellent answer, I particularly agree with the "when there are several tiers involved" aspect, that's one that is seldom mentioned by the "just read the code and find the error" advocates. – Carson63000 May 21 '11 at 21:46
  • Glad you could read the whole thing. – Job May 21 '11 at 22:53
  • +1 for great answer and for examining the definition of a "real programmer". Use of this phrase made the OP sly, interesting, and potentially inflammatory (because of denigrating implication or innuendo). – Smandoli May 25 '11 at 13:28
  • 1
    "one can never prove that a program is correct" That's not true. – GManNickG Jun 09 '11 at 11:05
  • 1
    @GMan, please do elaborate on your statement. As I have learned, many previous attempts to prove correctness of short snippet of code for a specific language have failed, e.g. several bugs have been found after the proof was completed (by a professor specializing in such proofs). Some very trivial programs could be proven to be correct, I suppose. I am curious to find out your angle here. – Job Jun 09 '11 at 18:12
  • @Job: Take a look at http://en.wikipedia.org/wiki/Formal_verification#Industry_usage – GManNickG Jun 09 '11 at 18:43
  • @GManNickG that's for algorithms, in which case yes there is a way to formally verify them. Job is talking about "correct" being "bug free" in this context, in which case you run into things such as Turing's halting problem. – Qix - MONICA WAS MISTREATED Sep 18 '16 at 18:49
  • @Qix: "that's for algorithms" is not a meaningful distinction. There are algorithms that can't be shown to be correct because it might be impossible to prove termination. Everything you've said can be formalized, and now you're back in "algorithms", whatever you meant. If you prove something correct yet it has bugs, either your proof was wrong or your formalism didn't capture the actual semantics. And yes, halting problem is nothing new and changes nothing about what I said. The space of programs is large, the space of programs humans care about is small. Most aren't special snowflakes. – GManNickG Sep 18 '16 at 19:01
  • @GManNickG I mean standardized crypto algorithms and the like - algorithms that are formalized. Did you read the link you posted? There definitely is a distinction in CS. There are other great answers in the StackExchange universe talking about this exact question. – Qix - MONICA WAS MISTREATED Sep 18 '16 at 19:04
  • @Qix: The crux of it is, "define algorithm". I don't find your distinction between "algorithms" and not-algorithms-but-algorithms useful. You can formalize anything. I'm not sure where the disagreement lies so I don't know what point you're trying to make. Formalizing software and proving it correct is an entire field of research, as far as I can tell you're arguing it's...not? – GManNickG Sep 18 '16 at 19:59
17

"I don't like debuggers. Never have, probably never will." — Linus Torvalds

On the other hand, he doesn't have a Stack Overflow account, so I'm not sure if you are interested in his opinion :)

Adam Byrtek
  • 1,104
  • 3
    Not many of us are Linus Torvalds; the rest of us mere humans need the debugger. – Nodey The Node Guy May 21 '11 at 18:54
  • 7
    kernels don't lend themselves well to debuggers. –  May 21 '11 at 22:56
  • 7
    Yeah, kernel programming is a different field than userspace programming. I don't typically agree with Linus's opinions for userspace, but they are definitely respectable when dealing with kernelspace. – alternative May 22 '11 at 00:22
  • 17
    "I don't like debuggers" doesn't mean "I don't use debuggers." What Linus actually said was "I don't like debuggers. Never have, probably never will. I use gdb all the time, but I tend to use it not as a debugger, but as a disassembler on steroids that you can program." (I know some will try to twist that to mean that Linus doesn't use a debugger, but that's not accurate.) – Kristopher Johnson May 24 '11 at 21:16
  • 7
    It seems like Linus Torvalds and I never agree on anything. – BlueRaja - Danny Pflughoeft May 31 '11 at 21:14
12

So my specific answerable question is under which circumstances would you, as an experienced programmer, use a debugger?

  • When you're unable to "debug" by reading your code.
  • When you're unable to predict what values certain variables have at a given time.
  • When your mental model of your code does not fit the output given by your code.

Edit:

I had the fortune/misfortune of not knowing how to use a debugger for much of my programming journey, so in the past I was forced to debug without one. However, after learning to use a debugger, I've become 100x more productive at finding bugs.
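One way to get the productivity of a debugger without changing normal runs is to make the breakpoint opt-in. This sketch assumes Python's standard `pdb` module and a hypothetical `normalize` function, gated by a `DEBUG` environment variable:

```python
import os
import pdb

def normalize(scores):
    top = max(scores)
    if top == 0 and os.environ.get("DEBUG"):
        # drop into the interactive debugger only when DEBUG is set,
        # so ordinary runs and automated tests are unaffected
        pdb.set_trace()
    return [s / top for s in scores]

print(normalize([2, 4, 8]))  # [0.25, 0.5, 1.0]
```

Run with `DEBUG=1` to stop at the suspicious condition; without it, the guard is inert.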

Adam Lear
  • 32,039
Darknight
  • 12,199
  • 1
  • 39
  • 58
  • +1 for "When your mental model of your code does not fit the output given by your code" – user Aug 12 '12 at 15:40
8

To give a slightly different perspective from the current answers: as an embedded software engineer working on systems that often have a real-time component, I rarely use a debugger.

On occasion a debugger can be an amazing tool and whenever I am able to build and run code on a desktop then I would always use a debugger.

On chip, with real-time constraints, there is a heavy burden associated with trying to use a debugger. As soon as you pause execution you are likely to upset, possibly fatally, the timing of the rest of the system. Generally on chip, printf in non-critical code and IO waggling in time-critical code are the best and actually simplest tools. It's not as good as a debugger, but it's much cheaper to get working with a real system.

Luke Graham
  • 2,393
  • 1
    you might want to investigate hardware-based debugger boards – Steven A. Lowe May 21 '11 at 18:21
  • @Steven thanks; unfortunately while some of the systems I work on have suitable hardware support, others do not. While we generally have the option of a logic analyser this tends to be even more expensive in terms of time. – Luke Graham May 21 '11 at 18:44
  • I'm the exact opposite. I use a debugger a lot more often on embedded systems. I agree about it upsetting the timing, though. It takes a fair amount of effort to filter out and/or minimize the changes caused by putting a debugger in the loop. – Karl Bielefeldt May 21 '11 at 20:34
7

I think experienced programmers almost universally use debuggers when they are needed. What better way to track down a bug than to actually follow the execution of the code?

Are you under the assumption that the Skeets of the world don't make mistakes or just know everything? All but the most trivial programs behave in unexpected ways under some circumstances. It is a given that issues are going to have to be investigated. So the choices are: use print statements, on one end of the spectrum; examine what happened, post mortem, on the other; or look right in the middle as the code executes and figure out what is going on.

Maybe a better way of thinking about it is that experienced programmers know when to use a debugger. In code with few dependencies, looking at a stack trace is probably enough to figure out what is wrong. But there are complicated scenarios where your code is working with other code, and you need a debugger to look at the stuff you didn't write.

  • 4
    Well, this is exactly what I am trying to investigate - I'm an extremely experienced programmer and I never use one. –  May 21 '11 at 16:49
  • 5
    @neil, maybe you have no need. Rest assured, the time will come where the debugger will be the simplest way to get to the bottom of an issue, whether or not you actually end up using one.... – hvgotcodes May 21 '11 at 16:56
  • I can read stuff I didn't write too. And if I can't, it is useually because it is bad code. In other cases, I use the debugger. – GolezTrol May 21 '11 at 17:05
  • If the language you use supports exceptions, and if you're using them + a logging framework appropriately (e.g. log4j or something like that) you'll always end up with a stack trace pointing to the line of your error. 99% of the time it's a null pointer exception where you didn't expect it. What else is a debugger going to tell you? Now, when I was programming in c, there were things that you simply couldn't find without a debugger (e.g. stack corruption). But those types of things just don't happen in high level languages anymore. – Kevin May 21 '11 at 17:07
  • @kevin. right, NPE at line 34 of class xyz tells you everything... like what was null. oO. And NPEs are not in the class of harder issues that need to be resolved. – hvgotcodes May 21 '11 at 17:13
  • @hvgotcodes: what happens in java that you think is in a class of "too hard to solve" except by debugger? My experience is that the debugger is no help on the truly hard problems (e.g. memory leaks), and it's unnecessary on the "easy" ones (e.g. NPE). – Kevin May 21 '11 at 17:58
  • 1
    @kevin, right, i think there is a class of problems between those two where the debugger is the most natural way to get to the bottom of an issue. Maybe I want to see the dynamic properties put on an object in a dynamic language framework like grails. Maybe I want to see exactly where something I think is not null is made null (NPE tells you where the exception is, not why the thing is null). Maybe I want my debugger to pause on exception so I can see what combination of code caused an exception, not just that it occurred in the stacktrace. – hvgotcodes May 21 '11 at 18:03
  • @hvgotcodes: Ok, usually in those situations you look at the stack trace and it gives you hints, and then you look directly at the surrounding code and you can fill in the details by applying simple logic. I'm not saying you can't do the same thing with a debugger. – Kevin May 21 '11 at 18:58
  • @kevin i think you are making the assumption that 'filling in the details' is easy, being a product of 'simple logic'. I don't think that is a given. – hvgotcodes May 21 '11 at 21:45
4

I don't, and I've been programming for over 10 years. I used to, when I programmed in C/C++. Now I program in Java. The truth is that if you're doing logging correctly, you'll end up with a stack trace, which is enough for most skilled developers. Also, if you're writing (good) unit tests and functional tests, that eliminates a whole class of bugs.
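As a sketch of the "correct logging gives you the stack trace" point - in Python rather than Java, and with a made-up `parse_port` function - `logging.exception` records the full traceback without any debugger involved:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("app")

def parse_port(raw):
    try:
        return int(raw)
    except ValueError:
        # logging.exception records the message *and* the full stack
        # trace, so the log alone often pinpoints the failing line
        log.exception("could not parse port %r", raw)
        return None

print(parse_port("8080"))  # 8080
print(parse_port("oops"))  # None, with a traceback in the log
```

The equivalent in Java would be passing the caught exception to the logger (e.g. log4j's `logger.error(msg, e)`).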

Kevin
  • 1,341
  • If it clarifies more, I know a lot of java programmers that DO use a debugger. They're mostly right out of school. – Kevin May 21 '11 at 17:01
  • 1
    stacktraces do not show data - you must add that information yourself - but then they are pure gold. –  May 21 '11 at 23:00
  • 1
    @Thorbjørn: They can show data, actually: see Python's cgitb module, for example. (The CGI in the name is mostly vestigial, the original purpose of the module having been to present usable stack traces when a CGI crashed.) Of course, with that, you sometimes get so much data that it becomes difficult to navigate to the stack frame of interest. I love cgitb.enable(format='text') anyway, though. – SamB May 24 '11 at 22:00
  • I don't really use debuggers and I use C++.. – Nikko Sep 01 '11 at 07:06
  • @SamB Kevin talked about Java, which cannot to that –  Aug 12 '12 at 17:13
  • @Thorbjørn: Hmm, yes, I now see that Java keeps no reference to the requisite information in its StackTraceElements – SamB Aug 13 '12 at 18:48
3

Rarely.

Your methods should be small/simple enough to be compiled and run in your mind, and unit tests should cover the functionality. If you find a bug, write a test. Run it, fix it.
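The "find a bug, write a test, run it, fix it" loop might look like this in Python's `unittest` (the `median` function and its earlier even-length bug are hypothetical):

```python
import unittest

def median(values):
    # fixed implementation; an earlier version (hypothetically)
    # mishandled even-length input, which is what the test caught
    s = sorted(values)
    n = len(s)
    if n % 2:
        return s[n // 2]
    return (s[n // 2 - 1] + s[n // 2]) / 2

class MedianTest(unittest.TestCase):
    def test_odd_length(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length(self):
        # the regression test written when the bug was found
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

if __name__ == "__main__":
    # exit=False so the runner doesn't terminate the interpreter
    unittest.main(argv=["median_test"], exit=False)
```

The failing test then stays in the suite, so the bug cannot silently return.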

I only tend to use the debugger when I've got unexpected behaviour from untestable code, like the ASP.NET framework.

  • 3
    there's some real hater noobs in this thread... –  May 21 '11 at 16:54
  • 2
    NO reason to down vote this - he's right. – wadesworld May 21 '11 at 16:55
  • 11
    -1 because this claim is like saying the way to make money at Vegas is to just win every hand. That doesn't reflect the reality of the situation, and the claim that all code will be simple only holds for small isolated problems. Plus, the "run it, fix it" claim completely ignores how you go about fixing it. I was going to let it slide, but then insinuating that all those who disagree are noobs makes it worth downvoting. – whatsisname May 21 '11 at 18:35
  • 2
    -1: "Your methods should be small/simple enough to be compiled and run by your mind" is disconnected from reality. That's like saying a function that is longer than 20 lines is too long. Nonsense. – John Dibling May 24 '11 at 19:37
3

In Smalltalk, I develop almost entirely in the debugger:

  1. Write a test that I know will fail.
  2. Run the test. When it fails, the debugger pops up.
  3. Write, in the debugger, the code necessary to make the test pass.
  4. Resume execution.
  5. If I get a green light, go to step 1 with a new failing test. Otherwise, in the debugger find out what I did wrong and fix it.
Frank Shearar
  • 16,683
3

Who cares? What I want to know is: will using a debugger prevent me from being a better programmer in the long run? Maybe debuggers were of lower quality when many experienced developers started, so they were a hindrance. Is it a crutch that prevents deeper understanding?

Some programmer, probably better than the rest of us, found a need for a debugger and built one (No idea who created the first one.). I'm sure they were more productive as a result of it. I doubt the motivation was to enable lesser mortals to write code.

JeffO
  • 36,816
2

I use a debugger when I need to. That is not daily, but it does occur occasionally. It is sometimes better to step through the code to see what exactly happens.

I must admit I use debuggers less and less. I've been developing in Delphi for over 10 years. I also write stored procedures in PL/SQL. For the last couple of months, I've been a PHP developer too.

I mainly use the debugger if I find a piece of obscure code that was written years ago and I need to modify it. It sometimes helps to find out the exact way a program works if it is hard to read the code. In PHP that is hardly ever necessary, but in Delphi, which is event based, it sometimes helps when you've got a complex framework.

But as you say, using the debugger is an exception. Most problems are solved by just reading the code and fixing any mistakes you (or someone else) made.

But that goes for stepping through code. I do quite often use the call stack when an exception occurs, and I occasionally put a breakpoint somewhere to inspect a variable. But nearly always in a piece of code that needs a thorough refactoring anyway.

GolezTrol
  • 201
2

I occasionally code with no debugger, but only when forced to at gunpoint, i.e. legacy embedded gunge on an 8051 or Z80.

IMHO, you need both a debugger and logging on any complex job. One is not a substitute for the other. A logging system cannot help if the app fails inside a driver, for example, where the only thing the code can do is interact with hardware and set a semaphore.

A debugger cannot help with a system error where the apps are working fine according to the way you wrote them, but the system still doesn't work because of some intermittent comms protocol error.

So, I need the debugger to remove the stupid, glaring bugs and hardware cockups. I need good logging to catch intermittent system integration bugs.

I gotta have both - I need all the help I can get!

BЈовић
  • 14,031
  • 8
  • 62
  • 82
1

I only use a debugger when these steps fail:

  1. Get the error reproducible. Think. This is often all that is needed.
  2. Check any stack trace and logs.
  3. Add more logging around the offending code.

These steps take care of 95% of all cases. That means I rarely use a debugger, and when I do, it tends to give me too much information and I get bogged down in unrelated details. This is especially true when working on a multi-threaded, real-time system.

So judiciously placed logging statements go a long way.
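Step 3 above - adding more logging around the offending code - could be sketched like this in Python (the `apply_discount` function and the `billing` logger name are invented for illustration):

```python
import logging

# turn on DEBUG only while hunting this bug; drop back to INFO after
logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("billing")

def apply_discount(price, rate):
    # entry/exit logging around the suspect code shows the actual
    # inputs and result for the failing case, without a debugger
    log.debug("apply_discount(price=%r, rate=%r)", price, rate)
    result = price * (1 - rate)
    log.debug("apply_discount -> %r", result)
    return result

print(apply_discount(100.0, 0.25))  # 75.0
```

Because the statements log at DEBUG level, they can stay in the code and cost nothing once the level is raised again.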

1

Could it simply be that very experienced programmers are the same as very old programmers, and they learned to program, and formed their habits, back when debuggers were not always available, and sometimes not very good?

If you get really good at printf debugging (and back in the eighties, we didn't have much choice but to become really good at it), perhaps a debugger doesn't add that much.

0

It's a question of personal choice.

Honestly, I think debuggers are useful in certain situations where it helps a lot to know what's in your RAM at any given step of your program's execution.

The primary utility of a debugger is to halt a program without the program being designed to halt itself: this feature is quite important.

Apart from those two features, I don't think a debugger is really necessary; any complex program you make should have some sort of "verbose" mode, i.e. telling you everything it is doing with printf or std::cout: what choices it made, and a lot of other parameters.
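A minimal sketch of such a "verbose" mode - in Python here, with a hypothetical `fetch_records` function - is a flag-guarded print helper that reports each choice the program makes:

```python
import sys

# verbose mode is enabled by a command-line flag, not a rebuild
VERBOSE = "--verbose" in sys.argv

def vlog(*args):
    # report progress to stderr only when the user asked for it
    if VERBOSE:
        print(*args, file=sys.stderr)

def fetch_records(count):
    vlog("fetching", count, "records")
    records = list(range(count))
    vlog("done,", len(records), "records loaded")
    return records

print(len(fetch_records(5)))  # 5
```

When a user reports a problem, asking them to re-run with `--verbose` and send the output often answers the "bug or misuse?" question below.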

Just imagine you make a program and a user has a problem using it: how do you know whether he is using it the way it was designed to be used, or whether the thing he is complaining about might be a bug?

Debuggers are like power steering for your car: it's more comfortable to have it, but it won't make your driving any better.

Programming is about design and logic; the way tools can assist you in tracking stuff down doesn't make you a better programmer.

Plus, debuggers are useful for compiled languages, much less so for interpreted ones.

jokoon
  • 2,242