Ask HN: How often do you use a debugger?
4 points by phatbyte on May 23, 2012 | hide | past | favorite | 16 comments
I wanted to know if the HN community uses a debugger during development.

I have this feeling that developers who create native apps use one a lot, but web developers don't. From what I see, most web devs use print/echo/var_dump and so on.

Why aren't we all using debuggers?



I do software consulting, so I tend to use a variety of different environments. Likewise, my debugger use varies quite a bit between projects and platforms.

Python/Django: never. I don't think I've ever used a debugger here. I tend to write unit tests instead.

Ruby/Rails: same as Python, never. Unit tests all the way.

JavaScript/CoffeeScript: occasionally (once a week?). Usually in chunks of code that are hard to test (DOM manipulation, generally). Usually use Jasmine and write a unit test, especially if it's a logic-related problem.

C#: I've been working on a legacy codebase written in C#, and I tend to use the debugger quite liberally here. When I got there, the codebase was around 36kloc with 0 unit tests and many global variables. Most of the code was not written with testing in mind.

Java: Not often. It's been a while, but most of my Java projects were greenfield projects, so they were written with a JUnit suite from the ground up.

Objective-C: Occasionally. The last project I worked on was using Cocos2D and network stuff. Neither was particularly easy to write tests for, so the debugger was useful for inspecting state when things got weird. More often, I would add a few NSLog statements instead, to capture long-running state changes so that I could reconstruct a model of what went wrong and when.
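The unit-tests-instead-of-a-debugger workflow described above for Python and Ruby can be sketched like this (a minimal, hypothetical example; `parse_price` is an invented function standing in for whatever code you would otherwise step through):

```python
import unittest


def parse_price(raw):
    """Parse a string like '$1,234.50' into a float (hypothetical example)."""
    return float(raw.strip().lstrip("$").replace(",", ""))


class ParsePriceTest(unittest.TestCase):
    # Instead of stepping through parse_price in a debugger, pin the
    # suspicious input down in a test: the failure then reproduces itself
    # on every run, with no breakpoints to re-set.
    def test_plain_number(self):
        self.assertEqual(parse_price("19.99"), 19.99)

    def test_currency_symbol_and_commas(self):
        self.assertEqual(parse_price("$1,234.50"), 1234.50)
```

Run with `python -m unittest` and the once-off debugger session becomes a permanent regression check.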


For Ruby, never. Literally never since 2005. For Objective-C, quite often.

Why? I'm not sure. Maybe because Ruby has such good unit test support, and logging is pretty painless. It could very well be that I'm missing out by not using ruby-debug or some other tool, though.

In Xcode, the path of least resistance is the debugger. Maybe it's because I'm less familiar with Cocoa, but the code just seems more complicated and harder to trace from start to end. The debugger really helps with that. Even there, though, I use a lot of NSLog(). This helps when you have to track down a changing situation, and can't quite figure out when or how it breaks. Just log a bunch of stuff and let it rip.


For Java work I use the debugger a lot. When working on my Django or Python projects I use it a lot as well, since it's built into PyCharm. When writing code in Emacs etc. I tend not to bother.
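For what it's worth, Python's debugger doesn't require an IDE: the standard-library pdb module works from any terminal, Emacs included. A minimal sketch (the `average` function is a made-up example):

```python
def average(values):
    # Uncommenting the next line drops you into an interactive pdb prompt
    # right here, where you can inspect `values` with plain Python
    # expressions. Alternatively, run the whole file under
    # `python -m pdb script.py` with no source change at all.
    # import pdb; pdb.set_trace()
    return sum(values) / len(values)


print(average([2, 4, 9]))  # prints 5.0
```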

I also make heavy use of logging statements, but these are usually left in for debug builds and used to help find unexpected bugs rather than active debugging.
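That leave-the-logging-in approach looks roughly like this in Python (a sketch; the function and logger names are invented):

```python
import logging

# Release builds configure INFO; debug builds would use logging.DEBUG,
# which is what makes the log.debug() calls below visible.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("checkout")


def apply_discount(total, rate):
    # This statement stays in the code permanently; it only produces
    # output when the level is DEBUG, so release builds stay quiet.
    log.debug("apply_discount(total=%s, rate=%s)", total, rate)
    return total * (1 - rate)


print(apply_discount(100.0, 0.2))  # prints 80.0
```

Because the message is lazily formatted via `%s` arguments, the disabled calls cost almost nothing in release builds.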

Just recently I had to start using JavaScript, so I have spent some time figuring out Firebug and its debugger. I must say I was pleasantly surprised that it works well.


I use one about every other day and find it indispensable. We use Play! so compilation/deployment isn't an issue, but I still find the feedback loop between "I want to know the state of this variable", "I'll write a logging statement here" and "I'll reproduce the state I was in with the app" to be too long.

The only time I don't use one is when I need to debug an issue that occurs only on a remote server (our sysadmins don't like opening ports for that sort of thing).


I use a debugger when I code in C# (because the VS IDE is fantastic), but unfortunately I'm doing a lot of PHP these days. I've never found a debugger for PHP that I really enjoy using. XDebug + print_r() wrapped in <pre> tags does a pretty good job, but it's a lot more painful than using a real debugger :(

Edit: Hurp durp, XDebug has remote debugging. I've never actually used it...although I think I'll check it out soon.


I use an echoed statement or "or die" when using PHP; I just find it easier, as I am used to the error messages.

PHP has its own built-in warnings at parse time (for those unlucky enough to get PARSE ERROR as often as I do), which can be useful, although with loops the line it reports can sometimes be completely wrong, and is often a knock-on effect of another error.


I'm writing a web app right now with a node backend, and I prefer to use the debugger over print statements. Granted I do 9-10 hours of Java programming every day before I work on the web app, so I may just be more comfortable with the debugger than some other web devs.


Same situation as me then, mostly Java during the day with debugger and then after hours with PyCharm and Python.


I walk every line of code I write multiple times with a debugger. So the answer is, a lot. But I mostly work in an environment that makes debugging easy (VS).

Lack of good debugging support is a good reason to avoid a platform. Logging, screen writes, and dumps are awful ways to work.


Doing PHP web dev at the moment. I use Chrome and XDebug close to 100% of the time.

I've done Java, C#, VB6, VB.NET, and C++, and have used a debugger for every project.

Does anyone remember doing JS dev in the days before Firebug? Using alert()s for debugging... God, I'm glad web tools have gotten a lot better.


I use dbx and gdb almost every day to debug C/C++ code. We have executables that approach the OS file size limit. Debugging with print statements is a nightmare. We absolutely need conditional breakpoints, stack traces, and such.


I use printf a lot for embedded stuff over serial, as I don't have a full debug tool. I would use one if I could, though, since using printf to debug interrupt-driven code is a real pain and can mess things up.


As a Web Developer I'd be lost without Chrome Dev Tools right about now.

Edit: to more directly answer the question: when I'm programming, not an hour goes by without using it.


I also use a debugger daily, though I am doing mostly Objective-C programming.


JavaScript debuggers did not exist for most of the 2000s. It was a godsend when Mozilla provided the Error Console to let developers know when an error had occurred in JavaScript (in a nicer way than getting an alert for each error), but developers still needed print statements because code that worked in Firefox and Opera would break in Internet Explorer. When the first JavaScript debuggers came out, they were difficult to use and slow enough to hurt productivity more than they helped. Experienced developers might be continuing with their old work patterns, or may not be aware that better tools exist now. Safari has a good JavaScript debugger, and I use it.

On the server side, print and var_dump statements will give the developer the desired information in a one-second refresh of the browser. A debugger needs to be run against an independent server instance, which can take time to configure and start up. If the development environment is not already set up for this, running a debugger can take longer than identifying and fixing the bug with print statements.


The JavaScript console in WebKit browsers (Chrome) is a godsend for me, because if you have a jQuery bug, nothing at all works; so instead of being able to do one large update, you have to work in small ones or spend ages debugging.

Of course, you could just write correct code, but no one is perfect and mistakes do happen. It would be better if the errors it found were displayed in clearer English instead of generic error names, but it's still better than looking over line after line trying to find that darn missing semicolon.



