On TDD

Roy Osherove has been posting a bit about testing, OOP, TDD, and the like.  You can go to his post and find tons of comments, links and so forth.  Because of all the different interpretations people have put forth, it’s hard to summarize the discussion without prejudicing it.

But what the hell, it’s my blog, so here’s a thumbnail sketch:  the adoption of unit testing is hindered by it being tied to TDD, design considerations, and confusing terminology (“a mock?  a mock what?”). 

A very good post by Udi Dahan takes a pragmatic stance about the whole issue, but contains two things that I want to comment on.

The first is this:

“In a well designed system, most ‘logic’ will be contained in two ‘layers’ - the controllers and the domain model. These classes should be independent of any and all technological concerns. You can and should get high unit test coverage on classes in these layers…Most other layers have some dependence on technology that makes unit tests relatively less valuable. Coverage on these layers is mostly meaningless. My guidance is to take the effort that would have been spent on unit testing these other layers and invest it all in automated integration tests.”

The second is a comment by Casey that Udi agrees with:

“I think, and hope, what you are saying is any code that does not add *business* value is of low value, and tests that have no clear purpose, or that further concrete an already weak design, will ultimately decrease business value.”

I agree with both of these, but in my own special way.

 

In almost any business environment (I can think of a lot of other systems/environments where the following isn’t true…a health diagnostic system for instance), software exists primarily to deliver business value.  Or at least, it should.  One of my strongest gripes with Alt.NET is that while I think just about everyone would give lip service agreement to this notion, it is quite often de-emphasized, and the focus is placed on ‘reducing friction’ or ‘increasing maintainability.'  And clearly, if you do those things, you increase business value, right?

Not so fast.  Notions like ‘friction’ and ‘maintainability’ are relative, usually to the developer in question.  Various people have blogged in great and painful detail about what reduces friction or increases maintainability, but it is often clear that what they advocate would reduce friction and increase maintainability *for them*, while doing the reverse for most everyone else.  Since this post is about TDD, I’ll use that as an example in a minute, but just to throw out another example:  anyone who is advocating ‘deprecating the database.’  It isn’t that there is necessarily a *technical* argument against it (though I think there could be), but there are so many other considerations that go into software development that the technical merits or demerits of software design are almost always a very minor aspect (I’m betraying my roots in operations/deployment/production support here).  There is almost no environment where ‘deprecating the database’ is even a possible solution.

side note:  I’ve made the following point in many different ways, and in many different places, but I think it right to make it again.  In large part, I 'follow’ Alt.NET (even helped to create the Chicago Alt.NET user group, not sure how that happened…think there was drinking involved) out of laziness and greed.  I am trying to ‘shortcut’ my way out of learning many techniques through experience, because learning through experience is usually painful, and hurts someone else (usually a business/client).  You can’t completely do this, obviously, and I know that, but whether it is learning how to implement IValidator, IMapper<Domain Object, DTO> or other techniques that I’ve ‘stolen’ (if you can’t tell, I just spent 15 seconds looking at one of my code bases), I hope to be able to avoid learning through the mistakes, and just learn from the end results of developers who I already know are better than I am.  Developers will be developers, so there will always be some numb-nut advocating a technically stupid design under the Alt.NET rubric, but in general, if you want to learn how to be a technically better developer, just read the Alt.NET blogosphere.  And if you don’t know what that is or what counts, look it up.  Google is your friend.

 

TDD is one of those techniques that has its fair share of evangelists/advocates, and that can decrease business value if done incorrectly.  On one of my code bases, I am forcing myself to use it as stringently as possible.  In almost every client situation I have come across it, it has been implemented poorly (and in the obvious case I can think of where it wasn’t, it was because of the single-minded determination and/or skill of the developer implementing it).  Like agile advocates, TDD advocates seem to be painfully addicted to confirmation bias (“I did it myself once and it worked great!!!!”), but that, in and of itself, has more to do with advocacy than TDD.  But it is pretty clear that doing TDD ‘the right way’ requires a lot of training, an eye of newt, and a lot of luck and/or skill.  In and of itself, this makes me skeptical of it, because any methodology that requires near-perfection in its implementation is essentially doomed to failure in the long run.

BUT, if it provides business value, which it can, you should use it.  I like very much what Udi said about layers that have dependencies on technology (I will expand this to include ‘protocols’ in a second) and what Casey said.  I’ve long advocated (yes, using that word on purpose) integration tests over unit tests (since there is always a limit on time and effort, if you have to limit what you can test, test the code that is actually in production.  Not mocks, not stubs, your production code.  If you have to run, e.g., WatiN tests, suck it up and do it), because of the ‘business value’ position.  No one in the business will generally give a crap about the latest developer ‘fad’ (they generally neither know nor care what counts as true progress versus fad, since they can’t judge it), but a set of tests that catch actual bugs in production code, before it actually gets to production, that usually gets people’s attention (if you are really good, your non-integration, TDD tests will give you the same, if not better, results…in theory, see side note).

How can you tell if you are providing business value or not?  That’s hard to say.  But, I will offer the following thought experiment (though it is based on a real-life example) as a guide:

Suppose you need to write code that will use FTP to go out to a site and download a file.  This is a typical requirement in almost every single business shop in the world.  If you immediately thought of creating an IFTPService interface, you have problems, and are probably part of the problem.  The FTP protocol is not going to be re-designed and neither should your FTP code.  Once it is built, it is done.  “But what about testing how the code handles different response codes from FTP?”  Set up a local FTP site that does whatever you want it to do, and create integration tests.  If you think an IFTPService interface is a good idea, not only are you wasting people’s time, but you are losing the semantic argument.  If you already have a TDD and/or top-heavy unit testing organization in place, then creating stupid interfaces like this is potentially okay because you can write the ‘extra’ code in a few minutes, but any seasoned developer is going to (rightfully) laugh you out of the arena if you think an IFTPService is a good idea.  Which will kill any chance you have of getting TDD in where and when it matters and can supply business value.
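To make the thought experiment concrete, here is a minimal sketch, in Python rather than .NET since the point is language-neutral (`download_file` and `should_retry` are names I made up for illustration; `ftplib` is in the Python standard library).  The download code is just the protocol, and the "different response codes" question is answered by integration tests against a local FTP site plus a tiny, directly testable helper:

```python
from ftplib import FTP

def download_file(host, user, password, remote_path, local_path):
    """Plain FTP download: no IFTPService interface, just the protocol."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "wb") as f:
            ftp.retrbinary(f"RETR {remote_path}", f.write)

def should_retry(reply_code: str) -> bool:
    """Per RFC 959, 4yz replies are transient failures and 5yz are
    permanent, so only the former are worth retrying."""
    return reply_code.startswith("4")

# The integration tests then run download_file against a local FTP site
# configured to return whatever reply codes you want to exercise.
```

The point of the sketch: once the protocol code is written, it is done, and the only tests worth having are the ones that exercise the code actually going to production.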

 

Business value, good.  Useless tests written because something you read somewhere said you needed to have 100% code coverage, bad.

BDD, really bad.  But that’s another post.

posted on Wednesday, October 01, 2008 8:26 PM
Comments
# re: On TDD
rscot231
10/2/2008 10:09 AM
Jon,
Are you referring to my "single-minded determination"? Just curious.

Ideally, I don't want to be interfacing with FTP at all. I'd rather have my scheduling app handle that and assume a dependency on the file being present. The scheduling app (pick one) is going to have more and better monitoring and error management than I could bake in. Picking up at the point where the file is present allows you to dive right into "adding business value".

Operating under that assumption, I disagree that if you need to interact with FTP from business code then writing IFTPService is "part of the problem". If I have to interact with a "protocol" like FTP from "business" code, I will attempt to limit my interaction with it and test that interaction. I'll treat it like a service. I'm not testing that PUT is implemented correctly - I'm testing that I'm PUT-ing. Like you said - FTP is pretty well nailed. No reason to test the implementation. If that involves writing IFTPService, so be it.
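rscot231's "I'm testing that I'm PUT-ing" approach can be sketched in a few lines; Python stands in for .NET here, and `ReportPublisher` and its `put` call are hypothetical names for illustration.  The test verifies the interaction with the thin service, not the FTP implementation:

```python
from unittest.mock import Mock

class ReportPublisher:
    """Business code that treats FTP as a thin service it talks to."""
    def __init__(self, ftp_service):
        self._ftp = ftp_service

    def publish(self, name, payload):
        # The behavior under test is *that* we upload, not how FTP works.
        self._ftp.put(f"/reports/{name}", payload)

# Interaction test: assert we PUT, without touching a real server.
ftp = Mock()
ReportPublisher(ftp).publish("daily.csv", b"a,b\n1,2\n")
ftp.put.assert_called_once_with("/reports/daily.csv", b"a,b\n1,2\n")
```

Whether this wrapper earns its keep is exactly the disagreement in the thread; the sketch only shows what "test the interaction, not the protocol" looks like mechanically.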
# re: On TDD
Victor Kornov
10/2/2008 3:15 PM
I'm sure you know all ins & outs of BDD. Looking forward for your post on it.

You didn't mention that value of TDD is not testing. TDD is a design technique.

Most people say you will benefit from unit testing interesting bits, not 100% of the code.

"neither should your FTP code" - maybe it shouldn't, but in reality it does get changed. Bugs, more features. You advocate business value. That implies YAGNI & need-driven development for me, i.e. you implement only what is needed to provide value right now. Then, you need to add more features & code. Suddenly, you are in code&pry mode, hoping you didn't break anything.

Integration testing is good, but its downside is a longer feedback cycle (not to mention lack of isolation, i.e. you don't really know where exactly it fails). Unit tests are fast. Remember your manager wanting to know right away when anything goes wrong? Why shouldn't you control your code in the same manner?

"any methodology that requires near-perfection in its implementation is essentially doomed to failure in the long run" - Unit testing requires learning. Just like anything else. Coding requires a certain degree of proficiency. It's surely doomed in the long run if we don't work on improving the process.
# re: On TDD
jdn
10/3/2008 5:30 PM
@rscot231

Jon was someone else, and yes.

"I'd rather have my scheduling app handle that and assume a dependency on the file being present."

It's hard to say completely in the abstract, but not sure I agree. I can write an FTP console app that takes in the relevant info, outputs everything from the console to a log file, and be done with it, in about 15 minutes. Depending on the environment, I will have to tie into a scheduling app, but that's usually not a lot of additional work (usually involving writing error info to the console instead of 'handling it' and running something like grep against the log).
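The 15-minute console app described above might look something like this; a hedged Python sketch (the flags, log name, and function names are my own, not from any real codebase).  Errors go to the console so the scheduler sees the failure, and everything else goes to a log file you can grep:

```python
import argparse
import logging
import sys
from ftplib import FTP, all_errors

def build_parser():
    # Takes in the relevant info on the command line.
    p = argparse.ArgumentParser(description="Fetch one file over FTP.")
    p.add_argument("host")
    p.add_argument("remote_path")
    p.add_argument("local_path")
    p.add_argument("--user", default="anonymous")
    p.add_argument("--password", default="")
    p.add_argument("--log", default="ftp_fetch.log")
    return p

def main(argv):
    args = build_parser().parse_args(argv)
    logging.basicConfig(filename=args.log, level=logging.INFO)
    try:
        with FTP(args.host) as ftp:
            ftp.login(args.user, args.password)
            with open(args.local_path, "wb") as f:
                ftp.retrbinary(f"RETR {args.remote_path}", f.write)
        logging.info("downloaded %s from %s", args.remote_path, args.host)
        return 0
    except all_errors as e:
        # Error info to the console (for the scheduler), details to the log.
        print(f"FTP fetch failed: {e}", file=sys.stderr)
        logging.exception("fetch failed")
        return 1

if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
```

A nonzero exit code is what most scheduling apps key their alerting on, which is the "writing error info to the console instead of 'handling it'" part.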

I don't even see the need to test PUT-ing. Unless one is an idiot (or even when he is), the console app is going to do that, and once you know that, you are done.
# re: On TDD
jdn
10/3/2008 5:41 PM
@Victor

"TDD is a design technique"

Absolutely, but one of the things I like about Udi's post is that he recognizes that it has a *cost* (few explicitly address this), and if you only have a vague 'it will pay off in the long run' argument to address it, you aren't really doing much but taking something on faith. Which I'm not against, but many in the TDD crowd simply assume the payoff will outweigh the costs and I don't think that is an accurate assumption in some instances.

"Most people say you will benefit from unit testing interesting bits, not 100% of the code"

Some argue you aren't doing TDD right if you don't have 100% coverage (D. Bailey's response to J. Bogard's recent post about code coverage at LosTechies takes this position). I wouldn't know if it was most, some, or a few.

"Integration testing is good, but it's downside is longer feedback cycle."

Yes, this is the argument everyone puts out there and, in comparison to immediate feedback on simple unit tests that don't hit external dependencies, it is obviously true. *But* if your simple unit tests don't sync with what happens in production, the tests are useless (and this does happen more than it should).

And I think people either forget or never knew what it was like to develop with absolutely no testing, other than manual, interactive testing (I program mainly with web applications, so think of clicking in a browser).

I have a current client where it takes literally 10 minutes to build the application and get to the login screen (we've timed it....I have a 3 year old machine, consultant penalty), then another 3-5 minutes to get to the page you want to test, and then step through the debugger (and you can see it pause between lines of code).

(Even Semi-)Automated Integration tests would be *vastly* superior to this. As good as running an NUnit test suite of 200 tests hitting another part of the codebase in 20 seconds (even on a 3 year old machine)? Of course not.

But irrelevant.

"It's sure doomed in the long run if we won't work on improving the process."

You should always work on improving the processes you choose to adopt, but I'm arguing against adopting TDD, unless you have a clear reason to think it will produce *real* business value other than vague guesswork.
# re: On TDD
rscot231
10/6/2008 8:49 AM
@Victor:

"I'm sure you know all ins & outs of BDD. Looking forward for your post on it."

LOLOLOLOLOLOL

@JoHn:

Sorry for the misspell.
# re: On TDD
Victor Kornov
10/6/2008 1:10 PM
@jdn:
"Absolutely, but one of the thing I like about Udi's post is that he recognizes that it has a *cost* (few explicitly address this), and if you only have a vague 'it will pay off in the long run' argument to address it, you aren't really doing much but taking something on faith".

And what's the cost of developing other way, i.e. no TDD? And what's the cost-to-value ratio, how it scales. Questions. Any arguments? :) I kinda have one: Code Complete 2, it has facts & numbers for unit testing. Not sure about TDD, we could search around the Internet, I'm sure I saw a scientific work on the matter...

P.S. I'd like to mention I'm not a TDD zealot.
# re: On TDD
Victor Kornov
10/6/2008 3:51 PM
I have a metaphor: your doctor gives you a prescription with several items, all needed to cure you. You buy only the one which gives the most bang-for-the-$, in your opinion. If you're strong enough you'll be OK, e.g. cough only 5 times an hour instead of 50. If you had bought all the drugs, you would be healthy. Now, what's your decision?

P.S. TDD has more to it than writing tests before code. Miss/leave aside something and it shows.
# re: On TDD
jdn
10/6/2008 6:16 PM
@Victor

Appreciate the continued feedback.

"What's the cost of developing other way, i.e. no TDD?"

BTW, are you referring to chapter 22 in Code Complete 2? Want to go look it up and refresh my memory, thanks.

You can't really do double-blind tests or have control groups, so it is always 'unscientific' to some extent when studying software methodologies. It isn't like there is *no* empirical evidence, clearly there is, but it is tenuous (like that 85% figure that's thrown out about the number of software projects that fail, IIRC, it comes from a Scandinavian study of a particular year...not that there is anything wrong with being Scandinavian, but it ain't exactly the burning bush, IYKWIM).

Having said that, TDD will increase the size of your code base (even with 100% code coverage, I don't think it would double, but it would increase by a not-insignificant amount), so by definition, there is the cost of maintaining this *and* keeping it in sync with the non-test code (since code can change without test code failing).

Moreover, as my blog post states, I want to minimize the amount of painful learning from experience by trying to learn from the 'experts' (where I have to guess at whether someone really is an expert or is just another schmuck with a blog like me), and almost all of them say that during the first x number of months (where x seems to vary quite a bit, but seems to be at least 6), they wrote lousy tests, and so one could argue that the overall code quality decreased.

You say "you will be healthy" and most of the people who practice TDD believe that this is the case, but most of the 'experts' are the ones we don't need to worry about in the first place. Of course *they* create better code in the long-run, I'm worried about the developers who will get discouraged in that first x number of months and give up. I'm worried about getting the business as a whole to support lowering code quality in the short run because of supposed benefits in the long run.

If you haven't seen something that came up in the news recently, think of the Hippocratic Oath: "Do no harm." Some people have argued that in medicine, while you shouldn't do anything that causes harm, it might sometimes be better to make a patient worse in the short term in order for them to be better in the long term. The problem is proving this is a good idea.

I think, as it stands right now, if TDD is a good thing, it is like that. And I think you need a really strong argument in many cases to undertake this.

"TDD has more to it that writing test before code. Miss/leave aside something and it shows."

I agree but that compounds the problem since not even the 'experts' agree on exactly what the 'more' is all the time.
# re: On TDD
Brian Johnston
10/6/2008 7:23 PM
I think where there is a real breakdown in communication is that a large percentage of the applications out there were not made in a service model design or some other design that can easily use DI/IoC for use with a testing framework such as RM (Rhino Mocks); they're built on the scholastic 5-tier design that may only have a few factory patterns that vaguely resemble constructor injection.

Given a group of developers - the cost and time to refactor legacy applications based upon what someone else tells them is a good idea (not having any tests for regression purposes) sounds like lunacy to them. I mean, come on, if I came in to where you had worked for years if not decades and said your program was poorly designed and you should be using a design pattern that goes against what you learned in 'your glory days', would you take that very well? Yes, that's a sign of not staying up to date - but that's a whole different 'people and management' issue.

In fact I was riding in my car talking to a Microsoft speaker just the other day about this, because they have run across similar situations in trying to get people to adopt TDD.

If you are experienced with DI/IoC, adopting TDD comes naturally. If you've lived your entire life in an IT department of a large enterprise that doesn't write software for a living, maintaining legacy applications and building new applications similar to them, it doesn't. Why? There's a cultural barrier there, and a large amount of friction can be created trying to change that.

Now everyone will agree that testing is good, but when the *only* option you give them is to refactor their code to DI/IoC/service-locator - then you're shutting the doors on a lot of people and pretty much preaching that TDD and design patterns around TDD are silver bullets - and anyone who has been in this business for 10 years knows fads come and go and there is no such thing as one-size-fits-all - so immediately that argument of 'everyone should be doing, or designing as' is discredited - because you're not everyone.

I know my team was totally against going around and putting in a crap load of interfaces and un-private/static-ing things just so we could use Rhino Mock - and you know what - I don't blame them - that's over a million lines of code to go through. And you know what - taking 6-12 months to refactor that would take a decision from someone at the near CXO level - it ain't gonna happen - it's a waste of money to the business - they know the value of testing - but they're also pragmatic and know there's more than 1 way to skin a cat and the way they've been testing so far (standard unit tests, GUI tests using tools such as eTester) has done just fine (i.e. good enough) for them.

So thus we introduce an alternative, something like Typemock that gives all the advantage you'd have using DI/IoC/service-locater - i.e. easily mock your objects - but not spend a fortune refactoring a large code base that is stable in production and now people say 'oh yeah that would be great'.

Learn to meet people half way - get them sold on testing first with a tool that doesn't force them to change their design habits, then work on the design habits - because contrary to what some of you may believe, TDD via IoC/DI/etc does not necessarily result in a better, more maintainable product - in fact I've seen, and colleagues have seen, plenty of cases where it was an absolute mess.
# re: On TDD
jdn
10/6/2008 7:45 PM
@Brian

I think your comment is a good explanation of why TypeMock is a good tool.
# re: On TDD
Victor Kornov
10/7/2008 4:10 AM
I see the resistance to TDD as "don't fix what isn't broken", i.e. you have already learned to write software. Good or bad (from a tech/maint. PoV). Now, you need to re-learn, just to be able to write software. Supposedly well, or better. Which you already can, and not many people will acknowledge their code is crap. Plus, re-learning is a big change, not just learning a bit to use some "3rd party drill down paging grid". This is a cultural/mental barrier.

Now, another metaphor. You like tennis. You hobby-play it, learn by experience how to hold a racket, make shots and other things. Then, you decide to go pay a coach to play better. They tell you you play like crap, you hold the racket wrong and your shots are bad. You can just turn around and go away. But it won't help you play any better. The coach knows better; they can teach you.

Sport benefits from long centuries of professional games. There are recognized "right" ways to do it. Software does not have the luxury of "proved by time" & "all the champs do it".
# re: On TDD
jdn
10/10/2008 7:28 PM
I'm going to cherry-pick the reference from 'Code Complete 2' that best supports my position, because this is my blog and so I can cheat like that, but also because it supports my position with something relevant.

Somewhere in Chapter 22 (don't have it in front of me), it states that 3 studies have shown that test code is as likely, if not more likely, to have bugs as the code it is testing.

This is exactly what I've talked about. If you need test code to prevent bugs in your code, but the test code is as likely, if not more so, to have bugs of its own, then you now have twice as much code with more than twice as many bugs.

So why have test code?
# re: On TDD
Victor Kornov
10/12/2008 9:52 PM
"Who watches the watchmen?" - That's one of the major software testing controversies. I'd say it's a major controversy in itself. So "why have test code?" or rather "why have a watchman?". Because it works. It has its flaws, but it works in other areas of human life.

You know (or have been told) what the benefits of testing are, you know what the disadvantages are. You know your foe.

You can apply many good software design principles to test code, or just plain find new design principles that work for test code. That raises quality overall.

Also, tests give you possibilities/opportunities which you don't have without tests. One of them is a fast response cycle, i.e. Fail Fast. Not to mention all the others.

The problem with test code is how to manage & control it, its pros/cons. Since test code is different from regular production code you need to learn it. Also, test code + production code together equal more than those things apart, i.e. they have some emergent qualities which need understanding too.

So, it all comes down to learning & mastering. Now, is that out of your comfort zone? Is it too much to master, to the level it's "doomed in the long run"?
# re: On TDD
jdn
10/13/2008 9:51 AM
"Because it works."

Says who? The same people who said Corba worked? Or who claim that SOA works?

There are a lot of people who make a lot of claims in software development about what works, and a lot of it is about stuff that really doesn't work that well.
# re: On TDD
Corey Grusden
10/14/2008 10:37 AM
Brian's comment said it best. There are no silver-bullets.

If you are in charge of a project and you have the choice from the beginning to test or not to test and you choose not to test... The most important thing in the world is on the line and it's not the invoice. It's your brand. It will make or break you.

After practicing TDD for most of the applications I've worked on in the past, I can say that it *is* an art. The statement about you preferring integration tests over unit tests is OK; as long as you're testing, it's a start. If you are preferring to write integration tests vs. unit tests, I have an assumption that you may not be doing as much as possible to put more logic into your Models.

The way that I tackle new features, bug fixes, or refactorings is that I look at what I can put *into* the Model -- test the shit out of it since you have ONLY the Model to test (almost no setup required unlike integration tests). Then I move onto the integration test with a few lines of test code and I'm done. If most of your logic is in your Model and is tested, the integration tests aren't very big or take much in the way of setup.
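Corey's "push logic into the Model" tactic in one small example (Python here; `Invoice` is a made-up model, not from any real codebase): the model needs no setup at all to test, so the unit tests carry the logic and the integration tests can stay thin.

```python
class Invoice:
    """A plain domain model: all the interesting logic, zero infrastructure."""
    def __init__(self, line_items, tax_rate):
        self.line_items = line_items  # list of (quantity, unit_price)
        self.tax_rate = tax_rate

    def subtotal(self):
        return sum(qty * price for qty, price in self.line_items)

    def total(self):
        # Round to cents after applying tax.
        return round(self.subtotal() * (1 + self.tax_rate), 2)

# Model-level tests need no database, no web server, no setup at all.
inv = Invoice([(2, 10.0), (1, 5.0)], tax_rate=0.10)
assert inv.subtotal() == 25.0
assert inv.total() == 27.5
```

The integration test then only has to verify that the surrounding layers call `total()` and persist the result, which takes a few lines rather than a big fixture.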

The reason why most people dislike testing is that they don't know how to do it and that's fine, it's a different mindset and skillset. However, the majority don't like it due to the shitty "design" left over from "architects" that "thought" they knew what they were doing. When the rubber hits the pavement is when you find out if your design is good or not. TDD is the way to do that ASAP.

So to those that complain about testing and have yet to really dig in and write tests for a large chunk of their current application: you have zero room to talk in my eyes. Not because you're unwilling to try something new, but because you aren't as professional as you say you are. I don't care if you are an MVP or whatever it is Microsoft hands out these days.
# re: On TDD
jdn
10/14/2008 8:02 PM
@Corey

I appreciate your feedback, but find your overall message unclear (stating upfront that I can be a bit slow, so bear with me).

You state (and I agree):

"There are no silver-bullets"

but seem to suggest TDD and/or testing is such a thing.

"you choose not to test... The most important thing in the world is on the line"

So, do you think testing is the silver bullet that will make or break you?

"If you are prefering to write integration tests vs. unit tests, I have a assumption that you may not be doing as much as possible to put more logic into your Models"

Well, for anything that started years ago, I would have to agree, but, you are missing (I think) one of the key points I was trying to make.

Given an infinite set of monkeys with an infinite set of typewriters, you would want unit tests, integration tests, and any other type of tests one would like.

But, since we work in a world with a finite set of monkeys and typewriters, my preference for integration tests is that those tests actually test your production code.

Unit tests can pass and yet your code can fail in production. This is unfortunate but true. So, if you have a limited set of resources, test what matters in production.

"...when you find out if your design is good or not. TDD is the way to do that ASAP."

If you are designing an API, yes. Otherwise, 'design' encompasses much, much more than what you write in TDD tests.

BDD tries to get beyond the developer-centric tests with higher-level specification tests, but right now it is so stuck in bad syntax and inconsistent goals, that it is not in a good state.

Note: this isn't to say you can't do BDD and deliver quality. You can do wild 'cowboy coding' and deliver quality. The question is whether BDD gives you a better chance of doing it. I doubt that right now.

"So to those that complain about testing and have yet to really dig in and write tests for a large chunk of their current application you have zero room to talk in my eyes"

Why? I will never write Corba code, and I don't need to, to know that there's something wrong with the basic idea.

"it's because they aren't as professional as they say they are. "

It sounds like you have a target in mind.

One of the saddest things about alt.net is the insistence on calling into question the motives of those that disagree, while refusing to name names.

So, who are you accusing, or are you another Miller/Bellware type that likes to insult the Microsoft MVP program without laying it on the line?