Is it ever ok to delay writing tests?

Joel Clermont (00:00):
Welcome to No Compromises, a peek into the mind of two old web devs who have seen some things. This is Joel.

Aaron Saray (00:08):
And this is Aaron.

Joel Clermont (00:15):
What most people don't know is that this podcast is actually an opportunity not to share knowledge, but for you and me to figure out what we think on things.

Aaron Saray (00:26):
They always say, "If you want to learn something, you should teach it."

Joel Clermont (00:30):
That's right.

Aaron Saray (00:30):
It's the same sort of thing.

Joel Clermont (00:32):
But no, we've definitely had the experience where we came into something with some vague ideas and then by the end of the discussion it's like, "Okay. Yeah, it's solidified a little bit." Anyways, this one will be a little more freeform but around the topic of testing. And more so when to test and why sometimes you might make what seems like a bad decision to delay doing some testing. Because this is a decision, full confession, we made on a recent project. We are adults, we did come back and fill out the tests but we had some reasons for doing it. So I thought it might be an interesting thing to discuss on the podcast today.

Aaron Saray (01:11):
So to explain what you sort of did. We had a brand new project: a rewrite of an old project, you know, making some enhancements, making things better. But there's a significant amount of testing that happens throughout that entire process, whether it's unit testing, feature and user acceptance testing, all these different things, because it's a brand new everything. And when you get contracted to rewrite something, the stuff that was just okay for the old version, that was fine back then, that excuse doesn't apply anymore. So it has to be tested all the way through.

Joel Clermont (01:45):
Yeah. And the original version had no tests. It wasn't like we were throwing away tests from the old system either.

Aaron Saray (01:53):
Well, and when I say tests here... You know, we're using the same word over and over. But I mean fully tested as in the users need to agree on every single piece of functionality. Because sometimes people feel trapped by their old system, you can only make it so much better. Where a brand new one, well, you could do whatever you want as long as you're paying the bill, right?

Joel Clermont (02:11):
Mm-hmm (affirmative). Good point.

Aaron Saray (02:13):
So in this particular case, we were working with a client and we were trying to keep up some momentum of programming. And one of the things about unit tests and stuff like that is, unless you're very, very, very diligent and you only do test-driven development or whatever, it can sometimes feel like a little bit of extra work. Now, we've talked many, many times about the value of them and how we can't do a project without them, but it doesn't mean that it doesn't feel like a chore when you're doing it, right?

Joel Clermont (02:39):
It can, yeah.

Aaron Saray (02:41):
So we were working on this large project and one of the things I remember asking you was like, "I think I'm going to put off the unit tests till two-thirds through, or even further, for some of these things, because I kind of want to get these out the door and work through this and whatnot." And there was a particular reason why we did that. The reason was basically that we had so much user testing that was necessary, and we had so many unknowns about the project even though we wrote up some wireframes and some documentation, that some of these things weren't worth testing unless we actually had the user say, "This is the functionality I expected you to make, just in general." Whereas depending on the client and depending on... Some of them can visualize stuff, some of them need to have it physically in front of them. No matter what you do, they still won't be able to give you the proper feedback until they can touch and play with the thing.

Joel Clermont (03:37):
Yeah, I think that is probably the crux of the decision here. Because you had started out talking about momentum and time, but in reality we were making some improvements, from our perspective, to the design of a feature or to a workflow, some of the more common workflows in the system. But we didn't necessarily have buy-in, even though we made that wireframe and we talked about it in a meeting like you're saying. Sometimes it just doesn't click in the brain, so why do all this extra work of testing when the answer might be, "Oh, that's not going to work. We've got to redesign this and do it differently."

Aaron Saray (04:18):
Yeah. And as an aside, a little tangent: as a junior developer or a new developer, sometimes you get this fear of... not a fear, but just an irritation or whatever. The client isn't getting it, you just want to get off the call, you just want to... I see people fall into this all the time, like, "As long as the client didn't say no, they must have said yes." And that's not always the case. Sometimes the client is like, "This is just so out of my area, I'm feeling really anxious. I'm going to 'trust you,' but really I haven't agreed to anything, actually. I don't feel like I have." And you can tell what that is, you can tell that feeling, I think, if you really pay attention.

Aaron Saray (04:54):
But it's just an irritating feeling as a technical person. So too many people just try to move on with it, where we try not to have those situations. But if we are in a situation, and this particular client was a little bit more like that, we try to get something in front of them sooner rather than later so they can communicate to us in the way that they communicate.

Joel Clermont (05:13):
And if you can step out of that frustration for a minute, because I felt it. I could totally relate to that. It's like, "Ah, fine. I'll just do what I think is best and it'll be fine." Would you rather have that frustration and that surprise as a developer at the beginning of the project or at the end of the project? Like, now you're ready to ship it and then they finally get it. That's way, way worse. So I agree with you. Kind of suck it up, get through that frustration, try to actually make sure everyone's in agreement. Good tangent, I like that one.

Aaron Saray (05:46):
That was one of our decisions then, like, "We recognize this particular client and setup is in this manner." I said to you, Joel, "Do you have a problem with me skipping the tests, knowing that I'm going to come back to them and work on them, and you might have to participate in some of them too?" And you're like, "That's fine, that makes sense." So we did that, we wrote that. Now, I still think it was maybe the best solution for this particular project, but it's rarely ever the right solution for a project.

Joel Clermont (06:11):
Sure, yeah.

Aaron Saray (06:12):
And even though I still feel like it's the best, I feel it's... I'm 70% certain it was the best. There's a very large portion of me that even though all the facts and everything are pointing that this was how we needed to do this, I still feel like maybe I did it wrong.

Joel Clermont (06:27):
Yeah, I can relate to that. Because it is so counter to how we normally work. Like, neither one of us strictly practice TDD, where you write a failing test before you write a line of code. But we do strongly believe in testing and I think just kind of working against that ingrained habit and practice, even when you had some logical reasons to do it. Yeah, it did feel a little bit like swimming upstream or doing something forbidden or wrong.

Aaron Saray (06:52):
So just to kind of wrap this up. You kind of started out saying that we don't always know the answer to what we're talking about when we start talking. And that's kind of what I realized when I was explaining this as well. When I first came to you with the reasons why I needed to skip testing, I hadn't fully formulated that it was an audience-related thing. I mentioned momentum and things like that.

Aaron Saray (07:18):
But that isn't an excuse. It was solely the audience, I think, and I just didn't know how to classify that. But once I had to explain to you what I was really thinking... Because when you work with someone, too, you end up having a shorthand, and so you can say something and be reasonably certain that person has thought it through. You know, when I had to explain it longhand here, it makes more sense that it was really just... it had to do with the client themselves and how they wanted to work. And one last thing here: it doesn't mean that client is bad, it's just a different type of client.

Joel Clermont (07:48):
Sure, yeah.

Aaron Saray (07:48):
And that's just a different way to communicate. And as developers, like I always say, "We're in service." We build things for people, those people are important and sometimes the way they communicate is different than how we do. And if it wasn't, then we wouldn't have jobs.

Joel Clermont (08:01):
That's an excellent point. I think also... I know you're wrapping it up, but in our defense too-

Aaron Saray (08:09):
Yeah, I thought I'd wrapped it up. What are you doing?

Joel Clermont (08:10):
No, I'm unwrapping it and I'm going to rewrap.

Aaron Saray (08:12):
You're trying to get the last word.

Joel Clermont (08:14):
No. But I just want to clarify what we actually did. First of all, you did write a bunch of tests; this decision was made partway through the project. And then there were quite a few to-do tests, and we did those. And then the last thing, which I think wasn't necessarily part of the original plan but was how it ended up being executed, is I did come in and fill in some of those tests on features you had written. And it actually was kind of an interesting exercise, because we always do code review, so maybe I had already reviewed the code you wrote. But you look at it from a different angle when you're coming in to write tests for it. So it was sort of an interesting exercise in terms of knowledge sharing and kind of spreading the knowledge of the system. And even maybe more rigorous testing, having a different person doing it. I don't know, these are just some random ideas. And now you have to try to wrap up that random tangent that I threw at you at the end.

Aaron Saray (09:10):
Good job, Joel.
The other morning I was in the car driving somewhere and listening to the radio, which I try to never do. But I thought about how difficult or how weird it would be to explain to a 15th century musician what the job of a pop radio DJ is.

Joel Clermont (09:35):
Why would you pick 15th century? I just got to-

Aaron Saray (09:36):
I don't know. I was just thinking back in the day when the musicians were good at their craft but they could barely have enough money to survive, and maybe they would go to a king's court. And this idea that maybe I've just watched The Witcher too many times and I have an idea that there are just traveling musical people.

Aaron Saray (09:54):
But the point is, it's kind of a hard job. You know, they're talented, so you try to explain to them, "Okay. Well, there's this radio DJ involved with music and they get paid pretty well." And they're like, "Oh, so they play an instrument like I do?" "Oh. Well, no, they don't. They don't play an instrument." "Oh, okay. Well, so you have your music on... I think, I don't understand, a record or whatever it is. So they must put those on the thing like we used to do a couple years ago?" And like, "Well, no. These days they don't actually spin records. They don't have to touch the music." "Oh. So they at least curate and pick a nice list of music?" And you're like, "Well, some do. But for the majority, these are bought by the record companies for space on there." And they're like, "Okay. Well, so what do they do?" "Well, sometimes they just talk and people call in on a telephone and they might run little scenarios." Like, "My wife sees the bird's nest up..." This is a real scenario. She "sees a bird's nest by our security camera." And the husband said, "I wanted to knock it down." And the question for today is, is the husband a jerk? And people were calling in saying whether or not he was a jerk for not wanting to have a bird's nest covering up the security cam. And you finish explaining all this, too. You look at that musician who has really mangled hands from playing all the time. Calluses everywhere. You say, "That's a job." And they're like, "What?"

Joel Clermont (11:15):
Yeah, that would be a difficult conversation. When you went with the 15th century, I was also thinking of another angle. What if it was one of these classical musicians that... I'm trying to think of that guy. There's that one song that's the root chords of so many pop songs, and be like-

Aaron Saray (11:32):
Oh, Canon in D?

Joel Clermont (11:34):
That's right, yeah. "And, you know, half the songs rip off your chord progression."

Aaron Saray (11:38):
Right. "Remember how you spent so much time learning, and you got orchestras together and whatever? Now we hold two fingers on two strings, we hit it really hard, we run it through an amp really loud. And that's a power chord on a guitar. Basically it's just static, and that's music."

Joel Clermont (11:53):
That's right.

Aaron Saray (11:58):
Joel and I are proud of you. You're on your journey from being a competent programmer to becoming a confident programmer. And you do that by listening to this podcast, reading our content, anything like that.

Joel Clermont (12:08):
And we'd like you to get more of our content, we want you to be more confident. Head to our website and sign up for our newsletter. We have a link on the homepage.

No Compromises, LLC