Joel Clermont (00:00):
Welcome to No Compromises, a peek into the mind of two old web devs who have seen some things. This is Joel.

Aaron Saray (00:08):
And this is Aaron. Hey Joel, do you want to talk about the most exciting thing there is to talk about in software development?

Joel Clermont (00:20):
Always.

Aaron Saray (00:22):
Let's talk a little bit about unit testing. Depending on who you are, some people are thinking, "Oh, this is really exciting, we're going to talk about unit testing." And their name is Aaron. Then everyone else is probably like, "Oh, not this topic." But I thought we'd bring it up because there are some interesting things we've run into over our careers where unit testing has really saved our bacon. The first thing I want to do is set expectations: why do we even unit test? One of the reasons I unit test, and I joke about this a lot, is so I can go to sleep at night.
When you're deploying software and you have a number of tests that cover all the different paths through your software and they all pass, it's a much less stressful deploy, because you know that everything you've planned for is probably going to happen, so it's much easier to push it out there. Another reason I unit test: it's sometimes faster to run your code through a unit test as you're developing it than it is to load something in the UI. Think about a screen where you have to fill in a bunch of different choices.
I want this on my pizza, I want that on my pizza, I want to add some breadsticks. You go to checkout, and as you're checking out, you process your Stripe connection or whatever. There's a lot to do every time you go through that: you have to check everything, refresh the screen, wait for things to load, all that kind of stuff. If you were writing a unit test, maybe testing out different ways you can call Stripe, you can just pass in the data you would have gotten from the user after it's been 'validated'.

Joel Clermont (02:05):
Sure, yeah.

Aaron Saray (02:05):
And run that test over and over, to see that your code is actually doing what it needs to. What kind of value do you find in unit testing?

Joel Clermont (02:14):
I agree with both those points. I'll build a little bit on the first one, because you were talking about confidence that it works the way you think it does. But I would say it's also confidence that it will forever continue to work the way you think it does, or the way it does right now. Whether you later change some of your own code, or you update a library, or you update the version of Laravel, or any number of different things, those tests will still make me feel confident. Like, "Yes, at least I didn't break that other thing seemingly four steps removed from what I changed, which it actually could have broken." You know?

Aaron Saray (02:54):
Right.

Joel Clermont (02:54):
And having tests that catch that stuff periodically reminds me, "Yes, this is a big reason I invest time in doing it, because it has caught things that I would not have expected."

Aaron Saray (03:05):
Well, I think you make a good point there too. Because I'm going to ask everyone who has written code that has never had a bug in it to raise their hand-

Joel Clermont (03:12):
Sure.

Aaron Saray (03:12):
And I can't see any hands here. The issue is we sort of have an implied trust that the packages we bring in are going to be bug-free, but you can't guarantee that. If you're going to update a minor version or a patch version, it's really nice that you can do that. You want to keep current, maybe they'll patch a security hole, so you want to update those. But then you can run all your tests, all the different checkout types for your pizza, all the different types of pizza, to make sure that a change in someone else's code, which you've probably not had a chance to audit, is not going to cause an issue in your application. Because that's the hard thing to explain to the boss too, right? Or to the client: "Well, I updated someone else's code in my project and now your product doesn't work, so it's not really my fault." Except it is your fault.

Joel Clermont (04:03):
Yeah, that would not go over very well. Yeah, the value, I think we can agree on the value. And maybe even for somebody that hasn't tested yet, they can conceptually agree with the value even if they haven't experienced it yet. But when it gets down to writing the tests, how do you tackle those? What's your approach to what sorts of tests do you write? What sorts of things do you cover? What's your strategy?

Aaron Saray (04:29):
Yeah. I mean, that's a huge topic and maybe we'll talk about it in some other podcasts too, but I can focus on one piece that I think is particularly interesting: when we do a feature test, we're testing user input and then testing the effects on the output. The idea of a feature test or endpoint test, or whatever you want to call it, is not that we're testing the browser. We're testing that we send some data in, or interact with the product in a certain way, and then... We don't really care how it happened, we care what happened, right?

Joel Clermont (05:11):
Right.

Aaron Saray (05:12):
We'll test maybe the return of that, true or false, or some JSON or whatever. We also want to test, "Did this model get created? Did it pick the right model in my query?" All these different things, by validating the before and after state a little bit. I think if you're going to do any sort of testing, just having that one endpoint input-output sort of test is good. I think the other thing is that it opens up this ability to test all the silly things that users might do.
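As a rough sketch of the endpoint-level, input-output style of test Aaron describes, here's what it might look like as a Laravel feature test. The `/orders` route, fields, and table name are all hypothetical, not from an actual project:

```php
<?php
// Hypothetical Laravel feature test: send input to an endpoint,
// then assert only on the observable output and the "after" state.

namespace Tests\Feature;

use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class PlaceOrderTest extends TestCase
{
    use RefreshDatabase;

    public function test_a_valid_order_is_persisted(): void
    {
        // Before state: nothing in the table yet.
        $this->assertDatabaseCount('orders', 0);

        // Act: the same data a user would submit from the UI.
        $response = $this->postJson('/orders', [
            'toppings'    => ['pepperoni', 'mushroom'],
            'breadsticks' => true,
        ]);

        // We don't care how it happened, only what happened.
        $response->assertCreated();
        $this->assertDatabaseCount('orders', 1);
    }
}
```

This runs in seconds, over and over, without clicking through the pizza screen each time.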

Joel Clermont (05:46):
Never. You mean users? They always have the exact same idea you did when you built the form, and they always use it the same way. There would never be somebody making a mistake, or typing a book into a field just to see what happens.

Aaron Saray (06:02):
Yeah. Or just holding down the keyboard by accident. Suddenly you're filling out a form, something happens, you slam your hand down, type 257 As into the field, and then hit submit.

Joel Clermont (06:17):
Your cat lays down on the keyboard.

Aaron Saray (06:19):
That's probably a better example.

Joel Clermont (06:21):
If you have a cat. I mean-

Aaron Saray (06:23):
Right.

Joel Clermont (06:23):
... they do love sitting on keyboards.

Aaron Saray (06:26):
Where did this cat come from? Why is it in my house?

Joel Clermont (06:30):
That's a bigger question. Yeah, so the first sort of test you were mentioning, I've heard some people refer to that as the happy path: if everything goes right, here's what it should do. And there are variations on a happy path. If they have this permission, or they're this type of user, different things happen, so you cover all those paths. Then the next thing you started talking about is sort of the unhappy path, where things are unexpected and you want to make sure they're handled in a sane way.

Aaron Saray (07:00):
That's a great summary of the rambling I just did.

Joel Clermont (07:03):
That's my job, that's why I'm here.

Aaron Saray (07:05):
No, I think you're right. The second one is actually where I spend most of my time. Like I said, there's a whole discipline in unit testing, but I'll write a happy path endpoint test, and then I'll go unit test the individual components to my heart's content. When it comes to validating user input and output, though, I spend most of my time sending in deliberately incorrect data, because I want to make sure my application handles all those scenarios properly.

Joel Clermont (07:39):
Yeah.

Aaron Saray (07:40):
For example, if I have an email address field, I'm definitely going to have a test where the email address is fine. I'm going to have one where it's an invalid format, one where it's just not there at all, and one where maybe it's a duplicate of someone else's email address, because I want unique emails. I'm going to have all these different tests covering all these different scenarios. I might even have a test that sends in an email address that is a thousand characters long: what happens then?
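Aaron's list of scenarios for one email field might translate to Laravel feature test methods like these. The `/register` endpoint, `User` factory, and exact assertions are hypothetical, a sketch of the idea rather than code from the show:

```php
<?php
// Hypothetical tests for a single email field: valid, invalid
// format, missing, duplicate, and absurdly long input.

public function test_registration_accepts_a_valid_email(): void
{
    $this->postJson('/register', ['email' => 'joel@example.com'])
        ->assertCreated();
}

public function test_invalid_email_format_is_rejected(): void
{
    $this->postJson('/register', ['email' => 'not-an-email'])
        ->assertJsonValidationErrors('email');
}

public function test_missing_email_is_rejected(): void
{
    $this->postJson('/register', [])
        ->assertJsonValidationErrors('email');
}

public function test_duplicate_email_is_rejected(): void
{
    User::factory()->create(['email' => 'taken@example.com']);

    $this->postJson('/register', ['email' => 'taken@example.com'])
        ->assertJsonValidationErrors('email');
}

public function test_thousand_character_email_is_rejected(): void
{
    $email = str_repeat('a', 1000) . '@example.com';

    $this->postJson('/register', ['email' => $email])
        ->assertJsonValidationErrors('email');
}
```

Each method is one "silly thing a user might do," which keeps the suite readable even as the scenarios pile up.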

Joel Clermont (08:12):
Yeah. We've worked together, I've seen how these test suites come together, and you'll bunch things together. Somebody might listen to this and think, "Well, my form has 30 fields and that sounds like thousands of tests." But generally you have one set of tests for validating length and another for validating required fields. It sounds worse than it is; it's not that hard to write. How would you respond to somebody thinking, "Well, aren't you just testing the framework then? Aren't you testing that Laravel is doing the validation the way it should?" How do you look at that? I know we agree on this, but I want to hear you explain it, because I'm sure other people are having this thought.

Aaron Saray (08:58):
Right. Well, I think there are ways to test validation that amount to testing Laravel, and that's not what I'm talking about. "I've passed these values into my form request; now when I call rules on that form request, does it return the same values?" That's testing what Laravel is supposed to do. An endpoint test is testing that whatever you typed in, in valid Laravel syntax, is actually doing what you think it is. There have been many, many times when I've typed in something that, after reading the docs, I thought, "Oh, this should do this thing." It should limit something a certain way.
And when I look at the user interface it seems to be working fine, but when I run the tests and put some different information in there, I find Laravel is doing exactly what I told it to do. Turns out I told it to do the wrong thing. It's not about testing whether Laravel works or not. Laravel works: it has tons of tests and tons of people using it. It's about the way you configured it. A lot of times you have to understand that it's okay to test certain configuration, if that configuration is used for your business process.

Joel Clermont (10:09):
Yeah, with all those validation rules... Maybe you could make the argument that a length validation is kind of hard to mess up, right? But there are others like required_if and in, and you can write your own custom rules, so why wouldn't you test that logic? It's essential business logic. Just because it happens to be implemented using bits and pieces of the framework, it's still a valuable thing to test.

Aaron Saray (10:37):
Well, I'll give you an example of where you're almost right.

Joel Clermont (10:40):
Okay.

Aaron Saray (10:41):
But technically, you could have a bit of a challenge. Imagine a scenario where you're validating an incoming number and you need a maximum of 10. So you write max:10, and you say, "Well, that's a really easy thing not to mess up."

Joel Clermont (10:59):
Sure, yeah.

Aaron Saray (10:59):
But what if someone sends in 0x0010? Read as hex, that's actually sixteen, not ten. If they send in hex or octal formatting or whatever, that rule will say, "Yes, it's good to go," when really it wasn't. Again, it's more about the combinations of different things you put in there. You're right, it's easy to type max:10, but that rule works for strings and numbers and arrays and whatever, so it's not as easy as it looks.
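To make that gotcha concrete: in Laravel, unless a field also carries a numeric or integer rule, max:10 is evaluated as a string-length check rather than a value check. This plain-PHP sketch mimics the two readings (it is an illustration, not Laravel's actual implementation):

```php
<?php
// Two possible readings of a max:10 rule for the input "0x0010".
$input = '0x0010';

// Read as a string, max:10 is a length check: "0x0010" is only
// 6 characters, so it sails right through.
$passesAsString = mb_strlen($input) <= 10;

// Read as a number, it's a value check: 0x0010 in hex is 16,
// which is over the limit.
$passesAsNumber = hexdec($input) <= 10;

var_dump($passesAsString); // bool(true)
var_dump($passesAsNumber); // bool(false)
```

A test that sends in exactly this kind of oddball input is what catches the difference between the rule you wrote and the rule you meant.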

Joel Clermont (11:30):
No. And with these length or range validators, I ran into a particularly weird issue recently. Like you, I'll test what happens if I type in something that's too long. This field is supposed to be 255 characters, so what happens if I throw in 256? Will it give me the message I expect? Well, I had another one where I did a really bone-headed thing: I set the validation to a thousand, and I had everything in the UI proper for a thousand. But when I wrote the migration, for some reason I didn't set the length, and it defaulted to 255. So I actually encountered a bug in production where somebody tried to type in 500 characters, like they should have been able to, and it failed. It blew up because it couldn't fit in the database. How would you tackle something like that? I know what I did there.

Aaron Saray (12:24):
Yeah, I know what you did. I hope so. I think that's part of the point I'm trying to make: we want to validate what are sometimes referred to as bounds, or sanity bounds. Where do we get those bounds from? Two different places. One is business rules. Does our business allow this? For example, will the business ever sell something that's a million dollars? If not, maybe don't let someone charge a million dollars, because if they are, something's probably going wrong, some sort of program error. You can put a bound on that for your business. The second thing is more like what you mentioned: the other bounding thing is the technology we're using to persist or communicate.
If the database only allows 255 characters, then the user should only be allowed to send in 255 characters, even if the business rule says it could be unlimited. It's when you combine those two that you start to develop these different validations. Same thing when you're working with a third-party API. They might say, "We have a limitation of 16 characters for city name," or something crazy like that. You need to validate your user's information against that, because I'd rather have them decide how to shorten the name in my user interface than have me just chop it off randomly.
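Pulling together the two sources of bounds Aaron describes, a Laravel form request's rules might encode both in one place. Every field name and limit below is illustrative:

```php
<?php
// Hypothetical form request rules mixing business bounds,
// database bounds, and third-party API bounds.

public function rules(): array
{
    return [
        // Business rule: we never charge more than $10,000,
        // so anything bigger is probably a program error.
        'amount' => ['required', 'numeric', 'max:10000'],

        // Technical bound: the column is VARCHAR(255), so the
        // user can't send more even if the business wouldn't mind.
        'notes'  => ['nullable', 'string', 'max:255'],

        // Third-party bound: the shipping API caps city at 16
        // characters, so let the user shorten it themselves
        // instead of truncating it silently.
        'city'   => ['required', 'string', 'max:16'],
    ];
}
```

Writing the source of each bound as a comment also answers the inevitable "why 16?" question a year later.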

Joel Clermont (13:51):
Right. Yeah, that's a good point, and it's a balancing act. You could drive yourself nuts trying to think of every single possible way something could go wrong. But in this particular case, I encountered it in production. I'm not necessarily going to write a test every time that I can fill in up to the exact amount I'm expecting. But maybe if I deviate from the normal 255, the default in a migration, maybe I will write a test for that. I did in this case, because I like to always write a failing test when there's a bug in production and then watch it pass. But yeah, it's a little bit of an art, not so much a science, to figure out where to draw that line. And it might be different for different teams and different projects too. My one concluding thought is that your earlier illustrations talked about pizza, and I think I'm going to get pizza tonight.
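Joel's "failing test first" fix for the 255-versus-1000 mismatch might look something like this hypothetical regression test: it fails while the column is still the 255-character default, then passes once the migration widens it. The `/notes` endpoint and field names are illustrative:

```php
<?php
// Hypothetical regression test for the production bug:
// validation allowed 1000 characters, but the column didn't.

public function test_notes_up_to_1000_characters_are_saved(): void
{
    $body = str_repeat('a', 1000);

    // Validation accepts 1000 characters, so this should succeed.
    $this->postJson('/notes', ['body' => $body])
        ->assertCreated();

    // Blows up with a database error until the column is widened
    // to match the validation rule.
    $this->assertDatabaseHas('notes', ['body' => $body]);
}
```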

Aaron Saray (14:44):
Do you ever find that there are things that you do still that have proven that you're just going to be a kid forever or that you've never grown up?

Joel Clermont (14:53):
Maybe. Give me an example.

Aaron Saray (14:56):
I cannot help it. I am now in my late 30s, and whenever I receive a card I still look for money in it.

Joel Clermont (15:05):
Okay. All right.

Aaron Saray (15:08):
I know I shouldn't be expecting any. And then when I open it and there is none, I'm not upset. But-

Joel Clermont (15:15):
A little bit.

Aaron Saray (15:15):
... there could have been money in there. Yeah.

Joel Clermont (15:18):
Sure.

Aaron Saray (15:20):
There are other things that, again, this is not overly serious, but it goes through my mind just a tiny little bit. When I go into the bathroom in the dark, I kind of feel like there might be a dinosaur in the shower behind the curtain. I mean, I know there's not. But I think it has to come from being a certain age and watching Jurassic Park. I'm like, "There's definitely a velociraptor in the shower, obviously."

Joel Clermont (15:48):
You might be confusing movies. Because I don't think there was a shower scene in Jurassic Park, but I get your point. Okay.

Aaron Saray (15:57):
Yeah. I mean, what about you? Are there any things you do where you're like, "Hmm, still a child"?

Joel Clermont (16:02):
I don't know if this is a childish thing, but there are definite habits I have. One of the advantages of being married is that your wife will call these things out.

Aaron Saray (16:11):
Advantage.

Joel Clermont (16:11):
For example, it is literally impossible for me to go put gas in the car and not go in the store and buy something to eat. That is-

Aaron Saray (16:26):
Really?

Joel Clermont (16:26):
Yeah. I've even gone just to get gas for the lawnmower. I'm in the middle of mowing the grass and I run out of gas and I go get it. I'll come back and I'll have a soda and she's like, "You went into the gas station, didn't you?"

Aaron Saray (16:40):
That's a measure of whether you were incredibly poor growing up or not. For me, we'd go to the gas station and I would be looking around like, "Can I get some candy?" And dad's like, "No."

Joel Clermont (16:54):
That is where it started.

Aaron Saray (16:54):
"You cannot go in there because you're going to," I don't know, "rot all your teeth out."

Joel Clermont (16:59):
But as a kid, when you went someplace and you knew inside that building was candy and soda, you always wanted it. Now it's like, I'm an adult, who's going to stop me? That's one thing. Otherwise, no, I can't think of many specific fears, though I do have this odd fear of sunflowers. Let me clarify. Not like they're going to get me or something, but they're just super ominous-looking to me, and as a kid I felt like they kind of looked down at you.

Aaron Saray (17:33):
Oh, yeah.

Joel Clermont (17:33):
My family grew a giant sunflower in our garden this year, so I had like post-traumatic stress all summer. But that's probably the closest I could think to the examples you cited. I could air all sorts of other weird things about me, but I'll stop there.

Aaron Saray (17:50):
Do you need a little guidance on how to set up your test suite and what to test?

Joel Clermont (17:54):
We can help. Contact us for a free consultation on our website, nocompromises.io.

No Compromises, LLC