What should I test?
Joel Clermont (00:00):
Welcome to No Compromises, a peek into the mind of two old web devs who have seen some things. This is Joel.
Aaron Saray (00:08):
And this is Aaron.
Joel Clermont (00:15):
Testing is a popular topic in the Laravel world. And one of the things I've noticed, especially coming into different projects, and I'm talking specifically about projects that have tests, not the ones that have like zero tests or just example tests.
Aaron Saray (00:31):
Yeah. Because just to clarify, if there's a library shared on any sort of website, like a news-related site, I'll click through and I'll look at the test folder. If the test folder's empty or it just has an example test, I don't care what the library is, I'm not going to use it. But anyway, sorry.
Joel Clermont (00:46):
I've done that too. Okay. So we're talking about like, we're onboarded into a project, it does have some tests. What I've observed is that very rarely is there any sort of consistency as to what is being tested. And let's kind of start at the feature test level, which is mainly thinking about how a user would interact with the system through a browser. Not the Dusk-style test, but the typical CRUD actions.
Aaron Saray (01:15):
Yeah, it's user input and it's measuring the side effects of that input.
Joel Clermont (01:21):
Okay, yeah. The way I think about it, and kind of the habit or pattern we've settled into, is our tests follow a certain structure. Like, the way I think of it is kind of like from the outside of the request in. So starting all the way with authentication, to me, that's the very first thing that'll probably get checked and will either pass or fail. So when we're writing tests, we would write one, maybe more, tests that assert, like, you have to be authenticated or you have to have this role and if that fails, it should be-
Aaron Saray (01:56):
Those are two separate things. That's authentication and authorization.
Joel Clermont (02:00):
Correct, two separate tests. But I'm just talking about the sorts of things to test. Because so often, and maybe I'll generalize here, a developer will jump right into the happy path of a test, which is great, right? Like, you want to make sure the successful path that hopefully the majority of your users will take is well tested. But even in the file, like, I'm not looking at a test file, but in my mind I'm seeing one of our PHPUnit classes, right at the top it will be test fails authentication, test fails authorization if there's a specific role. So that's important and it's easy to overlook, and you probably want to have that for all of your authenticated routes, right? I mean, it seems kind of repetitive or stupid.
Like, well, if I have a hundred routes all in one group and they're all using the same middleware, should I just check one or two of those? Like, we check all of them, right? It's just like, you're in the feature file, maybe you're testing the index method. Yeah, test authorization, then do it again on the get method, do it again on the show. Like, you know what I'm saying? Like, these are the things that are easy to overlook and may seem stupid, but I would argue it takes like 10 seconds to write a test like that. And down the road, if you refactor your routes file or you add more complexity, now you have the safety net, right? That's kind of the first example. We'll do kind of those AuthN and AuthZ, is that the hip term people use to say authentication and authorization? Aaron's looking at me like I'm acting crazy.
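A rough sketch of the kind of tests Joel is describing might look like this. The PostController routes, the login redirect, and the role-based authorization are all assumptions invented for the example, not something from a specific project:

```php
<?php

namespace Tests\Feature;

use App\Models\User;
use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class PostControllerTest extends TestCase
{
    use RefreshDatabase;

    public function test_index_fails_authentication(): void
    {
        // A guest should get bounced to login before any controller code runs.
        $this->get(route('posts.index'))
            ->assertRedirect(route('login'));
    }

    public function test_index_fails_authorization(): void
    {
        // An authenticated user without the required role should be forbidden
        // (assumes some role middleware or policy guards the route).
        $user = User::factory()->create();

        $this->actingAs($user)
            ->get(route('posts.index'))
            ->assertForbidden();
    }
}
```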
Aaron Saray (03:38):
I don't know. You said hip and cool and I just immediately turned off.
Joel Clermont (03:42):
Your eyes glazed over. But we'll start there. Then generally, we'll do like the happy path, like the success. And depending on the complexity of the controller action, there might be a couple, right? Like, to jump ahead a little bit, if we're creating a record there may be some optional fields. So we'll do one success path with just the bare minimum fields that are required, then we'll do another success, and maybe call it like "success with optional," that has all of the fields. And if there's interactions between those fields, we'll kind of test some varieties of those interactions, all as part of the happy path. And then, this is maybe a little controversial, but we'll test validation, like a lot of our tests are-
Aaron Saray (04:29):
Well, before we get into that. The other thing is not only, like, the optional stuff, but when you get into the actual controller itself, part of the success path is, like, are there any if statements in that success path?
Joel Clermont (04:41):
Sure.
Aaron Saray (04:41):
Or one thing that people often miss is optional or null coalescing. So those don't look like standard if branches, but they are, because using the optional() helper around something means that you already have two scenarios now that are both successful.
Joel Clermont (04:58):
I like that.
Aaron Saray (04:59):
So before we go to validation, it's usually like: bare minimum success, success with the optional stuff added into the incoming request, and then try to follow the success paths inside the controller. Because a lot of times there's just one, but there could be more.
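To make that concrete, a sketch of those two success tests might look like the following, assuming a hypothetical posts.store route with a required title and an optional summary. Aaron's point about optional() and null coalescing means each of those internal branches would earn its own success test in the same way:

```php
public function test_store_succeeds_with_required_fields(): void
{
    $user = User::factory()->create();

    // Bare minimum: only the required field is submitted.
    $this->actingAs($user)
        ->post(route('posts.store'), ['title' => 'My first post'])
        ->assertRedirect(route('posts.index'));

    $this->assertDatabaseHas('posts', [
        'title' => 'My first post',
        'summary' => null,
    ]);
}

public function test_store_succeeds_with_optional_fields(): void
{
    $user = User::factory()->create();

    // Same action, but with the optional field filled in too.
    $this->actingAs($user)
        ->post(route('posts.store'), [
            'title' => 'My first post',
            'summary' => 'A short summary',
        ])
        ->assertRedirect(route('posts.index'));

    $this->assertDatabaseHas('posts', [
        'title' => 'My first post',
        'summary' => 'A short summary',
    ]);
}
```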
Joel Clermont (05:15):
Yeah, definitely. And I'm purposely not talking about the guts of the test. Like, "Here's what we assert and here's how we do the setup." To me, that's a different topic. This is kind of high-level, like, "What are the things we test?" Maybe in a future topic, we can drill into some of these in more detail. But, yeah, good interjection because I think that's an easy thing to miss when you're looking at the code and scanning it for things to test. But validation-
Aaron Saray (05:43):
Which, I don't agree that testing validation is controversial, or whatever you just said. I don't know.
Joel Clermont (05:48):
Well, maybe controversial isn't the right word or it's like too strong of a word, but like overlooked or undervalued or... You know, some people might say, "Well, you're testing the framework," and we really aren't. Like, we sort of are by proxy but that's not the intention. Like, validation is some of your most important business logic in my view.
Aaron Saray (06:09):
Well, with that logic almost all tests are testing the framework because they use the framework.
Joel Clermont (06:12):
Right.
Aaron Saray (06:13):
So I guess the way I look at it is there's nothing special about your app that requires you to use Laravel to build it, that's just the tool that you use. So when you talk about solving a problem, you don't think, "Well, I could only solve that with Laravel." You think, "I want to solve a problem and I could use Laravel to do that." So it's the same thing with tests, just because it's in Laravel doesn't mean the test has to understand what Laravel is. In fact, a lot of the time we write those tests specifically so they don't understand what Laravel is.
We might use the tools that are available in Laravel, but just because we happen to know in our brains that if you configure validation this way, Laravel is going to use it this way, you don't want to know that in your test. You want to assume... I mean, you wouldn't do that. But theoretically, you could toss out the application layer of something and replace it with a... You could have a Python app and still call the same endpoint, sort of like, in a test. You wouldn't, but you could.
Joel Clermont (07:10):
Yeah, I was just like, "Please, don't do that." But, yeah. So maybe I will lie a little bit about what I said before and get into the details just to demonstrate what you're talking about. Like, in a normal HTTP POST, if something fails validation, what Laravel will do is send back a redirect, right? So we're not an API, we're like actually in a browser. It's going to redirect back to the page you came from, and in the session there's going to be an errors key, and there's going to be fields in that errors key with messages. We don't assert a 302, that's a framework implementation detail. We don't assert, "Oh, it's a 302, back to the page you came from," that's not the important part of the test. I think that would be more testing the framework, right?
Aaron Saray (07:58):
Mm-hmm (affirmative).
Joel Clermont (07:58):
Versus asserting a specific behavior. Now, validation is a little tricky because you do have to rely on some framework implementation details to make a proper assertion. Like, if I'm testing that my required validation is working, we would submit an empty form request or an empty payload and expect that all of the fields that have a rule that says required would come back in that errors key. We are asserting a little bit knowing how Laravel is implementing that. And we have to go a step further, and I think this is where maybe it's a little more controversial, because in our case we assert the message coming back. And the only reason we do that is to make sure that it's the required validation that failed and not some other rule on that field. It'd be great if Laravel gave us a way to access the rules that failed, but we don't have them in the session. So the only thing we do have is the message that was returned.
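A minimal sketch of that required-validation test, still assuming the hypothetical posts routes and a required title field:

```php
public function test_store_fails_validation_when_required_fields_missing(): void
{
    $user = User::factory()->create();

    // Submit an empty payload so every `required` rule should trip.
    $response = $this->actingAs($user)
        ->from(route('posts.create'))
        ->post(route('posts.store'), []);

    // Asserting the exact message, not the 302 itself, is what tells us
    // it was the `required` rule that failed and not some other rule.
    $response->assertSessionHasErrors([
        'title' => 'The title field is required.',
    ]);
}
```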
Aaron Saray (09:02):
Yeah, if the error key was firstname.required and then it had a message or something-
Joel Clermont (09:08):
That'd be great.
Aaron Saray (09:09):
I mean, that'd be really hard to implement programming-wise, but I don't think we're asking for that. But, yeah, that's the reason why. Because otherwise, if you just say, "Well, there's an error in that field," which one was it? And maybe you have a custom rule class and it's just not even ever calling that.
Joel Clermont (09:28):
Right, exactly. Required is probably not the most interesting rule to talk about because that's pretty straightforward. But when you get into things like required_with or exclude_if, where you have these interactions, I think those are really valuable tests to have. Kind of like what I was saying about the auth tests earlier. Yeah, maybe it's dumb to write a required test, like how could you ever get that wrong? But it's very simple to do. That test will not take you any time at all to write, and it just gives you some extra confidence. Like, you didn't forget it, or somebody didn't delete it later, or something along those lines.
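For a rule interaction like required_with, a sketch might cover the side of the interaction where the dependent field is omitted. This events example is invented for illustration and assumes a rule like 'end_date' => 'required_with:start_date':

```php
public function test_store_fails_validation_when_end_date_missing(): void
{
    $user = User::factory()->create();

    // start_date is present, so end_date becomes required.
    $this->actingAs($user)
        ->post(route('events.store'), [
            'title' => 'Conference',
            'start_date' => '2024-06-01',
            // end_date intentionally omitted
        ])
        ->assertSessionHasErrors([
            'end_date' => 'The end date field is required when start date is present.',
        ]);
}
```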
Aaron Saray (10:04):
Yeah. I don't know where this mythical group of programmers is that does everything right. But I know I have been writing Laravel apps and PHP apps for many years and I consistently mess things up, and thank goodness for tests. When you say required is not that hard, I'm like, I've forgotten required on fields even though... Like, I put it in the HTML, forgot to put it in the rules for some reason. I don't know what happened. That's the first thing I go for, and then I write my test and I'm like, "Oh wait, whoops."
Joel Clermont (10:34):
Yeah. And I don't necessarily practice this 100% of the time, but if you wanted to experiment with doing like a test-driven flow, the idea is you don't write the code until you have a failing test. I have done it, I don't do it all the time, but it is kind of interesting to write the failing test, then add the validation rule to make it pass. And sometimes in doing that, I've even been surprised, like, "Oh, I got something wrong there that I didn't think..." That if you would have asked me before, I'm like, "Yeah, that's really simple. I would never get that wrong." Well, I did, so yeah. Anyways, that's kind of a high level and we could go into a lot of other depth. Like if you're interacting with APIs, or if you're queuing up jobs, or you're doing notifications, those are the guts of what you want to test. And a lot of that stuff would be kind of in that success path that we talked about.
But sort of structurally, like a feature test, we group it by the actions, whatever CRUD actions are in that controller. And within each group, like the index or the get or whatever, we have authorization, then all the happy paths, then a bunch of failure modes. Most of which are validation, but some could be other things too. It also makes it easy to scan your test file. Like, these files can get big, and when you're looking for something, it's like, yeah, just search for "test create," you'll jump right down to that section. Or search for "fails validation required," boom, you can get right to that method. Like, it's pretty easy to navigate a file when you have sort of a consistent approach.
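Structurally, that grouping and naming convention might look something like this skeleton. The method names are just one possible convention, sketched here to show how the grouping makes a big file searchable:

```php
class PostControllerTest extends TestCase
{
    // index
    public function test_index_fails_authentication(): void { /* ... */ }
    public function test_index_succeeds(): void { /* ... */ }

    // create / store
    public function test_create_fails_authentication(): void { /* ... */ }
    public function test_create_succeeds_with_required_fields(): void { /* ... */ }
    public function test_create_succeeds_with_optional_fields(): void { /* ... */ }
    public function test_create_fails_validation_title_required(): void { /* ... */ }

    // ...and the same pattern repeats for show, update, and destroy.
}
```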
Aaron Saray (12:05):
I think the last thing I'd add to this, just... I don't want to go too much out on a limb here. But by writing these tests and going so far into things like validation and stuff, it really helps when tech support or anyone, like users, come to you and they're like, "Well, I didn't do this," or, "It made me enter this." The standard messages that users have where you're like, "Hmm, I get that that's what you think happened, but that's not true."
Joel Clermont (12:34):
Didn't work.
Aaron Saray (12:34):
Well, no. But I mean, when they say things that are actually wrong. Like, "It never required me to put in a title." Like, "It sure did." I have the tests run four times a day, so I know it absolutely is a required thing. But you don't say that to them... That will at least tip you off to look at something different.
Joel Clermont (12:57):
Yeah, there's multiple benefits of being thorough in your tests.
Aaron Saray (13:08):
It amazes me that babies can talk. Well, not baby but (unintelligible 00:13:15).
Joel Clermont (13:15):
Give me an age.
Aaron Saray (13:16):
I don't know when they learn to talk. But the point is, I was visiting a friend and I saw her spawn, and it was learning to talk, and everyone was talking in this high-pitched weird voice back at it.
Joel Clermont (13:29):
Sure.
Aaron Saray (13:30):
I'm like, "Isn't that messed up?" Like, we expect them to learn to talk but when our first interactions with them are not how you talk. People think it's weird when... I've talked about this before. When I see a baby, I'm like, "Hey, nice baby." But I also talk to babies in the same way as I talk to you. "Well, hello baby. How are you doing? How's work? Nope? You don't work yet? Okay," and then I walk away. But my thought process is that I'm teaching them. Actually, I'm teaching them better than their parents really how to speak.
Joel Clermont (14:01):
You're right. I have to go back, did you say her spawn?
Aaron Saray (14:07):
Oh, yeah.
Joel Clermont (14:09):
I'm sure parents love hearing their kids talked about that way. But you know as you were talking about that, the other parallel I see is the way people talk to puppies, right?
Aaron Saray (14:20):
Yes.
Joel Clermont (14:22):
You know what? And that's why puppies grow up to talk that way. And it could have been different, Aaron.
Aaron Saray (14:28):
Well, no, it's just like dogs don't understand and so when we come and see them, we're like, "Oh dog." And they're all excited and they're like basic... Since they don't understand what we're saying, what they hear is bark, bark, bark. Because we can't understand what they're saying. So basically we bark at them and then they bark back and we're like, "Stop barking." No wonder dogs sometimes get all messed up because they're like, "Well, you barked at me, I barked back. And then you said, 'Don't bark.' What do you want from me?"
Joel Clermont (14:56):
So confusing. Well, next time you see a baby just say, "Who's a good boy?" And scratch their tummy or something. Oh, would that be weird.
Aaron Saray (15:06):
We're all smart and we can figure stuff out, but maybe you just don't have the time and you're looking for some pros that have done it before and can save you some time and jumpstart the project.
Joel Clermont (15:15):
Well, I know two guys that can help, their names are Joel and Aaron. If you want to talk to us, head over to nocompromises.io and book a free call, see how we can help.