Joel Clermont (00:00):
Welcome to No Compromises, a peek into the mind of two old web devs who have seen some things. This is Joel.

Aaron Saray (00:08):
And this is Aaron.

Joel Clermont (00:16):
I love rules. And I think maybe sometimes I take it too far, so I thought we could talk about it today. When I say rules I don't mean, "I wear this blue t-shirt on Tuesday," I mean when I'm programming. But now that I say that, maybe more rules in more areas of my life would be good. But to me, a decision is a little bit of a point of friction, so if there are things that we've done a certain way and we've made a decision, like "This is how we do it," I just like not having to think about that. And I don't think I'm alone in that, I know you agree with that to a certain extent. But-

Aaron Saray (00:55):
Especially as the teams get larger, you know when everyone has their own opinion which may or may not seem wrong to you.

Joel Clermont (01:05):
Correct. Right, it can remove not just the friction of me making a decision, but the friction when somebody on the team disagrees with my one-off decision. I was thinking today we could talk more about going too far with that. I'm guilty, and that's why it's a topic that was on our list today. I'll set up the scenario. We had a bug and it was kind of a dumb bug, right? It didn't really... I'm trying to think of how much context to give. I don't think I have to give too much. But let's just say there was a test that was passing, and then the test started failing when more data was added to the test. I would call that a flaky test. I hate that, it's annoying. Thankfully, I didn't have to spend too much time trying to figure it out. But let's cut to what the bug was.

Aaron Saray (01:59):
Well, was it failing when you added more data? Or, another scenario I've run into with the same sort of problem is that when you run a test by itself it works, but when you run it in the whole suite it fails.

Joel Clermont (02:11):
You know, now that you say that, I think that might have been the actual issue. It wasn't more data in the test, but more tests dealing with the same underlying database table that that test was dealing with. Which is a very bad code smell, like something is not right there. And usually it would be the test data, right? Like, "Oh, you're being too rigid with your test." We weren't even doing that. But I'll explain. It was a POST endpoint to create something, if I remember right. And when the model was created, we called `->save()` on the Eloquent model, right? That's what you do. But we were assigning the result to, whatever the thing was called, $survey. And then we were doing a redirect to the thing you just created and using route model binding to pass $survey into that. So to save the suspense, there's probably a portion of people that are yelling at their podcast player right now, like, "That's stupid. Why would you do that?" And there's another portion that are like, "Well, why didn't that work? What you said sounds reasonable." Well, the underlying reason is that calling save() does not return the model you created, it returns a true or false. Lovely PHP will take true and say, "You know what? That looks an awful lot like the number one." So when you do route model binding, if you pass true to a parameter that expects a model or a model ID, it's going to be treated as ID 1, which, when you only have one record in the table, passes. But then all of a sudden if you're asserting, "Hey, I should have got redirected to /2," it's like, "Nope, it's still /1." And I'm like, "What is going on here?" That was the bug. Did I explain that in a sensible way there, Aaron? Or is there more you want to add to it?
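
A minimal, runnable PHP sketch of the bug Joel describes. `FakeModel` is a hypothetical stand-in, not a real Laravel class; it just mimics the Eloquent return types that caused the problem:

```php
<?php
// FakeModel is a stand-in that mimics Eloquent's return types.
class FakeModel
{
    public int $id = 1;

    // Eloquent's save() returns a boolean, not the model.
    public function save(): bool
    {
        return true;
    }

    // Eloquent's create() returns the new model instance.
    public static function create(array $attributes): static
    {
        return new static();
    }
}

$survey = (new FakeModel())->save(); // bug: $survey is now `true`, not a model

// In a loose comparison, true "looks an awful lot like the number one",
// so a route parameter expecting an ID resolves it to record 1, and the
// test keeps passing only while the record under test happens to be /1.
var_dump($survey == 1); // bool(true)
```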

Aaron Saray (03:55):
Yeah. I mean, I think the only other bit of context is, it used to be a statement where we were just passing validated data into the create() method and then assigning it to the variable. And then it was updated that we had some guarded properties that we were not accepting from the user, but we knew about them in the controller. So we had changed it to fill() and then forceFill() just for the one guarded property.

Joel Clermont (04:21):
Got it, yeah.

Aaron Saray (04:22):
And then chained that to save(). So it went from a create(), which returned the model, to a set of fills and a save(), which returned a Boolean.
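
The refactor Aaron describes, as a Laravel-style sketch. The `Survey` model, route name, and guarded attribute are illustrative, not from the actual project:

```php
// Before: create() returns the new model instance.
$survey = Survey::create($request->validated());

// After: fill() for user input, forceFill() for the one guarded
// attribute, then save() -- which returns a boolean, not the model.
$survey = (new Survey())
    ->fill($request->validated())
    ->forceFill(['owner_id' => $ownerId]) // hypothetical guarded attribute
    ->save(); // bug: $survey is now true/false

return redirect()->route('surveys.show', $survey); // true coerces to ID 1
```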

Joel Clermont (04:30):
Okay.

Aaron Saray (04:31):
So in case you're wondering how we got to there, yeah.

Joel Clermont (04:33):
"We're not idiots," is what you're saying.

Aaron Saray (04:36):
I hope not.

Joel Clermont (04:36):
You had to provide a reasonable path that we could have got ourself into that kind of awkward situation. But no, that's good context. And I think when I was troubleshooting this, I didn't even have the context. I was just looking at it... And maybe this is just me, but it's like, "Oh, save() that returns a model." Like, without stopping to pause to think about it. That's why it took me more than a half a second to figure out, it took like 15 minutes because I was glossing over that important detail.

Aaron Saray (05:03):
I want to go on one tiny tangent. We might have talked about this before, but one of the things we do is we tend to assign ID properties in our factories and stuff to anything but the first value. Because, one, it equates to true and vice versa in a non-strict comparison, which is how a lot of PHP is configured. That's one reason. I mean, this scenario, it wouldn't have really affected-

Joel Clermont (05:34):
No.

Aaron Saray (05:34):
... but that's one of the reasons why. Because we've run into things like this before where you're like, "Oh, well, the first one equates to ID 1, but also to true."
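
The loose-comparison behavior Aaron is referring to, in plain PHP:

```php
<?php
// In a non-strict (==) comparison, true and 1 are interchangeable,
// so a bug that leaks `true` where an ID belongs still "passes"
// whenever the record under test happens to have ID 1.
var_dump(true == 1);  // bool(true)
var_dump(true == 2);  // bool(false) -- a factory record with ID 2 exposes the bug
var_dump(true === 1); // bool(false) -- a strict comparison would catch it
```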

Joel Clermont (05:44):
Right. Yeah, that's a very common root cause for a flaky test, is relying on a certain number being there.

Aaron Saray (05:52):
Oh, and that's why I think we never noticed it when we were developing until we got the tests and stuff too. Because, you know, in a perfect world you write your test first and all that stuff. But honestly sometimes we'll mix those up so you'll write some code, you write some tests back and forth. So I think a lot of times it was like making the first Survey or blog post or whatever it was and then blowing away all the data and trying it again, and trying it again. It was always the first one.

Joel Clermont (06:19):
Right, yeah. And that's good. So creating a record is a little weird because you can't explicitly set the ID, and that's one of the reasons we got into this scenario. All right. Me liking rules, I said, "You know what, Aaron..." Oh, one more detail is PHPStan didn't catch this. Like, we use PHPStan, and at the level we're on it didn't even complain about this. I'm like, "You know what, it should." Like, if I, for example, did $survey, I think that was the variable, if I do $survey->id, PHPStan will catch that and be like, "There's no id property on a Boolean value." And I said, "You know what we should do? We should have a rule that anytime you do route model binding, you can't just pass the model in. You have to do $model->id." And what did you say, Aaron?

Aaron Saray (07:10):
No.

Joel Clermont (07:11):
And my reason, "PHPStan would've caught it then."

Aaron Saray (07:14):
Yeah, my answer was no. I mean, the first thought is because, before I even fully understood route model binding on earlier Laravel versions, I did that. I would go and say, "Take the model and pass it in." So for that property, if it was named like $survey, I would pass in $survey->id. Then I was like, "Well, the ID isn't always the key." So then I-

Joel Clermont (07:41):
That's true.

Aaron Saray (07:41):
... moved to saying, "Use getKey(), so $survey->getKey()." Then either Laravel got better or I learned more, I can't remember what the timeframe was, where I was like, "Oh wait, I don't even have to do that. I can just pass in the model and it's smart enough to know how to retrieve its key." And that's what route model binding is, I shouldn't have to understand that there's an ID underneath.
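
The progression Aaron describes, as a Laravel-style sketch (the route name is illustrative):

```php
// 1. Explicit ID -- assumes the primary key is literally "id".
return redirect()->route('surveys.show', $survey->id);

// 2. getKey() -- works whatever the primary key column is named.
return redirect()->route('surveys.show', $survey->getKey());

// 3. Just pass the model -- route model binding resolves the key itself.
return redirect()->route('surveys.show', $survey);
```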

Joel Clermont (08:02):
Right. When you said that, it clicked for me, because it's like, "Why would we take, you know, an ergonomic step backwards as developers because this bug happened one time?" You know, it'd be different if every other week we were bumping into something like this and we're like, "This is just frustrating, it happens all the time." But this was really an edge case of an edge case. Number one, we don't normally code ourselves into that situation. Number two, it just doesn't happen that frequently. You talked me off the ledge: first of all, we shouldn't add a rule, but also the rule would have required us to do something that had more negative effects than positive effects. Is that a fair way of summarizing it?

Aaron Saray (08:51):
Yeah, I think that's the case. I think what I had ended up saying too was like, "You're right. This is a problem that I wish we could fix, this is a problem that may be fixed in the future with Laravel. But at this time, is this a problem we actually need to address? Because it probably isn't happening." So that was more, the whole point is like, "I agreed that needed to be fixed," and I think that's a huge difference there. Like, some people when they don't like the solution, they'll say, "Well, I don't agree that that's the problem then."

Joel Clermont (09:20):
Oh, okay.

Aaron Saray (09:21):
It's like, "No, I see the problem. I'm just not satisfied with your solution yet." And that's perfectly fine because it's only happened once. Now, like you said, if it becomes a pattern then we need to put our heads together and think of multiple solutions and see what is the best one to apply for our projects.

Joel Clermont (09:38):
Such a reasonable person. I love it, Aaron. All right. One more example, it's a slightly different example. Not so much about rules of how you write... Well, I guess it is rules of how you write code, but it's more about syntax. I got a question recently: "What do we prefer to do in a conditional where you have a variable but you're testing that it's not that? You know, !$var. Do you put a space between the exclamation point and the dollar sign? Or do you not put a space?" And just to summarize, you would put a space because it makes it easier to see, and maybe not gloss over the fact that you're negating that expression and not just testing it. But it also looks a little weird, at least to my eye. So should we create a rule here? What do you think? It's a coding standard, it's not really like how we... I don't know, what's the difference? Because the first thing is not really a coding standard, it's more of a development standard maybe.
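
The two spellings in question, in plain PHP (the variable name is made up):

```php
<?php
$isPublished = false;

if (!$isPublished) {    // compact; the negation is easy to gloss over
    echo "draft\n";
}

if (! $isPublished) {   // spaced; the style Laravel's own codebase uses
    echo "still a draft\n";
}
```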

Aaron Saray (10:40):
Yeah. For this one too, I was like, "Well, I don't know if we need to make that rule yet." First of all, let's make sure that we look at the PSR-12 stuff and then the newest updates-

Joel Clermont (10:51):
I did, yeah.

Aaron Saray (10:51):
... that are coming out too. Which I think there's another standard that's being worked on. Just trying to understand, you know, is there anything in here that we're missing already?

Joel Clermont (11:00):
Yeah. And I did look into it, right? Well, I guess I'll admit one thing first. In the project I was in, I just did a global find, and I actually found both of them. The one without the space was far more frequent, but that's kind of the thing that even triggered it as a discussion. It's like, "Well, we're not being consistent, so should we fix that?" But it is not declared in PSR-2 or PSR-12. It's not even an available sniff in PHP CodeSniffer, nor in the Slevomat extensions that we use, which seem to have a sniff for everything already built. It's not even in there. I guess, to me, that was also another thing, like, "Hmm. Yeah, we could build one and it probably wouldn't take that long," but it's not really at the top of the priority list right now.

Aaron Saray (11:45):
Yeah, it's not at the top of the priority list. And I also kind of look at it as one of those things where, when do you stop making rules? Is someone really going to have an argument about the space before or after that, you know? And is that really going to change the output of your work? Another reason for the coding standard and stuff is that we want to make sure that, reasonably, most people can develop the same way, can read it, can start developing pattern matching in their head when they're doing code reviews and stuff like that. There are certain things that aren't going to help that. I think about tiny little bits of spacing around a variable, or, when you do line breaks, where do you break up multiple concatenations of a string? Is it always one line, or do you kind of bring them together? There's a couple things in there that don't really matter. And if that's really getting to you, then maybe you should go into QA and not be a programmer.

Joel Clermont (12:40):
You know, while you were saying that? I had a half a thought, which made me laugh. Which is we should have a rule for when we define rules. And I'm like, "Joel, just stop it."
I noticed a peculiar habit, maybe compulsion, I have. And I'm just really hoping, Aaron, I'm not alone, so I'm going to share it with you and let you react to it. Have you ever been at a friend's house or a family member's house, and you're just kind of sitting around and the TV is on, and you're like, "I have to change their TV settings, I can't take this"? Like, the resolution is wrong, or it's stretched, or it's got that smooth motion, you know what I'm talking about on newer TVs? It just looks weird to me. So I did this the other day and I'm like, "Hey, where's your TV remote?" Because they had a different universal remote, but it didn't have enough buttons on it to change stuff. And everyone made it seem like I was weird for making their TV look objectively better. Looking at your face, I can tell you're not going to back me up on this one, Aaron.

Aaron Saray (13:55):
No, I'm not a jerk.

Joel Clermont (13:56):
Oh, so you view it as a power move. Like, "Let me tell you how your TV should be."

Aaron Saray (14:01):
Yeah. It's like when people come in and they tell you how to be a programmer and make a whole podcast telling you their thoughts and stuff.

Joel Clermont (14:08):
I would hate somebody like that.

Aaron Saray (14:09):
How arrogant.

Joel Clermont (14:11):
Well, I would say that's true if they had consciously chosen those settings, but most people, they unbox the TV, they plug it in, and that's what it is. Or they sat on the remote one time and it changed to stretch mode somehow and they didn't notice, I can't imagine how you don't notice that, and they just left it. Does that save me at all?

Aaron Saray (14:33):
No, this sounds like someone for whom TV maybe wasn't important anyway.

Joel Clermont (14:38):
Right, maybe.

Aaron Saray (14:41):
So whatever they set up, let them live their life.

Joel Clermont (14:45):
I can't. I've even done this in a hotel room. I'm never going to be back there again, I'm never going to look at the TV again. I'm like, "Oh, this has got to go."

Aaron Saray (14:55):
Well, that's different because you're using it or whatever. But that's a shared TV or something like that. It'd be like if you went over to my house and then I'm watching, you know, Top Gear and you're like, "I like to watch football," and you just change the channel without asking me. It's like, "No, but you can do that in your hotel room. Doesn't matter because you're the only one using the TV at that time so you can change the channel."

Joel Clermont (15:18):
You were no help at all.

Aaron Saray (15:20):
Yeah. I don't know why you're such a jerk.

Joel Clermont (15:24):
Ah.

Aaron Saray (15:29):
We've talked about going too far with coding standards, but most teams don't have enough.

Joel Clermont (15:33):
That's something we can help with. Give us a call by heading over to our website at nocompromises.io.

No Compromises, LLC