Roush drivers, computers finally on same page

January 31, 2011, Mark Aumann, NASCAR.com

Issues with algorithms led to slow start; fixes have team hitting on all cylinders

After struggling to get out of their own way each week during the first half of the year, Roush Fenway Racing cars suddenly were a threat to win at every race in the late summer and fall. Greg Biffle and Carl Edwards each won twice, with Edwards taking the final two races of the season.

So what changed? Blame it on a programming error.

"He makes a change and the software says it's going to make the car do this. If I come in and say unprompted, 'Hey, it's doing this' and it's the same, then we know we're on the right track."

--CARL EDWARDS on Bob Osborne

Since NASCAR outlawed on-track testing, teams like Roush Fenway have relied on simulation software to predict the best setups for each track. However, the resulting data was apparently leading Ford's flagship teams in the wrong direction, and the errors weren't corrected until the midpoint of the year, team owner Jack Roush said during last week's media tour stop.

"The effort we made over the winter with our computer algorithms to support our simulations did not work out," Roush said. "They didn't correlate with real-time, actual impact of changes to the race track. We figured that out early on. By Bristol we knew we were in trouble.

"The changes we made over the winter through our third-party vendors didn't correlate and that ... relegated the crew chiefs and drivers to make single variable changes to be able to evaluate: Is it the wedge, is it the camber, is it the nose weight, is it the swaybar front or rear? By mid-year, by Chicago, we had that sorted out. We got back on track."

Biffle explained it with an analogy.

"If you're going to a resort and I give you the wrong directions, and say, 'Here you go. I'll print them out for you,' and you go off in your car, you're not going to have a very good time," Biffle said. "You're going to struggle. You're going to stop at three or four gas stations and say, 'Where am I going?'

"The worst thing is, if we didn't tell you where to go, if we didn't hand you that piece of paper with the wrong directions, you'd have been much better off. You'd have been able to find your way and know you're not following a road map that's wrong."

Biffle said when teams get to the track, they try to match their findings with the simulation. But sometimes the variables are so different that a team might not realize the data is misleading.

"That's part of simulation. That's part of testing," Biffle said. "When you get false information and you go to a track -- the track's real green and has a lot of grip -- and you go and figure out here's the right bump stop, here's the right shock, here's the right spring. You come back, it's hot, it's sunny, it's slick -- all those things are wrong. But we know in our minds, they're right because we tried everything else and this was the best."

The most difficult thing, according to Biffle, is not knowing when to trust the computer and when to trust your own instincts.

"Same with simulations, same with testing. You have to be able to wade through what's right and what's wrong," Biffle said. "A lot of times when you can't predict it properly, you just never know when you're predicting it right. The only way we know is if they predict we should use 'this and this and this,' and we go to the race track at Daytona and 'this is faster and this is faster and this is better.' Then we know the prediction was correct. That's all we can do."
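The loop Biffle describes — predict a set of single-variable changes, try them at the track, and trust the simulation only to the extent the stopwatch confirms it — can be sketched as follows. This is purely illustrative; all names and data here are hypothetical, not any team's actual software:

```python
# Illustrative sketch of the validation loop Biffle describes: accept a
# simulation's predicted setup changes only when on-track results confirm
# them, and track the overall prediction success rate.

def validate_predictions(predictions, track_results):
    """Compare predicted effects of setup changes against measured ones.

    predictions: dict mapping a setup change (e.g. "softer springs") to the
        simulation's predicted effect, "faster" or "slower".
    track_results: dict mapping the same changes to what the stopwatch
        actually showed at the track.
    Returns the fraction of predictions the track confirmed.
    """
    if not predictions:
        return 0.0
    confirmed = sum(
        1 for change, predicted in predictions.items()
        if track_results.get(change) == predicted
    )
    return confirmed / len(predictions)

# Example: three single-variable changes, two confirmed at the track.
sim = {"softer springs": "faster", "more wedge": "faster", "less nose weight": "slower"}
track = {"softer springs": "faster", "more wedge": "slower", "less nose weight": "slower"}
rate = validate_predictions(sim, track)
print(f"Prediction success rate: {rate:.0%}")
```

A rate moving from around 50 percent toward 70 percent, as in Biffle's rough figures, is the kind of signal a team would use to decide how much weight to give the simulation versus the old notebook.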

Roush engineers believe they solved the problem, and Biffle agreed.

"We feel like at the end of the season, we got better at the directions," Biffle said. "They weren't 100 percent. They'll never be 100 percent. But instead of going from a 50 percent success rate, we maybe moved to a 70 percent. I don't know if my ratios are right, but it was better. I think your maximum achievement might only be 80 or 90 percent, because the weather and tire and things like that can change slightly. We think we've even learned more from that and will be better once the season starts."

Edwards said the best thing he can do is offer constructive feedback to crew chief Bob Osborne and not try to outguess the simulation.

"As a driver, once I start worrying about shocks and springs and swaybars, I can get myself into such a knot that whatever they change, I already think I know what it's going to do," Edwards said. "And it messes me up. Bob and I have a really good program with each other. He goes and he works on the software, comes back to me, we go through and verify.

"He doesn't tell me what to expect. He goes through and we validate what he's seen in the software through practice, and it seems to be working better than ever. When he makes a change, what I say is what he expects."

It's all about verification, Edwards said.

"He makes a change and the software says it's going to make the car do this," Edwards said. "If I come in and say unprompted, 'Hey, it's doing this' and it's the same, then we know we're on the right track. There's been more of that."

That's very similar to the way Biffle and crew chief Greg Erwin use the data. And the more confident teams get in their ability to forecast how the current chassis will react to changes in bump stops, springs, shocks and sway bars, the more information crew chiefs will have at their fingertips. It wasn't that long ago that crew chiefs relied on an "old faithful" handbook stuffed with notes for each track.

"You have to have a game plan going in," Biffle said. "At the same time, we've used this car enough and we've gotten to where we're starting to develop some consistency, that if we didn't have that information, we have 'old faithful.' At this type of race track, we're probably going to use this bump stop, and these springs are probably going to be close, the track bar should be set about right here. We have a general idea. That's how we used to go to the track in the old days.

"As we get better and better, and our simulation gets better in its predictability, when it isn't there or does fail and lead us down the wrong road, we'll have a damn good idea where we need to be at this kind of race track. Once things get more consistent -- the tire kind of quits changing -- once things kind of stay the same and you go to the track two, three or four times, you're going to get it nailed down with experience and a handbook. Then the simulation is less important at that point."