I was looking through the test system config files for Linux (and then Mac) yesterday and noticed something suspicious. In the DetectCrash.m file for Linux, there is a 5 second pause at the start of each script run:

    obj.StartupDelay = 5;  % wait for GMAT to start
On Linux, GMAT starts up pretty much right away. The console app, which is what we use in the test runs, starts even faster.
We run 12,920 scripts per test run (that number grew with yesterday's commits). At 5 seconds per script, the pause imposes roughly 17.9 hours of overhead on the run. I tried a couple of things to see whether the delay is actually needed. First, I ran the Smoke tests and checked the run time: 13+ minutes. Then I cut the delay to 2 seconds and saw significantly better run times, around 8 minutes. Since I was waiting for a code commit before rerunning the full test system, I kicked off a full run using the same configuration as the 8/27 run. The results are attached; the file shows one line that differs from the 8/27 run. This one:
8/27 (5 sec delay): Run Time: 03 hours, 05 minutes, 03 seconds
Note that that is actually 1 day, 3 hours, 5 minutes, 3 seconds. Yesterday's run (the date reflects when the data was collected) reported this:
8/31 (2 sec delay): Run Time: 16 hours, 30 minutes, 27 seconds
The run time is about 10.6 hours faster (27:05:03 vs. 16:30:27), which matches the expected savings of 12,920 scripts × 3 seconds each, roughly 10.8 hours.
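For reference, here is the back-of-envelope arithmetic behind those numbers (a quick sketch; the 12,920 script count and the per-script delays are the figures quoted above):

```python
# Startup-delay overhead for one full test run, given a per-script pause.
SCRIPTS = 12920  # scripts in a full test run

def overhead_hours(delay_s):
    """Total startup-delay overhead across all scripts, in hours."""
    return SCRIPTS * delay_s / 3600.0

print(f"5 s delay: {overhead_hours(5):.1f} h of overhead")  # ~17.9 h
print(f"2 s delay: {overhead_hours(2):.1f} h of overhead")  # ~7.2 h

# Cutting the delay from 5 s to 2 s should therefore save about:
print(f"expected savings: {overhead_hours(5) - overhead_hours(2):.1f} h")  # ~10.8 h
# Observed: 27:05:03 -> 16:30:27, i.e. roughly 10.6 h faster, which agrees
# within the normal run-to-run variation of the test system.
```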
I also tried removing the delay completely and running the Smoke tests; that run took about 6.5 minutes.
Why do we impose that delay penalty on the Mac and Linux runs? I do not see that setting in the Windows files, but maybe it lives in a different location there.