I got there in plenty of time, but the test never happened. Seems that someone else on my employer's side did not show. However I learned what was going on, and actually was able to contribute something.
Seems this is not just some gonzo development work. My management had promised that a big data load would get done this weekend. Well, it didn't (it ran way too slowly), and the blamestorming has started. Is the problem with us or with the vendor (IBM!)? If with us, where in our system? I was asked about some errors in a log. The answer was obvious to me and I said so: buggy SQL code on the IBM side. We had a long conference call with three of the IBM folks, one of whom was in a car with his wife and small child. They did not dispute my argument that this problem was on their side. They responded like true veterans: they passed the blame around. Seems the SQL is not directly coded, but is passed through two different code generators, so the problem could be in any of three different areas on their side.
It is not clear that these errors are the actual source of the problem, though they certainly get me off the hook for now. The real cause of the delay may not be these errors, but something else. Perhaps a network error--the on-call guy from our network area was the no-show. He may have a difficult morning tomorrow.
The other thing the IBM people suggested was that our antivirus software was the problem, and would we please be so kind as to turn it off. No, thank you, said the managers on our side, and I am with them on this. These days that is not a reasonable request. Our AV software is perfectly standard and IBM should be able to cope.
Tomorrow could be interesting. I have done my best to cover my a** with kevlar, so I hope I can be reasonably detached.