Date: Thu, 05 Mar 92 19:30:01 EST
From: Steve Bellovin <email@example.com>
Subject: Re: A RISK architecture? (DEC's Alpha)
Brian Randell describes how the Alpha uses imprecise arithmetic traps, and
speculates that it's a risk to program correctness. With all due respect, I
disagree. Based on my experience with imprecise interrupts on the 360/91, lo
these many years ago, I would classify imprecise interrupts as more of a hassle
when localizing faults, rather than any risk to the program's correct behavior.
That is, the interrupts -- which typically signified erroneous program behavior
-- still happened, and still caused the program to abort. But it took rather
more debugging effort to figure out which instruction caused the trap. Unless
one is relying on the interrupt handler to perform the appropriate fix-up -- a
technique that I regard as far more risky and non-portable than imprecise
interrupts -- correct programs should not behave any differently.
He also describes the barrier instruction as a ``sop to DEC's technical
conscience''. Not so. Its purpose is to help the programmer identify the
offending instruction. And compilers can (and did) generate such
instructions on appropriate boundaries. I recall vividly, 20+ years later,
finding that a zero-divide fault took place 11 instructions after the
offending divide, and after the divisor register had been overwritten with a
non-zero value. But it had to be that instruction; there were only two divide
instructions in the entire program, and the other referenced a still-intact,
non-zero divisor.
If there is a danger here, it's from the hardware design itself. Pipelined
architectures imply parallelism, of course, and that's harder to get right.
But the hardware designers seem to do a better job with such things than do
the software designers...