The Root Of All Evil, Part One

People often quote Knuth's famous statement that premature optimization is the root of all evil. Boy, has that ever been the theme of my life these last few weeks as I've been banging my head against the compiler trying to figure out how we're going to make it work with LINQ without breaking backwards compatibility. Some seemingly harmless premature optimizations in the compiler are eating up precious time dealing with how to work around them.

Before I can explain what exactly the premature optimization is that's driving me nuts, a short quiz. Suppose you've got an enumerated type E with value X.

E e1 = E.X;
E e2 = 0 | E.X;
E e3 = E.X | 0;
E e4 = E.X | 0 | 0;
E e5 = 0 | E.X | 0;
E e6 = 0 | 0 | E.X;

All of these are legal according to the compiler. One of these is illegal according to the specification. Before you read on, take a guess as to which one, and why.

Now, producing sensible behaviour when we should be producing an error is the least bad of all possible spec violations, but still, it's irksome. It means that we can never comply with the spec: if we suddenly turned this into an error, we might break an existing program.

Which of the above should be illegal? The first one is obviously good. The second and third are legal because section 13.1.3 says that literal zero is implicitly convertible to any enum, and section 14.10.2 says that you can binary-or two enumerated type values. We'll therefore convert the zero to the enumerated type and use the enum binary-or operator.

Since binary-or groups on the left, the fourth and fifth are legal. The left-hand part of the expression is evaluated first, and the right-hand zero is converted to the enumerated type.

The sixth case is illegal according to the spec, because this is the same as (0|0)|E.X – (0|0) is not a literal zero, it's a compile-time-constant expression that happens to equal zero, and there's no implicit conversion from that thing to any enumerated type!
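Spelling the cases out, then (the comments record the specification's verdicts; as described above, the shipped compiler actually accepts all six):

E e1 = E.X;          // legal: direct assignment
E e2 = 0 | E.X;      // legal: literal zero converts to E, then E | E
E e3 = E.X | 0;      // legal: same conversion, operands swapped
E e4 = E.X | 0 | 0;  // legal: groups as (E.X | 0) | 0
E e5 = 0 | E.X | 0;  // legal: groups as (0 | E.X) | 0
E e6 = 0 | 0 | E.X;  // illegal: groups as (0 | 0) | E.X, and (0 | 0) is a
                     // constant expression equal to zero, not a literal zero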

And there's the premature optimization. Once we have a complete parse tree, we walk through it making sure that all the types work out. Unfortunately, the initial type binding pass, bizarrely enough, also does arithmetic optimizations. It detects the 0|something and aggressively replaces it with just something, so as far as the compiler is concerned, the sixth case is the same as the second, which is legal. Argh!

This kind of optimization is way, way premature. It's the kind of thing you want to do very late in the game, possibly even during the code generation stage. Someone tried to do too much in one pass – the type annotation stage is no place for arithmetic optimizations, particularly when they screw up your type analysis in subtle ways like this!
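To illustrate the kind of eager rewrite being described, here is a sketch of such a folding pass. The AST types and names are invented purely for illustration; the real compiler internals look nothing like this.

// Hypothetical AST node types, invented for illustration only.
enum Op { BitOr }
abstract class Expr { }
class Literal : Expr { public int Value; }
class Binary : Expr { public Op Op; public Expr Left, Right; }

class Folder
{
    // The eager rewrite: during the initial binding pass the tree for
    // 0 | x is replaced by just x before the type rules run, so the
    // distinction between a literal zero and a constant expression that
    // merely equals zero is destroyed, and (0 | 0) | E.X becomes
    // indistinguishable from 0 | E.X.
    public static Expr Fold(Expr e)
    {
        Binary b = e as Binary;
        if (b == null || b.Op != Op.BitOr)
            return e;
        if (IsZeroLiteral(b.Left))
            return Fold(b.Right);  // 0 | x  =>  x  (the zero node is gone)
        if (IsZeroLiteral(b.Right))
            return Fold(b.Left);   // x | 0  =>  x
        return e;
    }

    static bool IsZeroLiteral(Expr e)
    {
        Literal lit = e as Literal;
        return lit != null && lit.Value == 0;
    }
}

Running a pass like this before type checking is exactly the problem: once the zero node is discarded, the type checker can no longer distinguish the legal cases from the illegal one.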

Next time I'll talk about how these same premature optimizations screw up definite assignment checking. There is a closely related situation in which the compiler treats a variable as definitely assigned, and it IS definitely assigned, but the definite assignment specification requires us to produce an error. Bonus points to anyone who can suss it out from that vague description!

Comments

  • Anonymous
    March 28, 2006
    The comment has been removed

  • Anonymous
    March 28, 2006
    Personally, I think you should change the spec to match the existing behavior here. Constant folding should have absolutely no bearing on whether the program matches the spec or not, and if the spec says it does, the spec should be changed.

  • Anonymous
    March 28, 2006
    The simplest example I could think of:

    int a;
    int b = (a * 0) + 1; // Should be an error: (a * 0) is not a constant expression and a is not definitely assigned

    P.S. Welcome back!

  • Anonymous
    March 28, 2006
    Wes: send me a repro of your bug, I'll have a look at it.

    Anthony:  The spec specifies precisely what constant foldings must be performed so that folded constants can be used for switch labels, constant fields, etc.  So it's a non sequitur to say that "constant folding should have no bearing on whether the program matches the spec".  The spec is all about constant folding.

    The problem with changing the spec to match the behaviour is that the existing behaviour is a complicated and weird interaction between an optimizer (which is NOT specified by the specification) and the constant folding code (which is specified.)  For example, (0-0) | E.X  also fails to produce an error, but (7-7)|E.X does correctly produce an error.  Basically we have a few choices:

    1) Do nothing -- document the spec violations internally and ensure that we maintain them
    2) Rewrite the spec so that it is a description of the current complex and bizarre behaviour
    3) Extend the spec even further, so that, say, any integer constant could be implicitly converted to enum.  

    Option (2) is gross, option (3) would make unfortunate expressions like (6 + 8 - 2 * 7) | E.X legal, so (1) is what we're stuck with.  

    Hence, root of all evil.  We wouldn't have this evil choice to make if someone had made the optimization pass the LAST thing that runs rather than the FIRST.
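    Written out as code, the inconsistency looks like this (E is any enumerated type; the second initializer is the one the compiler correctly rejects, so it must stay commented out for the file to compile):

    E a = (0 - 0) | E.X;    // accepted: the optimizer rewrites (0 - 0) into a
                            // literal zero before type checking runs, though
                            // the spec says this should be an error
    // E b = (7 - 7) | E.X; // rejected, as the spec requires: (7 - 7) is a
                            // constant expression, not a literal zero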


  • Anonymous
    March 28, 2006
    Carlos: This is in fact a bug that you've found, so, good work.  However, it's not the bug I asked for.  I asked for a bug where the compiler treats a var as definitely assigned, it IS definitely assigned, but the definite assignment spec says that it should be an error.

    Your bug is one where the compiler treats a var as definitely assigned, it is NOT definitely assigned, and it should be an error.

    But thanks for pointing this out, because this means I'll need to ensure that this case is maintained as well!

  • Anonymous
    March 28, 2006
    Wes: the compiler is smart enough to treat if(true) as something which always executes the consequence, so s will be definitely assigned and this is not in violation of the spec.

  • Anonymous
    March 28, 2006
    How about:

    int y = 2;
    int x;
    if ((y * 0) == 0)
        x = 1;
    int z = x + 1; // Should be an error: (y * 0) is not a constant expression so the compiler shouldn't consider the "if" as always taken.

  • Anonymous
    March 28, 2006
    Ding!  Give that man a cigar!

  • Anonymous
    March 28, 2006
    Stupid question, but...why not break non-compliant programs? They're non-compliant. I admit that it's cold comfort to the broken people when you tell them that they were wrong, but otherwise it rather degrades the relevance of a spec to say that, well, we're never going to comply with it.

  • Anonymous
    March 28, 2006
    The comment has been removed

  • Anonymous
    March 28, 2006
    The comment has been removed

  • Anonymous
    March 28, 2006
    In the past VS has always seemed to flag discovered violations as warnings for a version, then promote them to errors, giving at least some lead time. Heck, VS2005 broke badly a number of apps that relied too much on VS6 and VS2003's bad scoping rules. (Alternately, a special subparser designed to scan imported code for specific patterns like this could correct them, since you'll be documenting them anyway. It won't work for everything, but it's something. I know you already have a lot of code in upgrading project files, but as far as I know, not for actual code.)

  • Anonymous
    March 28, 2006
    Very simple:

    a) Some validation thing, possibly upon first import of "legacy" code will detect situations where there is possibility for a break to occur and warn about them

    b) a compiler switch could be used or a define added that specifies the compiler to use non-conforming, legacy behaviour on these kind of special cases. A validation/import tool could then automatically add this to old projects/files.

  • Anonymous
    March 28, 2006
    Indeed, one of the things we are researching for the next release of the compiler is what breaking changes are appropriate for this kind of "switched compiler" treatment.   Again, though, just having a switch is not a magic panacea.  Additional switches greatly complicate the test matrix, which means that we have less confidence that we're giving you a quality product.

    Adding warnings is a good technique.  I'll soon be talking in this space about some of the warnings I've added to the compiler in the last few weeks.

  • Anonymous
    March 29, 2006
    "IS definitely assigned, but the definite assignment specification requires us to produce an error."
    E e6 = 0 | 0 | E.X;

  • Anonymous
    March 30, 2006
    Eric.

    Your answer is roughly what I expected you to say about maintaining application compatibility, and roughly what I'd have said were I in your job rather than my job. However, since I don't work for Microsoft, I can't code to what your programs do: I have to code to the spec. So, when I code to the spec, my programs don't work. This makes the point of a spec a touch nebulous!

    As a side note, why does "breaking the app causes customer pain and updates" get to be a good argument when "breaking the spec by changing it, causing customer pain and updates" not get to be?

    I suspect the answer here is: changing the spec doesn't break anyone's program, because no-one's program can be spec-compliant, because spec-compliant programs don't work. (I appreciate that your example above is a massive edge-case, but you're viewing this, I believe, as a specific example of a generic problem and discussing it as such.) However, when us poor external developers clamour for documentation (which MSFT is pretty darn good at providing) and then use it, only to find that it doesn't work and the truth is embedded in your code rather than the spec, it causes people to throw books at walls.

    Naturally, "never make any mistakes or write code with bugs in again" is not an answer, even for you :) But it would be nice to see programs move toward the specs sometimes, rather than the specs move toward the programs, meaning that I can work from the docs rather than have to check interoperability by constantly testing and upgrading to the latest releases.

  • Anonymous
    March 30, 2006
    Clearly there is a spectrum here, and there are definitely times when we do make breaking changes and implement the spec.  For example, I earlier wrote about a breaking change from 1.1 to 2.0 where we fixed a bug in the implementation of enumeration subtraction.  That ended up breaking an important customer who was most put out, but understood when we explained what had happened.

    As for the question of whether this is a specific example of a generic issue -- yes it is, but the generic issue I was intending to address was the pain of premature optimization.  We haven't yet decided whether we're going to fix this one or not, but either way, I will have to write wodges and wodges of code in order to make LINQ work around this premature optimization.

  • Anonymous
    March 30, 2006
    Good call, and I do completely and utterly agree with you about premature optimisation :-)

    I must have missed the earlier note about a breaking change. Thanks for the pointer!

  • Anonymous
    April 04, 2006
    Eric, you know more than I do about compilers, being a pro at this and all while I'm a mere undergrad, but I've been writing a compiler for a small C-like language with type inference for my thesis, and I couldn't agree with you more. In fact, I'd take it farther and say that prematurely doing anything is the root of all evil - put off doing things as long as you can!

  • Anonymous
    April 06, 2006
    What was the context for "0 | 0 | E.X" appearing in code?  The first two are actual zero numeric literals?  It seems like a lot of extra typing for nothing.

  • Anonymous
    April 06, 2006
    The context was a table mapping one kind of flag to another.  A particular set of flags had five "flavours", and was mapped to a different set with four flavours, so to make it clear which mapped to which:

    PMtoAF[( int )( PM.F | PM.CF |   0   |   0   |   0   )] =  AF.CI |   0   |   0   | AF.NP ;
    PMtoAF[( int )(   0  | PM.CF |   0   | PM.GF |   0   )] =  AF.CI |   0   | AF.IO |   0   ;
    PMtoAF[( int )(   0  | PM.CF |   0   |   0   |   0   )] =  AF.CI |   0   | AF.IO | AF.NP ;

    etc.  One of these happened to have 0 | 0 | enumtype

  • Anonymous
    April 06, 2006
    Thanks for clarifying...  My initial reaction to the post was "forget back-compat, that's a dumb thing to type", but in context it's pretty reasonable.  I guess one lesson is that it's hard to spec out a syntax in the abstract - you have to decide on the legality of various constructs in a vacuum.  It's hard to argue that 0 | 0 | E.x should be illegal with such a reasonable code sample (and when such very similar things are legal).  And part of it is aesthetic (not wanting warts), which makes it even harder.

  • Anonymous
    April 18, 2007
    Reader Larry Lard asks a follow-up question regarding the subject of Monday’s blog entry. Why is it

  • Anonymous
    December 14, 2007
    The comment has been removed

  • Anonymous
    December 14, 2007
    > how does it make sense to just do nothing?

    Ultimately we decided to take option 3. However, "just do nothing" is often the correct course of action when faced with a bug. I have fixed a great many specification violations in the last twelve years, and rolled back a number of those fixes when it turned out that fixing the violation broke an important customer's code.

    We constantly strive to be both consistent and correct, but it is not always possible to do both in an environment where well-meaning humans have produced erroneous code. When faced with a situation in which consistency and correctness are opposites, my team acts like professional engineers. We evaluate the specific situation, consult with the affected customers when possible, and make the technical calls that we believe best serve the interests of our customers and therefore our business.

    If you honestly believe that we do not take the costs you mention into account, then frankly you're not reading this blog very carefully. And frankly, there is probably no developer on this planet whose productivity is more deeply impacted by mismatches between the spec and the implementation than me. I understand that cost at a far more visceral level than the vast majority of users will ever need to, and thank goodness for that.

    If my pulling the curtain back and showing what factors go into these hard decisions decreases your trust level, that's unfortunate. I write this blog because I firmly believe that a lack of transparency has bred mistrust in the past. I'm seeking to remedy it in part by being brutally honest about our mistakes and the difficult situations which are their consequences.

    I am deeply committed to engineering a solid product that advances the state of the industry and adds value to the economy; being standards compliant is an important part of that commitment, but it is far, far from the only part. Standards are a means to an end -- a tool. They're not an end in themselves. I am committed to making the "standard tool" a useful tool, but I am more committed to making the compiler itself a useful tool, since that's what actually gets the customer's job done.

  • Anonymous
    December 14, 2007
    The comment has been removed

  • Anonymous
    December 17, 2007
    The comment has been removed