"Don't use magic numbers," is a good rule for programming. But like any rule, you shouldn't blindly apply it. We know what happens when people do, however: we get constants that might as well be magic numbers.

Still, there are sometimes novel versions of this old song. Shmuel F sends us this one in C:

unsigned int ReadMemory(unsigned int address, char size)
{
    switch (size)
    {
        case BIT3: // read byte-size
        case BIT5: // read int-size
    }
}

The cases of the switch statement are a clear threat: the constants here are just magic numbers with names. But the developer responsible went a little above and beyond in defining them:

#define ZERO 0
#define ONE 1
#define TWO 2
#define THREE 3
#define FOUR 4
#define FIVE 5
#define BIT0 (1 << ZERO)
#define BIT1 (1 << ONE)
#define BIT2 (1 << TWO)
#define BIT3 (1 << THREE)
#define BIT4 (1 << FOUR)
#define BIT5 (1 << FIVE)
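For the record, those macros bottom out at plain numbers: BIT3 expands to (1 << THREE), which is (1 << 3), which is 8, and BIT5 works out to 32. So the switch is really matching size values of 8 and 32, hidden behind two layers of indirection.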

Shmuel writes:

Seeing ZERO and ONE in the code is annoying, but this? This is just picking a fight.

All of this leaves us with one more question: why on Earth is size a bitmask?
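If size really is meant to be a width in bits, a saner version skips the numeric dress-up entirely. Here's a minimal sketch, assuming a memory-mapped read on a 32-bit target; the AccessWidth enum and its names are my invention, not part of the original code:

#include <stdint.h>

/* Hypothetical width names; the original gives no hint of its intent
   beyond the "byte-size" and "int-size" comments. */
enum AccessWidth {
    WIDTH_BYTE = 8,   /* read a single byte     */
    WIDTH_INT  = 32   /* read a 32-bit integer  */
};

unsigned int ReadMemory(unsigned int address, enum AccessWidth width)
{
    switch (width) {
    case WIDTH_BYTE:
        return *(volatile uint8_t *)(uintptr_t)address;
    case WIDTH_INT:
        return *(volatile uint32_t *)(uintptr_t)address;
    default:
        return 0; /* unknown width; real code should signal an error */
    }
}

The enum at least puts the intent in the type signature, instead of asking the caller to remember which power of two means "byte".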
