Making Wrong Code Be Wrong
I was recently browsing online when I came across a link to a Joel Spolsky blog post. In it, he describes how a form of Hungarian notation can be used to make wrong code look wrong. Along the way, he recounts the interesting history of how Hungarian notation came to be, and how its common use today is not what was originally intended.

Hungarian notation, he explains, was created to convey application-level information about a variable. It could be used, for example, to distinguish between two different kinds of dimensions that might both be typed as ints. He termed this Apps Hungarian. While the compiler could not tell the two apart, the human eye could with a cursory glance.

Later, the notation was instead conflated with the type that the compiler saw. An int named iWidth carried redundant information that could actually get in the way if the type ever needed to change. He termed this Systems Hungarian.

He gives several examples in the post. Notably, he describes how Apps Hungarian...
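To make the distinction concrete, here is a minimal C sketch of what Apps Hungarian buys you. The cpx ("count of pixels") and cch ("count of characters") prefixes are my own illustrative choices, not examples taken from the post:

```c
#include <stdio.h>

/* Apps Hungarian: prefixes encode application-level meaning, not the
 * compiler's type. Both variables below are plain ints to the compiler,
 * but the prefixes distinguish pixel counts from character counts.
 * (These prefixes are hypothetical, chosen for illustration.) */
int cpxWindowWidth = 800;  /* width of the window, in pixels     */
int cchLineWidth   = 80;   /* width of a text line, in characters */

int main(void) {
    /* Looks right: both sides of the assignment share the cpx prefix. */
    int cpxMargin = cpxWindowWidth / 10;

    /* Looks wrong at a glance: a pixel count is computed from a
     * character count. The compiler accepts it, since both are ints,
     * but the mismatched prefixes flag the bug to the human eye. */
    int cpxBorder = cchLineWidth / 2;

    printf("margin=%d border=%d\n", cpxMargin, cpxBorder);
    return 0;
}
```

By contrast, Systems Hungarian would name the first variable something like iWindowWidth, where the i prefix merely repeats what the declaration already says and tells you nothing the compiler doesn't already know.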