About me

Michael L Perry, Principal Consultant, Improving Enterprises


Q.E.D. Code Podcast, Episode 1: Predicates

In the inaugural episode of the Q.E.D. Code Podcast, we introduce the concept of using predicate calculus to prove the correctness of software. Listen to the show at http://qedcode.libsyn.com.


A predicate is a statement that has a truth value. It is either true or false. Predicates can be simple statements, or they can be built up using operators. The “implies” operator is particularly important to software, because it allows us to tie the state of the system to the line of code being executed.
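As a minimal sketch (Python is my choice here, not the podcast's), “p implies q” can be defined as “not p or q,” and reaching a guarded line of code tells us the guarding predicate holds:

```python
# "p implies q" is equivalent to "not p or q": the implication is only
# false when p is true and q is false.
def implies(p: bool, q: bool) -> bool:
    return (not p) or q

# Print the full truth table of the implies operator.
for p in (True, False):
    for q in (True, False):
        print(f"{p!s:>5} implies {q!s:>5} : {implies(p, q)}")

# Tying system state to a line of code: if execution reaches the body
# of this if-statement, the predicate "x > 0" is known to be true there.
x = 7
if x > 0:
    assert implies(x > 0, x != 0)  # holds on this line
```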

The application of predicates to proving software correctness is covered in detail in Bertrand Meyer’s book Object-Oriented Software Construction. He defines a system called Design by Contract.

We follow predicates through lines of code until we reach a method call. There, it is up to us to prove that it is correct to call the method at that time. We do so by validating the preconditions of the method. These preconditions become predicates that the method can assume on the first line of its body and carry through its execution.

When a method finishes, it asserts some postconditions. These become predicates that we, as the caller, can use to establish the preconditions of later method calls.
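A hedged sketch of this flow, using plain `assert` statements to stand in for contracts (the function and its contract here are my own illustration, not from the book):

```python
def sqrt_floor(n: int) -> int:
    # Precondition: the caller must establish n >= 0 before calling.
    assert n >= 0, "precondition violated: n must be non-negative"
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    # Postcondition: r is the integer square root of n.
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r

# The caller may now rely on the postcondition as a known predicate,
# and use it to establish preconditions of later calls.
root = sqrt_floor(10)
assert root * root <= 10 < (root + 1) * (root + 1)
```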


Algebra teaches us methods of rewriting equations to derive a solution to a problem. Refactoring is a similar process done on code.

Martin Fowler formalized the process in his book Refactoring: Improving the Design of Existing Code. Refactorings are modifications to code that do not change its behavior. He provides systematic recipes for applying these changes, giving us confidence that we do not break the system along the way.
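For example, the Extract Variable refactoring removes a duplicated expression without changing what the code computes (this example is mine, not one of Fowler’s):

```python
# Before: the expression quantity * unit_price is duplicated.
def total_price_before(quantity, unit_price):
    if quantity * unit_price > 100:
        return quantity * unit_price * 0.9  # 10% discount over 100
    return quantity * unit_price

# After "Extract Variable": same behavior, clearer intent.
def total_price_after(quantity, unit_price):
    base_price = quantity * unit_price
    if base_price > 100:
        return base_price * 0.9  # 10% discount over 100
    return base_price

# A refactoring preserves behavior for every input.
assert total_price_before(5, 30) == total_price_after(5, 30)
assert total_price_before(2, 10) == total_price_after(2, 10)
```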

The opposite of a refactoring is a transformation. This is a change that modifies the behavior of the code. Robert C. Martin has organized them in the Transformation Priority Premise, which favors simpler changes over more complex ones. If we apply changes in this order, he reasons, we will make smaller changes and end up with simpler code.
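A minimal sketch of a transformation, in the test-driven style the premise assumes (the function and tests are hypothetical, not Martin’s own examples): an early, simple transformation replaces nothing with a constant, and a later one generalizes the constant into an expression, changing behavior each time.

```python
# Transformation 1 (nil -> constant): pass the first test with a constant.
def add_v1(a, b):
    return 0  # enough to pass add_v1(0, 0) == 0

# Transformation 2 (constant -> expression): generalize to pass more tests.
# Unlike a refactoring, this changes the function's behavior.
def add_v2(a, b):
    return a + b

assert add_v1(0, 0) == 0
assert add_v2(0, 0) == 0
assert add_v2(2, 3) == 5
```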

John von Neumann

Born in Budapest, Hungary, in 1903, John von Neumann was a mathematical prodigy. He contributed to a broad range of subjects, including algebra, set theory, and economics. He even defined whole new lines of study, most significantly game theory.

Game theory tells us how players will respond in non-cooperative scenarios. The simplest of scenarios is a zero-sum game of perfect information. These are games like tic-tac-toe, Connect Four, and chess, where all of the information is available to all players at all times. And for one player to win, the other must lose.
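In a zero-sum game of perfect information, every position can be decided by backward induction. A hedged sketch, using a tiny subtraction game of my own choosing rather than chess (each player removes 1 or 2 stones from a pile; whoever takes the last stone wins):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def current_player_wins(stones: int) -> bool:
    """Backward induction: the player to move wins if some legal move
    leaves the opponent in a losing position."""
    if stones == 0:
        return False  # no move: the previous player took the last stone
    return any(not current_player_wins(stones - take)
               for take in (1, 2) if take <= stones)

# In this game, the losing positions turn out to be the multiples of 3.
assert current_player_wins(4) is True
assert current_player_wins(21) is False
```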

But his theory extends beyond the chess board. Games can have non-zero sums and imperfect information. These types of games arise in business negotiations and in wartime. John von Neumann applied game theory to some very difficult problems in these areas, and even coined the phrase Mutually Assured Destruction (MAD). An example of the application of game theory to wartime strategy can be found in the 1983 movie WarGames.

His contributions to the field of computing were no less revolutionary. He brought the idea of stored programs to the designs of the ENIAC and EDVAC computers. Before this, computers were constructed to solve a specific problem. They had their instructions hard-wired into their processing units. But von Neumann designed a machine code that could be stored in the computer’s memory. This allowed the computer to be reprogrammed to solve new problems.

Nearly all digital computers that we use today employ the von Neumann architecture.