Cursors seem like a great idea. A cursor, for my non-programming friends, is like running your finger through a book as you’re reading it. It helps you keep your place and navigate a set of data. Imagine you’re following a recipe. It is really nice to keep your place in the recipe while you add some eggs to the mixer, for example. It makes it easy to walk step by step through whatever you’re reading. Of course, most recipes consist of two sets of data to navigate: a list of ingredients and the set of steps required to combine those ingredients into your delicious treat. Have you ever gotten to the end of a recipe and discovered you skipped an ingredient? That’s only one of the ways cursors can mess with your lovingly crafted program.

Cursors become truly evil the same way so many programming metaphors become evil: by allowing you to write where the cursor is. Let’s reverse the metaphor for a second. Suppose you have a recipe that you’re changing based on how the cook actually performs the steps. Using your finger, you move step by step as the cook performs the existing steps in the recipe. When the cook does something different from the step you’re expecting, you make some room and insert the additional things he does. Sounds as easy as using a word processor, doesn’t it?

The problem is that few computer programs are so simple. Suppose now the cook just starts doing the steps in a different order. Tracking the steps in the recipe is now a much more complex task. Now think of a computer program doing something similar. Even a reasonably good programmer makes mistakes. They’re called bugs, and every single piece of software I have ever encountered has them. In the real world, let’s say you get distracted while the cook does something differently and you don’t notice. It doesn’t matter for this analogy whether you missed a step you expected or missed an entirely new one. Now you’re out of sync. There are only two possible outcomes: you get back in sync (in the real world, you can talk it out with the cook), or you experience a hopeless cascade of failures. “Did you ever add the cinnamon?”

The true evil of cursors shows up when you try to correct the problem, which requires discovering the one step that pushed you out of sync. A program that uses a cursor relies on the cursor to hold the state of the data, but the cursor relies on the rest of the program (the logic) to always agree with that state. Because there is no explicit connection between the cursor and the program, if their perceptions of reality start to diverge, it can be extremely difficult to discover where the divergence started (which, by the way, is the only way to fix a bug like this). A complex recipe has on the order of 50 steps. A computer program has millions. It’s not hard to see the potential impact. So the next time you use a cursor (or anything like it), keep these dangers in mind and program defensively against them. Here are some techniques I used:

  • Enable debugging (really, logging) of your cursor state. Whenever you change, move, insert, or delete, make sure there’s a way to verify that the cursor is at least in a valid state. Don’t leave that debugging on in production, though; better yet, make a system property or environment variable control whether it’s enabled. If you can track operations on your cursor, consider keeping an expected state handy in debug mode. Enable it at all times inside your IDE so it helps you recognize a problem during your build-test cycle. (The first sketch after this list shows one way to do this.)
  • Make sure each block of code that manipulates the cursor or the underlying data is consistent within itself. Unit testing is a great way to accomplish this in an automated fashion; I’m a fan of unit testing, especially when it can save literally days of debugging. (See the second sketch after this list.)
  • If you do end up with a synchronization issue, make creative use of your debugger. Conditional breakpoints, evaluating expressions at debug time, and (if your debugger has it) historical playback can be extremely valuable. I once set a breakpoint that fired when a known synchronization error occurred. In my case, the error only presented itself after a write was made, so this step alone told me only that the error had happened before I hit the breakpoint. It sure was a good start, though. Don’t be afraid of complex breakpoint conditions, either. You’re debugging, so use any horsepower you need; it beats stepping through the whole program line by line. (The last sketch after this list shows one way to build such a trap in code.)
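
To make the first technique concrete, here’s a minimal sketch in Java of what I mean by debug-gated cursor logging. The DebuggableCursor class, its operations, and the cursor.debug system property are all hypothetical (your cursor API will look different); the shadow list is the “expected state” kept only in debug mode, so a write made behind the cursor’s back shows up in the log.

```java
// A minimal sketch of debug-gated cursor logging, not a definitive design.
// Only Boolean.getBoolean and java.util.logging are standard Java here.
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Logger;

public final class DebuggableCursor {
    private static final Logger LOG = Logger.getLogger(DebuggableCursor.class.getName());
    // Enabled with -Dcursor.debug=true (an environment variable works the same way).
    private static final boolean DEBUG = Boolean.getBoolean("cursor.debug");

    private final List<String> data;   // the underlying data set
    private final List<String> shadow; // expected state, kept only in debug mode
    private int position = 0;

    public DebuggableCursor(List<String> data) {
        this.data = data;
        this.shadow = DEBUG ? new ArrayList<>(data) : null;
    }

    public int position() {
        return position;
    }

    public void moveTo(int index) {
        position = index;
        checkState("moveTo(" + index + ")");
    }

    public void insert(String value) {
        data.add(position, value);
        if (DEBUG) {
            shadow.add(position, value); // apply the same logical change to the shadow
        }
        position++; // stay positioned after the inserted element
        checkState("insert(" + value + ")");
    }

    // Log the operation and verify the cursor is at least in a valid state.
    private void checkState(String operation) {
        if (!DEBUG) {
            return; // no cost in production
        }
        if (position < 0 || position > data.size()) {
            LOG.severe(operation + " left the cursor out of bounds: " + position);
        } else if (!data.equals(shadow)) {
            // Someone changed the data behind the cursor's back.
            LOG.warning(operation + " diverged from the expected state: " + data);
        } else {
            LOG.fine(operation + " ok at position " + position);
        }
    }
}
```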
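
And here is roughly what a unit test for the sketch above might look like. I’m assuming JUnit 5 purely for illustration; any test framework works. The point is that the test exercises one block of cursor operations and asserts that the cursor and the underlying data still agree afterward.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.ArrayList;
import java.util.List;
import org.junit.jupiter.api.Test;

class CursorConsistencyTest {

    @Test
    void insertKeepsCursorAndDataInSync() {
        List<String> steps = new ArrayList<>(List.of("mix", "bake"));
        DebuggableCursor cursor = new DebuggableCursor(steps);

        cursor.moveTo(1);
        cursor.insert("add cinnamon"); // the step nobody should ever skip

        // The data and the cursor must agree after the write.
        assertEquals(List.of("mix", "add cinnamon", "bake"), steps);
        assertEquals(2, cursor.position());
    }
}
```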
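
Finally, if your debugger’s conditional breakpoints turn out to be too slow or too limited, one trick is a temporary “breakpoint guard” like the sketch below: a branch that only executes under the failing condition, with a plain breakpoint set inside it. The names here (position, expected, lastOperation) are illustrative, not from any real cursor API, and the guard is meant to be deleted once the bug is found.

```java
// A temporary breakpoint guard: call it after every write while hunting
// the bug, set a plain breakpoint on the println line, then delete it.
public final class BreakpointGuard {
    private BreakpointGuard() {}

    static void trapIfDiverged(int position, int expected, String lastOperation) {
        if (position != expected && "insert".equals(lastOperation)) {
            // This line executes only once the cursor has already diverged
            // following a write, which is exactly the condition described above.
            System.err.println("cursor diverged after " + lastOperation
                    + ": position=" + position + " expected=" + expected);
        }
    }
}
```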