Optimization is a powerful tool, but that doesn’t mean all code optimization is created equal. Knowing what to optimize in a software system is one part logic, one part analysis, and one part black magic. Here are seven things to keep in mind when you’re considering how best to optimize a project.
Optimizing software is a good thing, but it is not an unmitigated good thing.
If you optimize your software for the wrong things, or in the wrong way, optimization can run up costs, slow down production, and actually make the software sub-optimal for its purpose.
In software development (or, really, anything else), every benefit comes with costs. In general, optimized software takes longer to produce, since you spend time trying to make it work better. It is also harder to debug and maintain, since optimization tempts you to sacrifice readability. And speed isn't the only target: sometimes you're optimizing for memory, as in embedded systems, or for other scarce resources, as in a hand-held device. Effectively optimized software has more advantages than disadvantages, but if you do the optimization wrong, the opposite is true.
Here are some things to keep in mind as you prepare to make your application run faster.
What, Exactly, Are You Optimizing For?
You can go most wrong at the beginning of an optimization project if you don't decide what you're optimizing for. You need to start out with a clear understanding of what you're trying to accomplish and how the various optimizations relate to those goals. The goal needs to be stated clearly and simply, simple enough that the least tech-savvy department manager can understand and articulate it, and you need to stick to it throughout the process.
Now, obviously, change is a constant in software development. You may start out optimizing for one thing and then discover you need to optimize for something else instead. That’s fine, but make the changes in goals clearly and loudly. Make sure everyone understands that the goals have changed.
Be Careful What You Measure; You Will Get It
Choosing the right metrics is an important part of optimization. You’re going to be measuring your progress against those metrics. If the metric chosen is irrelevant, or just plain wrong, your optimization effort is going to be, well, sub-optimal.
Even the right metrics have to be applied with some discrimination. In some cases, the time the program spends in each part of the code is a useful metric, since it tells you where to devote the most effort. Just remember that the Unix/Linux kernel spends most of its time in the idle loop.
The trap here is choosing a metric because it is easy to measure, even though it doesn't really address the problem, rather than going for a more complex metric that comes closer to encapsulating your goals.
Optimize Where It Counts and Only Where It Counts
This is the key to effective optimization. Look for the areas that currently work against your goals (performance, resources, or whatever) and concentrate your efforts there.
One classic example is spending your time optimizing something like a database when the real performance killer is a slow Internet connection.
Don't get distracted by low-hanging fruit. It might be easy to pick, but that doesn't mean picking it is necessary or aligned with your goals. Just because something is obvious and easy to optimize doesn't mean it's worth the trouble.
Higher is Better than Lower
In general, the higher the level of optimization, the more effective it is. By that standard, the best optimization is a more efficient algorithm.
Sometimes this is glaringly obvious. In one case, in an organization with an, ah, "interesting" IT department, an important application's performance was unacceptable. After several months of working to speed it up and not getting much of anywhere, the entire IT staff was replaced. When the new guys took the time to examine the code from the top down, they discovered that the heart of the program was using a bubble sort on a table that had grown to hundreds of thousands of entries.
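To see why the algorithm dwarfs everything else, compare a bubble sort with Python's built-in sort on the same data. This is only a sketch (the table size here is far smaller than the one in the story), but the gap is already dramatic:

```python
import random
import timeit

def bubble_sort(items):
    """O(n^2): repeatedly swap adjacent out-of-order pairs."""
    data = list(items)
    n = len(data)
    for i in range(n):
        for j in range(n - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
    return data

table = [random.randrange(1_000_000) for _ in range(3_000)]

slow = timeit.timeit(lambda: bubble_sort(table), number=1)
fast = timeit.timeit(lambda: sorted(table), number=1)
print(f"bubble sort: {slow:.4f}s, built-in sort: {fast:.4f}s")
```

No amount of low-level tweaking of the inner loop closes a gap like that; only replacing the algorithm does.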
You won't always get a lucky break like that, so take the time to look at the basic architecture. Examine the overall structure of the program to see where you can improve it.
Still, high-level optimization isn’t a silver bullet. A lot of basic techniques, such as moving everything you can outside loops, produce measurable gains that add up incrementally. But in general, it takes a lot more low-level optimizations to produce the dramatic effect of a higher-level optimization.
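Hoisting work out of loops is a typical example of those incremental low-level wins. A minimal sketch (the function names and the decibel conversion are illustrative, not from any particular codebase):

```python
def scale_naive(values, gain_db):
    # The decibel-to-linear conversion is recomputed on every iteration,
    # even though it never changes inside the loop.
    out = []
    for v in values:
        out.append(v * 10 ** (gain_db / 20))
    return out

def scale_hoisted(values, gain_db):
    # The gain doesn't depend on the loop variable, so compute it once.
    gain = 10 ** (gain_db / 20)
    return [v * gain for v in values]
```

Each such change buys a little; a better algorithm buys a lot.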
There's another important reason to start with high-level optimization and work down. Optimization, especially at higher levels, eliminates chunks of code. There's no point in optimizing those chunks only to eliminate them later.
Don’t Optimize Prematurely
There's a real temptation to dive right into optimization while you're still coding. In general, this is not a good idea.
For one thing, it’s hard to be sure that what you’re optimizing is worth the effort. It may seem intuitively obvious, but intuition is a tricky guide, especially in the early stages of a project. For another, optimized code is usually harder to read and to work on.
For example, in some cases you can improve performance by replacing a multiplication with a series of shift operations. Even where this is effective (and it isn't always) it produces very confusing code.
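Here is what that looks like in practice. The two functions below compute the same thing, but the shift version obscures the arithmetic (and on most modern compilers and runtimes, the optimizer makes this transformation for you anyway):

```python
def scaled_offset(x):
    # Readable: the intent (multiply by 10) is obvious.
    return x * 10

def scaled_offset_shifty(x):
    # "Optimized": x*10 rewritten as x*8 + x*2 using shifts.
    return (x << 3) + (x << 1)
```

The second version is the kind of code that makes maintenance programmers mutter; save tricks like it for the rare spot where measurement proves it matters.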
It’s better to strive for a clean implementation at the beginning and then go back and do the major optimization as a separate step. Get it working right, then optimize.
Of course this is the counsel of perfection. Given the realities of the software development process, sometimes you have to do at least some optimization as you go. If you have to, you have to. But use restraint, and comment the heck out of everything.
Depend On Performance Analysis, Not Intuition
You think you know where the system needs tweaking, but especially in complex software systems, intuition takes second place to data in deciding what to work on. Do a solid performance analysis of the code before you roll up your sleeves and dive in. Otherwise, you might find that you have made the such-and-so routine run incredibly fast – which would be far more impressive except that it’s hardly ever called.
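In Python, the standard-library profiler makes that analysis cheap. A minimal sketch, with two hypothetical stand-in routines, shows how the data contradicts the temptation:

```python
import cProfile
import io
import pstats

def such_and_so_routine():
    # The routine you were tempted to hand-tune. Called once.
    return sum(i * i for i in range(100_000))

def hot_loop():
    # Where the program actually spends its time.
    total = 0
    for _ in range(200):
        total += sum(range(10_000))
    return total

def main():
    such_and_so_routine()
    hot_loop()

profiler = cProfile.Profile()
profiler.runcall(main)
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

The cumulative-time column tells you where to roll up your sleeves, whatever your intuition said.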
It’s important to be guided by the facts as revealed in the analysis, and not depend too much on rules of thumb or general intuition. Amdahl’s Law, for instance, says that in general you should concentrate on the parts of the program where code spends the most time executing. That’s very good general advice, but it may not be appropriate in your situation. This is especially true if your optimization goal is not speed but, for instance, local storage requirements on a mobile device.
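Amdahl's Law makes the point quantitatively: if a fraction p of the runtime is in the part you speed up by a factor s, the overall speedup is 1 / ((1 - p) + p / s). A quick sketch:

```python
def amdahl_speedup(p, s):
    """Overall speedup when a fraction p of runtime is sped up by a factor s."""
    return 1.0 / ((1.0 - p) + p / s)

# Speeding up code that runs 80% of the time by 4x gives a 2.5x program...
print(amdahl_speedup(0.8, 4))
# ...while making code that runs 10% of the time nearly infinitely fast
# still yields barely 1.11x overall.
print(amdahl_speedup(0.1, 1e12))
```

The law explains why the hot path usually deserves the attention, but notice that it says nothing about memory or storage; if those are your goals, you need different arithmetic.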
Also, remember the Unix kernel in the idle loop.
One effective strategy for optimization is to prioritize what you’re going to work on according to the impact it has on your goals. Get the biggest things right before you start work on the things that are lesser roadblocks.
You Can’t Optimize For Everything
One of the most important rules of optimization is that you can't optimize for everything, or even for two things at once. Improving, say, speed may cost you in resource utilization, and making more efficient use of storage can easily slow things down. Consider what tradeoffs you are willing to make in other areas in order to achieve your primary goal.
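The classic speed-versus-memory tradeoff is a lookup table. A sketch (the table granularity is arbitrary, chosen for illustration):

```python
import math

# Fast but memory-hungry: precompute sine for every tenth of a degree.
# Costs 3,600 stored floats to make each call a constant-time lookup.
SINE_TABLE = [math.sin(math.radians(i / 10)) for i in range(3600)]

def sin_fast(tenths_of_degree):
    return SINE_TABLE[tenths_of_degree % 3600]

def sin_small(tenths_of_degree):
    # No table: recomputes on every call, trading speed for memory.
    return math.sin(math.radians(tenths_of_degree / 10))
```

Which version is "optimized" depends entirely on whether your primary goal is CPU time or storage; on a memory-starved device, the slow one wins.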
In optimization, selection is perhaps 90% of the game. It's worth taking the time to decide what you're doing and to do it right. Of course, that's also where the black magic comes in.
About the author:
RICK COOK has been working on and writing about computers since the days of ferrite cores and punched paper tape. He has written hundreds of articles about computers, IT, and technology. He is also the author of the “Wiz” series of fantasy novels full of bad computer jokes, available from Baen Books.