One of those neat things I’ve always wanted to know but had never had a chance to learn. The ‘net really does have everything. 😉
Version Control: The Big Picture
During my most recent software development project, which I’ve already blogged about extensively, I finally learned to use a version control system (VCS from here on) right.
To fully understand this, you have to know where I was coming from. I learned to program in the early-to-mid eighties, and never even heard of a VCS until about 1998, when I picked up a copy of Visual C++ 6.0. I couldn’t see much use for it: I was already archiving a zip-file copy of the source code for my software at every version release, and it seemed to me that a VCS did the exact same thing.
At that point, I viewed the “working copy” of the source code as sacred. I’d quickly learned that whenever I made a change to it, I had to be able to reverse it if it didn’t work out, so for small changes, old code was just commented out and left there — that, and the zipped-up backups, were my VCS. It made for some messy code, because I rarely went back to remove the old commented-out stuff, but it worked.
When I sold my earlier software company in 2003 and went to work for the new company for a while, I was shocked at just how casually developers at the new company treated source code. They made changes to it willy-nilly, without leaving the old code intact! What if they screwed something up? The five-minute introduction to CVS that I got explained how to check in and check out source code, but nothing about the big picture of how or why to use it. To me, it was still just a glorified backup program.
Fast forward to a few weeks ago, when I started making some extreme source code changes to a large twelve-year-old project. That’s when I started really learning how to use version control.
The first lesson: the working copy is temporary. The sacred version of the source code is the one in the version-controlled repository, and if you’re using it properly, you can always fall back on it if your changes have introduced problems. You can also use it to work on multiple machines if necessary, with no problems; that’s the subject of a future compile-speed-related blog entry.
There’s a corollary to this: check in early, check in often. Make small and focused changes, get them working, and check them in as soon as they’re done. That way, if a change introduces a bug, you can easily locate the exact source-code alterations that caused it. And if all else fails, you can throw the change away and revert to the last checked-in copy without losing much valuable work. If I’d known and practiced this a few weeks ago, I could have saved several days of debugging when my huge check-ins later turned out to have problems.
And finally, if you have to make large, sweeping changes that must all be completed before you can see whether they’ll work, make a separate branch and work there until they’re finished. Then you can keep checking in the smaller changes on that branch as they’re completed, even though they haven’t been tested, and every check-in on the main branch is still known to contain a good working copy of the program. With a modern version-control system like Git, you can easily merge the branch into the main code once it’s done.
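For the concrete commands, here’s a minimal sketch of that workflow in Git; the branch and file names are invented for illustration:

```
# Make a separate branch for the sweeping changes
# (the branch and file names here are hypothetical).
git checkout -b big-refactor

# Make one small, focused change, get it working, and check it in right away.
git add parser.cpp
git commit -m "Refactor the tokenizer into its own class"

# If an uncommitted change goes bad, throw it away and fall back
# on the last checked-in copy.
git checkout -- parser.cpp

# ...repeat: small change, test, check in...

# Once the whole branch builds and works, merge it into the main code.
# ("master" is Git's default main-branch name; yours may differ.)
git checkout master
git merge big-refactor
```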
Now that I see the big picture, VCSs make a lot of sense. I’ll be using mine a lot more in the future, and quite differently.
“Make a Squirrel-Proof Bird Feeder from a Bottle and Candy Tin”
I’ve seen lots of “squirrel-proof” bird feeders before. In most cases, the squirrels managed to work around them and get the bird food anyway. But this one looks like it just might work. And it’s a lot less expensive than the others I’ve seen too. 🙂
1,000 Posts!
Without even noticing, I made Geek Drivel’s one-thousandth post yesterday.
My only stated goal in starting Geek Drivel was to give interested friends and family, and anyone who stumbled onto it, the questionable benefit of my experience. My unstated goal was to give myself an outlet for expression. I think I’ve achieved both, and I know I’ve reached a few other people during that time… or at least, I hope that a few of those page-hits I’ve been getting all these years weren’t just automated spambots. 😉
I considered shutting it down, as a successfully completed project, but I think there’s still more that I can contribute on occasion. And of course, I like to store interesting data on it for my own future reference. So I’ll keep it going, for now. Maybe I’ll revisit that decision after the next thousand posts. 😉
“Create Your Own Luck by Changing Your Perspective”
Doom, despair, and agony on me!
Deep dark depression, excessive misery!
If it weren’t for bad luck, I’d have no luck at all.
Doom, despair, and agony on me!
— Recurring skit from the TV show Hee Haw
This sounds like a great idea. But I have to wonder about it.
I know a woman, a friend of ours, with totally abysmal luck. She suffers minor — and very odd — disasters on a regular basis. And it’s not from anything she does — they’re just random events that could happen to anyone, but that happen to her with far more regularity than mere chance would allow for. On a related note, she loves to complain about her life, and these little visits from the misery fairy seem to give her plenty of fodder. It’s an open question whether the complaints come from the bad luck, or the bad luck comes from the complaints.
On the other hand, I know one person who could fall into a sewer truck and discover a gold nugget, and I can’t attribute that to any particular perspective on his part.
I also know people who technology hates — they get near a watch and it starts gaining time, and woe betide them if they try to use a computer. And others (like myself) who technology adores, and whose mere presence often fixes wonky machines.
It’s enough to make you seriously consider that there might be an element of truth to The Matrix, that the entire universe is nothing but a simulation that we’re all in. Which gives rise to some interesting philosophy, and might explain quite a bit, when you think about it.
Speedier Visual C++ Compiles
As I mentioned recently, I’ve been having some issues with the speed of Visual C++ compiles. Some adjustments to my VMware Fusion settings reduced the time it takes for a complete rebuild of my project from roughly an hour and five minutes to about 43 minutes, but I knew there was still room for improvement.
Several of the sub-projects had monolithic header files that every source code file was including; I broke these up and set each source code file to include only the ones it was actually using. That helped some, because a change to one header no longer forced source files that didn’t use it to recompile, but it wasn’t enough.
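As a hypothetical before-and-after (the header names are invented), the change amounted to this:

```cpp
// Before: every source file pulled in one monolithic header, so a change
// to any header inside it forced nearly everything to recompile.
#include "allheaders.h"   // wraps widgets.h, network.h, database.h, ...

// After: each source file includes only what it actually uses, so a
// change to network.h no longer touches files that never needed it.
#include "widgets.h"
```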
Several of the sub-projects shared many source code files and the same compiler settings. The shared files were being compiled separately for each sub-project, which was inefficient. I gathered all the common source code files into several static libraries, which eliminated the redundant compiles. Again, it helped, but not enough.
Finally, we get to the subject of precompiled headers.
I never had much use for precompiled headers. They never seemed to do much to reduce my compile times… in fact, they seemed to increase them, which is why I had them turned off in all of these projects. But then I found a page that explained the proper use of them (hint: MSVC’s default setup doesn’t use them right).
In a nutshell:
- Create a single header file (I’ll call it `precompiled.h` here) which `#include`s only those header files that never (or almost never) change: `windows.h`, `stdio.h`, the STL library, the Boost library, etcetera. Add it to your project.
- Create a source code file (I’ll call it `precompiled.cpp`, to go along with `precompiled.h`) that contains nothing but one line, `#include "precompiled.h"`. Add it to your project too.
- Add `#include "precompiled.h"` as the very first non-comment line in all source code files in the project. This is vital, because MSVC will ignore anything before it.
- Set the entire project to use (not create!) precompiled headers, and set the `precompiled.cpp` file to create them.
The result: the compiles (other than `precompiled.cpp` itself, which seldom needs recompiling) are now ludicrously fast, at least ten times as fast as they were without precompiled headers! I assume most of that speed increase comes from the fact that I use a lot of stuff from the Boost library, but precompiled headers should help a lot even if you’re only using the raw Windows API functions (`windows.h` is huge, and pulls in a bunch of other header files too).
So compiles are now blazingly fast. Not as fast as I’d get on a good desktop machine (or even this one running on bare metal instead of virtualized), but much, much more tolerable.
My productivity on that project has soared. With those three changes combined, a compile usually takes less than a minute now, so I can try over fifty times more compiles in a day. I find myself a lot more willing to experiment with the project too, now that I know I’m not going to have to wait so long for the results. All in all, it was well worth the effort.
“Pack a Gun to Protect Valuables from Airline Theft or Loss”
I had trouble believing this headline when I first read it, but when you read the entire article, it makes a lot of sense.
Crazy?
This is from yesterday’s Dilbert Blog entry:
A Muslim, a Christian, and a crazy guy walk into a room. The one thing you can know for sure is that at least two out of three of them organize their lives around things that aren’t real. And that’s the best case scenario. Atheists would say all three have some explaining to do. And atheists are the minority, which is the very definition of abnormal.
Hm… is computer software “real”? 😉
“Watching the Clock Can Be Just As Distracting As the Web”
I’d tend to agree with this. I have an old battery-powered analog office clock, not so that I can see the time (that’s what I have a watch for), but mostly so that I can listen to it tick while I work… I find it soothing.
But the main reason I wanted to write about that post was to draw attention to the completely awesome picture at the beginning of it. I want a clock like that! 😉