If you're anything like me in your programming proclivities, there's a kind of singular impatience that leaps into your mind around the subject of documentation. On the consuming side, whether it's an API or a product, you might think to yourself, "I don't want to read about it, I just want to start hacking away at it." On the flip side, when it comes to producing documentation, that seems an onerous process; you've already finished the work, so why belabor the point?
Those are my natural impulses, but I do recognize the necessity of producing and consuming documentation around the work that we do. Over the years, I've developed strategies for making sure it gets done when it needs to be done. For instance, even as someone who makes part of my living writing and teaching, I still segment my days for the sake of efficiency; I have writing days and I have programming days.
In order for this work not to get shortchanged in your group, it's important to develop similar strategies or commitment devices. The work needs to get done, and it needs to get done well, or else it won't be useful. And peer review is a vital part of that process. After all, you create processes around peer review for code — the developers' strategy for sanity checks and spreading of knowledge. Doesn't it stand to reason that you should also do this with the documents that you create around this process?
Let's look at some documentation that your group may be producing and explore the idea of having peer review to go along with it. We'll look at an answer to the question, "what technical documents should you review?"
Ramping up with New Review Types
Instructions for Your Product/API
Let's start with an easy and obvious form that documentation takes. If you ship software as a product, then it likely has some form of a user manual or getting started guide. The same goes if you offer a service or an API. These are the documents that your users refer to for success with the code you're writing, and, as such, they are important to your business.
Many organizations will have technical writers on staff to generate these sorts of manuals, or else they'll farm out the effort. Some smaller organizations may ask a business analyst, project manager, tech support rep, or even a developer to do it. But no matter who they're asking to do it, you and your team should be reviewing it.
Nobody knows the code better than you do. Don't you think that at least merits you looking to make sure everything's in order?
Your Deployment Process
In an ideal world, your deployment would happen after a commit to source control triggered your continuous integration/deployment setup to build, do code analysis, and run all tests. In this ideal world, your 'documentation' would thus be more code. But the world is not ideal, and it's likely that part of your deployment process is manual, even if you have the aspirational goal of automating your entire deployment.
I would argue that, vis-à-vis build and deploy, there is no worse situation for a group than having "the build troll." Is the build a mystery to you and everyone else in your group, jealously guarded by one guy with all the knowledge? Does he emerge from his corner of the building only periodically to yell at people for writing code that somehow broke InstallShield? Okay, maybe it's not this extreme, but if there's a single person holding all the deployment knowledge, that's not good.
This is a critical part of your function as a software development team, and, as such, it should be reviewed. There's a continuum toward the ideal, and the first step is getting the knowledge out of the build troll's head and onto a document or a wiki somewhere. Then the absolute essential next step is to start reviewing that document so that other people learn how it works, and so that multiple brains can be working toward improvement. Once you've got this in place, then you, as a group, can start methodically automating it.
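To make the "document, review, then automate" progression concrete, here's a minimal sketch of turning a reviewed deploy checklist into code. Everything here is an assumption for illustration — the step commands, the `DEPLOY_STEPS` list, and the `run_deploy` helper are hypothetical stand-ins for whatever your wiki page actually describes:

```python
# Hypothetical sketch: a documented, reviewed deploy checklist turned into
# an ordered list of commands the whole team can read and run.
import subprocess

# Stand-ins for the real commands your deployment document lists.
DEPLOY_STEPS = [
    ["echo", "running test suite"],
    ["echo", "building release artifact"],
    ["echo", "copying artifact to server"],
]

def run_deploy(steps):
    """Run each documented step in order, stopping at the first failure."""
    for step in steps:
        result = subprocess.run(step, capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f"Deploy step failed: {step}")
    return len(steps)  # number of steps completed
```

The point isn't the code itself; it's that a script like this is just the reviewed document made executable, so the knowledge lives in source control instead of in the build troll's head.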
Coding Standards
This one is a bit different from the others in that everyone in the group has probably already 'reviewed' it, in the sense of looking at it and processing the information that it contains. In many groups, the coding standards document is drawn up by someone, such as the architect, and rolled out to the rest of the team with the simple instruction, "do it this way."
This is not what I'm talking about. Rather, I'm suggesting that the coding standards be reviewed by the team in the same way that code is reviewed — active participation, offering of suggestions, changes, removals, and additions.
There's no doubt about it; having the group collaborate on a coding standard document is a recipe for a good bit of churn and argumentation. But I still think it's worth it. With the alternative, you get a group that passively accepts what they're told because they have no choice. When this document is subject to review and revision, you have engagement and group ownership. Even if someone doesn't carry the day with her preference, at least she's been heard and said her piece.
And there's more at stake than just enfranchising the team. Having the coding standards document be subject to challenge and earnest review will ensure that you don't continue doing things simply because you've always done them that way. It will help keep the standard relevant and optimized.
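One way to keep a standards review focused on the rules themselves rather than on enforcement is to encode individual rules as automated checks. As a minimal sketch (the rule, the function name, and the no-wildcard-imports example are all assumptions for illustration):

```python
# Hypothetical sketch: one coding-standard rule ("no wildcard imports")
# encoded as an automated check using Python's ast module.
import ast

def find_wildcard_imports(source):
    """Return the line numbers of any 'from x import *' statements."""
    tree = ast.parse(source)
    return [
        node.lineno
        for node in ast.walk(tree)
        if isinstance(node, ast.ImportFrom)
        and any(alias.name == "*" for alias in node.names)
    ]
```

When a rule lives in a check like this, the review debate is about whether the rule should exist at all, and adopting or dropping it is a one-line change rather than a reissued document.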
Requirements
I'll round out the field with the one that I think is, perhaps, the most critical. Requirements must come to your group in some fashion or another. They could take any form, from a cocktail napkin to user stories to customer feedback emails to formal, waterfall-style requirements documents/specifications. But they come from somewhere, and they're almost always written down somewhere.
It is critical that the software development team review these requirements. And, as with the coding standards, review doesn't mean, "read what's being done to you." Requirements should never be a one-way street. This documentation demands your feedback and push-back if it's steering the effort on a course to budget overruns, slipped deadlines, or general failure.
Just as you might have a business analyst review requirements to make sure they align with business goals, have the development team make sure that they won't create undue problems with the software, and that there isn't a better, less effort-intensive alternative.
Apply Your Judgment
Do you have other documents that are critical to your team? The idea in this article is not limited to what I've listed — those are just some examples. Apply your judgment to make sure that you're getting the proper eyeballs on things for spreading knowledge around and providing sanity checks.
There are very few instances where an additional set of eyes isn't a help, and where knowledge-sharing among the team isn't a win. You apply these principles as you perform code reviews. Extend them beyond that to the other things that matter to your group, and you can expect similar benefits.
Supercharge Communication Throughout The Development Process
With most development teams distributed globally, your development process requires an efficient feedback mechanism built into your tooling to ensure you aren't losing velocity or time while collaborating across time zones. The rapid shift to remote work over the past 18 months has only heightened this need. A developer collaboration tool like Collaborator makes it easy to review user stories, requirements, and code; communicate across the team; and deliver quality software, all from one interface.