Cooking Up Code Reviews: Bitbucket vs. Collaborator
In early September (how is it Halloween already?), I had the pleasure of representing SmartBear Software and our code review tool Collaborator at the Atlassian Summit in San Jose, CA. Over those three days, I met a lot of passionate developers and admins and heard about the challenges they face day-to-day. By the end of the conference, as the demos were winding down, I had learned so much about how our customers use our suite of software quality products that I had flashbacks to cramming for college finals. I also learned that the Haberdasher, where we hosted a happy hour, is a great place for a curated cocktail if you're ever in town.
Out of all the questions I fielded, one continued to catch my attention: "SmartBear has a code review product? Can't I do that in Bitbucket?" A fair question – tools like Bitbucket do have built-in code review features, and they can be the right choice for many teams and projects if the conditions are right. What those conditions are I won't go into here – I'm not their sales guy, after all. But for many customers, those tools don't provide the features they need in a code review tool. I won't go feature-by-feature and explain what Collaborator has that its competitors don't – that just doesn't make sense in the ever-changing world of software – but here are the three ways our customers say we are different.
Custom Review Templates:
With editable roles, customizable templates, support for over 11 source control management systems (including Bitbucket), and review groups, Collaborator users can support a diverse range of workflows within their organization – all with the same tool. Teams can tailor their review experience in Collaborator to fit their development process. In addition, our API and command-line interface make fitting Collaborator into your existing ecosystem of SDLC products straightforward.
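To make the API point concrete, here is a minimal sketch of what scripting against a review server's JSON-style API can look like. This only assembles the request payload; the server URL, command names, and argument fields below are illustrative assumptions, not verbatim Collaborator API – consult your server's API documentation for the real command set.

```python
import json

# Hypothetical endpoint -- a real Collaborator server exposes its own URL.
SERVER = "https://collab.example.com/services/json/v1"

def build_batch(login_ticket, review_title):
    """Assemble a batch of commands: authenticate, then create a review.

    Command and field names are assumptions for illustration only.
    """
    return json.dumps([
        {"command": "SessionService.authenticate",
         "args": {"login": "alice", "ticket": login_ticket}},
        {"command": "ReviewService.createReview",
         "args": {"creator": "alice", "title": review_title}},
    ])

payload = build_batch("ticket-123", "Fix null check in parser")
print(payload)
```

A script like this is how teams typically wire review creation into a CI job or a commit hook, so reviews open automatically instead of relying on developers to remember.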
Custom Reporting & Metrics:
Collaborator continuously collects data on reviews and stores it for easy access through a variety of reports, with uses such as meeting compliance standards, satisfying audit requirements, or driving internal process improvement. This means you can leave the realm of qualitative analysis – "I feel like this would help" – and move into quantitative analysis – "if I make this change, I can expect this measurable outcome." Which, by the way, is a huge component of process compliance for standards like CMMI.
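For a sense of what "quantitative" means here, this short sketch computes two classic inspection metrics – defect density and inspection rate – from review data. The records are made-up sample values, not a real Collaborator report format; the point is the kind of question the stored data lets you answer.

```python
# Made-up sample data standing in for exported review records.
reviews = [
    {"id": 101, "lines_inspected": 400, "defects_found": 6, "minutes": 55},
    {"id": 102, "lines_inspected": 250, "defects_found": 1, "minutes": 20},
    {"id": 103, "lines_inspected": 900, "defects_found": 4, "minutes": 70},
]

def defect_density(recs):
    """Defects found per 1,000 lines inspected, across all reviews."""
    lines = sum(r["lines_inspected"] for r in recs)
    defects = sum(r["defects_found"] for r in recs)
    return 1000 * defects / lines

def inspection_rate(recs):
    """Lines inspected per hour -- a common process-compliance measure."""
    lines = sum(r["lines_inspected"] for r in recs)
    hours = sum(r["minutes"] for r in recs) / 60
    return lines / hours

print(f"Defect density: {defect_density(reviews):.1f} per KLOC")
print(f"Inspection rate: {inspection_rate(reviews):.0f} lines/hour")
```

Tracked over time, numbers like these let a team say "reviews faster than N lines/hour miss defects" instead of guessing – exactly the qualitative-to-quantitative shift described above.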
Document Review:
Collaborator supports viewing diffs of .doc, .docx, .xls, .xlsx, .pdf, and image files within the diff viewer, allowing teams to use the same software to apply their peer review process to items like requirements docs, test plans, architecture diagrams, wireframes, and UX designs. This means you can bring all aspects of product development into one place – helping break down silos and share knowledge.
In addition to these areas, we’ve received a lot of great feedback on creature comforts like due dates, automatic notifications, custom syntax highlighting, and automatic links.
A Microwave or a Stove
When I originally wrote this comparison, it was to help articulate the differences internally to some peers who weren't familiar with the product. To do that, I made the analogy that performing code review in a remote repository is a bit like cooking with a microwave: you can easily warm things up in a microwave, but if you really want to cook, you'd better get a stove. I think that analogy sums it up well. If you just want to DO code review, a pull request review may be the right tool for you. But if you want to empower code review, build a culture of quality and ownership, and foster knowledge sharing, you need a tool that can do more. That's exactly what Collaborator is.
Start a Free 30 Day Trial of Collaborator.
Don’t miss out on our upcoming Webinar: Going Beyond the Pull Request for Better Code Reviews.
Register to attend.