One of the great benefits of working with Datamole is the culture of curiosity, continuous learning, and improvement. That is why I really enjoyed a knowledge sharing session by our colleague Petr N. on the art of review. Applicable not only to code but to any work output (products such as documents, slides, or briefs), a thoughtful and effective review can make our lives much easier. And calmer, since reviews are usually associated with feelings of frustration, hesitancy, and apprehension.
If you have ever submitted a piece of work for review, you know this feeling. It is not about fear of being wrong. It is about not knowing what kind of response is coming. That uncertainty is what we set out to improve during our recent team session on the art of review.
Reviewing applies not just to code but to any work artifact shared for feedback: documents, dashboards, proposals, diagrams, presentations, and visuals. A review is when someone else looks at your work and offers input to help improve it. The goal is clarity, quality, and shared understanding.
A good review supports the author and improves the outcome. A poor review delays delivery, creates friction, and discourages collaboration. Reviewing well is not just a technical task. It is a communication skill and a team habit.
You shape the review experience before the reviewer even sees your work. Here is the checklist we recommend before hitting submit:
These small habits reduce confusion and help reviewers give thoughtful feedback faster.
When submitting for review, I try to anticipate my reviewers' questions and provide as much information as I can.
Petr N.
Reviewing is not about approval. It is about clarity, guidance, and support. A good reviewer looks at the work with fresh eyes and asks, “Is this ready? If not, how can I help?”
Here is our internal checklist for reviewers:
Avoid perceived authority bias. Do not assume something is correct just because a senior person submitted it. Everyone makes mistakes.
We also looked at some habits that turn reviews into a painful experience. These are the patterns to watch for:
LGTM (looks good to me)
A bare approval with no substance. If there is truly nothing to point out, that is a great achievement, so say so explicitly. And if that is not the case, give your actual feedback instead of a rubber stamp.
Delaying major concerns
Raising small issues first and only later introducing a large blocker. This causes rework and signals that the reviewer did not prioritize, or did not respect the author's time.
Even though they were not mentioned during the knowledge sharing session, we compiled a list of other pitfalls:
Premature commenting
Reviewers sometimes stop reading after the first issue and begin commenting too early. This causes fragmented feedback and unnecessary iterations. Read the whole submission before giving input.
Bundling unrelated demands
Adding requests that are unrelated to the submitted work. This shifts the focus and makes the review feel like a power move, not a collaboration.
Uncoordinated feedback
Multiple reviewers give conflicting advice without aligning first, so the author ends up bouncing between contradictory requests. Reviewers should coordinate before pushing changes back.
Vague criticism without guidance
Pointing out a problem without explaining what is wrong or how to improve it. This forces the author to guess and often leads to frustration or wasted effort.
Remember, there is always a human on the other side. Reviewing is not just technical. It is social. Feedback is easier to hear when it is clear and kind. Praise matters too. While criticism should stay focused on the work, good work deserves to be acknowledged. Positive feedback brings real satisfaction to the author and helps build trust.
Not every comment has equal weight. What matters in a company-wide guideline may not apply in the same way to a quick internal draft. Consider the context.
As AI-generated content becomes more common, the reviewer’s role is shifting. Tools like Copilot can help draft and review, but they cannot take responsibility. The final accountability lies with the human reviewer. That means reviewers must fully understand what they are approving.
The reviewer is now the last responsible person before the work is released. This role matters more than ever.
Reviews do not have to be frustrating. With a bit of structure, empathy, and attention, they can become one of the best parts of the process. They make work clearer, decisions better, and teams stronger.
We are all still learning how to do this well. But when we treat reviews with care, we are not just improving artifacts. We are improving how we work together.
During his Knowledge Sharing session, Petr N. asked how it feels to submit and review work. The almost complete absence of frustration says a lot about the culture we’ve built at Datamole.
If you want to follow the recommendations we presented, you can use (and bookmark) the corresponding compact checklists:
Checklist - review as a submitter
https://pnevyk.github.io/posts/checklist-review-as-submitter/
Checklist - review as a reviewer
https://pnevyk.github.io/posts/checklist-review-as-reviewer/