> rm -f pentest_shells.log && echo 'the only thing left is the PDF'_
A penetration tester spends ten days assessing an organisation's internal network. They're exceptionally skilled. They discover 34 findings, chain three of them into a path to Domain Admin, identify a zero-day in a bespoke internal application, and demonstrate that a phishing email can bypass the email gateway and compromise a finance director's account. It's a thorough, high-quality engagement.
On the final day, the tester's laptop is stolen from a car park. Every screenshot, every note, every piece of evidence is gone. There is no report. The tester can remember the broad strokes — LLMNR poisoning worked, there was a Kerberoastable service account, something about Backup Operators — but can't recall the specific hostnames, the exact attack chain, or the remediation steps they'd planned to recommend.
What does the organisation have? Nothing. Ten days of billable time. A tester who vaguely remembers finding some things. No evidence. No remediation plan. No artefact that the CISO can present to the board, the IT team can work from, or the auditor can review. The testing happened. The value didn't.
Now reverse the scenario. Imagine a moderately skilled tester who produces a clear, well-structured, evidenced report with specific remediation steps, an attack narrative, a prioritised roadmap, and an executive summary the board can understand. The testing was competent but not exceptional. The report is excellent. Six months later, 28 of 34 findings have been remediated. The board approved the budget for MFA deployment based on the executive summary. The IT team hardened Active Directory using the specific Group Policy paths from the findings. The detection gap analysis led to seven new SIEM rules.
Which engagement delivered more value? The brilliant test with no report, or the competent test with an excellent one? The answer is obvious — and it reveals a truth the industry doesn't always acknowledge: the report is not a by-product of the engagement. It is the engagement.
A penetration test is an ephemeral event. It happens over a defined window — a week, two weeks, a month — and then it's over. Every artefact of the test is temporary except one.
| Artefact of the Test | What Happens to It | Persistence |
|---|---|---|
| The tester's access | Credentials revoked. VPN disconnected. Shells closed. Test accounts deleted. The tester's presence in the environment is erased — as it should be. | Zero. Gone the day testing ends. |
| The tester's knowledge | The tester moves on to the next engagement. Within weeks, the details of your environment blur into the dozens of others they've tested. Within months, they couldn't reconstruct the attack chain from memory if they tried. | Weeks. Fades rapidly. |
| The compromised systems | Reset. Passwords changed. Temporary files cleaned up. Persistence mechanisms removed. The systems return to their pre-test state — with the same vulnerabilities, awaiting remediation. | Zero. Restored immediately. |
| The verbal debrief | A conversation that conveys understanding in the moment. But memories fade, attendees change roles, and nobody takes comprehensive notes. Within three months, the details are lost. | Months. Degrades with staff turnover. |
| The report | Filed. Referenced during remediation. Presented to the board. Provided to the auditor. Reviewed before the next engagement. Compared against subsequent reports. Used as evidence in insurance claims, regulatory inquiries, and compliance certifications. | Years. Persists indefinitely. |
The report is the only artefact that outlives the engagement. It is the institutional memory of the test — the vehicle through which the tester's findings, analysis, and recommendations persist in the organisation long after the tester has forgotten your network topology. Every other output of the engagement is transient. The report is permanent.
Organisations that treat the report as a deliverable to be read once and filed are missing most of its value. A well-written report serves multiple purposes across months and years — each of which justifies investment in report quality.
| When | Who Uses It | How |
|---|---|---|
| Week 1 | CISO / Security Manager | Reads the executive summary and attack narrative. Briefs the board. Requests budget for critical remediations. Sets priorities for the IT team. |
| Weeks 2–8 | IT / Engineering Team | Works through the remediation roadmap finding by finding. References the specific remediation steps, Group Policy paths, and verification instructions. Marks findings as remediated. |
| Month 3 | Internal Audit / Compliance | Reviews the report as evidence that security testing was performed. Maps findings to regulatory requirements (PCI DSS, ISO 27001, Cyber Essentials). Tracks remediation progress against audit timelines. |
| Month 6 | Next Pen Test Provider | Receives the previous report as part of the scoping process. Reviews what was found, what was fixed, and what persists. Focuses the new engagement on areas of remaining or emerging risk rather than re-covering old ground. |
| Month 9 | SOC / Detection Team | Uses the detection gap analysis to validate that new SIEM rules are catching the techniques that went undetected. Cross-references the attacker timeline with current telemetry coverage. |
| Year 1+ | Cyber Insurance Underwriter | Requests evidence of security testing as part of policy renewal. Reviews the report to assess the organisation's security maturity, remediation discipline, and risk posture. A well-evidenced report with demonstrated remediation progress can directly influence premiums. |
| Post-incident | Incident Response / Legal | If a breach occurs, the pen test report becomes evidence of due diligence — or lack thereof. "We identified this vulnerability in the pen test report eight months ago" is a defensible position. "We identified it and didn't remediate it" is a liability. The report's recommendations become a record of what the organisation knew and when. |
A report that's written for a single audience at a single point in time captures a fraction of its potential value. A report that's structured to serve multiple audiences across multiple time horizons is an asset that compounds — informing decisions, evidencing diligence, and driving improvement long after the tester has moved on.
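The SOC use case above — validating that detection rules now cover the techniques the report flagged as undetected — can be sketched as a simple cross-reference. This is an illustrative sketch only: the technique IDs, finding names, and rule identifiers below are hypothetical examples, not taken from any real report or SIEM.

```python
# Hypothetical data: ATT&CK technique IDs flagged as undetected in a
# report's detection gap analysis, and the techniques each deployed
# SIEM rule claims to cover. Neither list reflects a real environment.
report_gaps = {
    "T1557.001": "LLMNR/NBT-NS poisoning",
    "T1558.003": "Kerberoasting",
    "T1021.002": "Lateral movement via SMB/admin shares",
}

deployed_rules = {
    "rule-llmnr-responder": {"T1557.001"},
    "rule-smb-lateral":     {"T1021.002"},
}

# Which techniques from the report still lack any detection rule?
covered = set().union(*deployed_rules.values())
still_uncovered = {tid: name for tid, name in report_gaps.items()
                   if tid not in covered}

for tid, name in sorted(still_uncovered.items()):
    print(f"No detection coverage: {tid} ({name})")
```

Run against real rule metadata, a check like this turns the report's detection gap analysis from a one-off observation into a recurring coverage test — exactly the kind of reuse that keeps the report delivering value at month nine.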
Organisations commission penetration tests because they need to understand their risk. They don't need someone to compromise their domain controller — they need a document that explains what's wrong, why it matters, and how to fix it. The compromise is the evidence. The report is the product.
This reframing has practical consequences for how organisations should evaluate and procure pen testing services.
For many organisations, the pen test report serves a dual purpose: it drives remediation and it satisfies a compliance requirement. The report is the evidence that testing was performed, that findings were identified, and that remediation was undertaken. In this context, the quality of the report isn't just about operational value — it's about regulatory defensibility.
| Framework | What It Requires | What the Report Must Demonstrate |
|---|---|---|
| PCI DSS 4.0 | Requirement 11.4 — penetration testing at least annually and after significant changes. Internal and external. Segmentation testing if applicable. | Scope and methodology. Evidence that both internal and external testing were performed. Findings with severity ratings. Evidence of remediation and retesting. Confirmation that segmentation controls were tested if present. |
| ISO 27001:2022 | Annex A control A.8.8 — management of technical vulnerabilities. Regular assessment of exposure through vulnerability assessment and penetration testing. | Evidence that testing was performed by competent testers. Findings documented with risk ratings. Evidence of remediation tracking. Integration with the organisation's risk treatment plan. |
| Cyber Essentials Plus | External vulnerability assessment performed by a certified assessor as part of the certification process. | External scan results. Evidence that critical and high vulnerabilities are remediated within the certification window. Clear pass/fail determination. |
| FCA / PRA (Financial Services) | Senior Managers and Certification Regime places personal accountability on senior individuals for the firm's cybersecurity posture. | The pen test report is evidence that the firm identified and addressed security risks. A report that identifies critical findings which remain unremediated creates personal liability for the accountable senior manager. |
| Cyber Insurance | Underwriters increasingly request evidence of penetration testing during application and renewal. Some policies require testing as a condition of coverage. | A well-structured report demonstrating regular testing, identified findings, and remediation progress strengthens the organisation's position during underwriting — and may reduce premiums. A poor report or no report at all may result in coverage exclusions. |
In each of these contexts, the report is the only artefact that matters. The auditor doesn't watch the tester work. The underwriter doesn't sit in the debrief. The regulator doesn't review the tester's notes. They review the report. If the report is vague, unstructured, or unevidenced, the compliance requirement is arguably unmet — regardless of how thorough the testing was.
If the organisation suffers a breach, the pen test report enters a different context entirely. It becomes part of the incident response investigation, the regulatory notification, the insurance claim, and potentially the legal proceedings. What the report says — and what it recommended — matters enormously.
This article is as much a message to pen testing providers as it is to the organisations that commission them. If the report is the primary deliverable — and it is — then the industry's investment in reporting quality should match its investment in testing quality. In practice, it often doesn't.
| What Good Providers Do | What This Produces |
|---|---|
| Allocate dedicated reporting time — typically 20–30% of the engagement — separate from testing time. The report is not written in the final hour of the final day. | Reports that are structured, evidenced, reviewed, and refined. Remediations that are specific and actionable. Executive summaries that communicate business impact. |
| Employ a quality assurance process where a second consultant reviews every report before delivery — checking for technical accuracy, clarity, remediation specificity, and audience appropriateness. | Consistent quality across engagements and across consultants. Errors caught before delivery. Findings that have been validated by a second pair of eyes. |
| Invest in their consultants' communication skills — writing training, report templates, style guides, and peer review — with the same seriousness as technical training and certifications. | Consultants who can write for non-technical audiences, structure information logically, and produce prose that communicates rather than obscures. |
| Offer a verbal debrief as standard — not as an optional extra — where the tester presents the findings, demonstrates the attack path, and answers questions from all stakeholder groups. | Understanding that the written report alone cannot deliver. The debrief is the second communication channel that ensures the report's key messages are understood and internalised. |
| Provide remediation support after report delivery — answering questions about specific findings, clarifying remediation steps, and validating that fixes have been correctly implemented. | A relationship that extends beyond report delivery. The report is the starting point of a remediation conversation, not the end of the engagement. |
The penetration test is an event. The report is the asset. The event is temporary — the tester's access is revoked, their knowledge fades, the compromised systems are restored, and the engagement becomes a line item in last quarter's budget. The report is permanent — it drives remediation for months, satisfies auditors for years, informs insurance renewals, benchmarks against future engagements, and serves as legal evidence of due diligence (or the lack of it) indefinitely.
An exceptional test that produces a poor report is a poor engagement. A competent test that produces an excellent report is a good engagement. This isn't because testing quality doesn't matter — it does. It's because testing quality only reaches the organisation through the report. The report is the transmission mechanism. If the mechanism fails, the quality doesn't arrive.
When you commission a penetration test, you're not buying an event. You're buying a document that will drive decisions, evidence diligence, and shape your security posture for years. Treat it accordingly — in the procurement, in the statement of work, in the evaluation of providers, and in what you do with it after it arrives.
Our reports are structured for boards, CISOs, and engineering teams — with attack narratives, chain analysis, specific remediations, detection gap analysis, and remediation roadmaps. Because the report isn't a summary of the work. It is the work.