Penetration Testing

The Report Is the Deliverable, Not the Test

> rm -f pentest_shells.log && echo 'the only thing left is the PDF'

Peter Bassill 12 August 2025 14 min read
Tags: reporting, pen test, deliverables, ROI, communication, compliance

Imagine the test happened but the report didn't.

A penetration tester spends ten days assessing an organisation's internal network. They're exceptionally skilled. They discover 34 findings, chain three of them into a path to Domain Admin, identify a zero-day in a bespoke internal application, and demonstrate that a phishing email can bypass the email gateway and compromise a finance director's account. It's a thorough, high-quality engagement.

On the final day, the tester's laptop is stolen from a car park. Every screenshot, every note, every piece of evidence is gone. There is no report. The tester can remember the broad strokes — LLMNR poisoning worked, there was a Kerberoastable service account, something about Backup Operators — but can't recall the specific hostnames, the exact attack chain, or the remediation steps they'd planned to recommend.

What does the organisation have? Nothing. Ten days of billable time. A tester who vaguely remembers finding some things. No evidence. No remediation plan. No artefact that the CISO can present to the board, the IT team can work from, or the auditor can review. The testing happened. The value didn't.

Now reverse the scenario. Imagine a moderately skilled tester who produces a clear, well-structured, evidenced report with specific remediation steps, an attack narrative, a prioritised roadmap, and an executive summary the board can understand. The testing was competent but not exceptional. The report is excellent. Six months later, 28 of 34 findings have been remediated. The board approved the budget for MFA deployment based on the executive summary. The IT team hardened Active Directory using the specific Group Policy paths from the findings. The detection gap analysis led to seven new SIEM rules.

Which engagement delivered more value? The brilliant test with no report, or the competent test with an excellent one? The answer is obvious — and it reveals a truth the industry doesn't always acknowledge: the report is not a by-product of the engagement. It is the engagement.


Everything about the test is temporary. Only the report persists.

A penetration test is an ephemeral event. It happens over a defined window — a week, two weeks, a month — and then it's over. Every artefact of the test is temporary except one.

| Artefact of the Test | What Happens to It | Persistence |
| --- | --- | --- |
| The tester's access | Credentials revoked. VPN disconnected. Shells closed. Test accounts deleted. The tester's presence in the environment is erased — as it should be. | Zero. Gone the day testing ends. |
| The tester's knowledge | The tester moves on to the next engagement. Within weeks, the details of your environment blur into the dozens of others they've tested. Within months, they couldn't reconstruct the attack chain from memory if they tried. | Weeks. Fades rapidly. |
| The compromised systems | Reset. Passwords changed. Temporary files cleaned up. Persistence mechanisms removed. The systems return to their pre-test state — with the same vulnerabilities, awaiting remediation. | Zero. Restored immediately. |
| The verbal debrief | A conversation that conveys understanding in the moment. But memories fade, attendees change roles, and nobody takes comprehensive notes. Within three months, the details are lost. | Months. Degrades with staff turnover. |
| The report | Filed. Referenced during remediation. Presented to the board. Provided to the auditor. Reviewed before the next engagement. Compared against subsequent reports. Used as evidence in insurance claims, regulatory inquiries, and compliance certifications. | Years. Persists indefinitely. |

The report is the only artefact that outlives the engagement. It is the institutional memory of the test — the vehicle through which the tester's findings, analysis, and recommendations persist in the organisation long after the tester has forgotten your network topology. Every other output of the engagement is transient. The report is permanent.


A report's journey long after delivery.

Organisations that treat the report as a one-time deliverable that's read once and filed are missing the majority of its value. A well-written report serves multiple purposes across months and years — each of which justifies investment in report quality.

| When | Who Uses It | How |
| --- | --- | --- |
| Week 1 | CISO / Security Manager | Reads the executive summary and attack narrative. Briefs the board. Requests budget for critical remediations. Sets priorities for the IT team. |
| Weeks 2–8 | IT / Engineering Team | Works through the remediation roadmap finding by finding. References the specific remediation steps, Group Policy paths, and verification instructions. Marks findings as remediated. |
| Month 3 | Internal Audit / Compliance | Reviews the report as evidence that security testing was performed. Maps findings to regulatory requirements (PCI DSS, ISO 27001, Cyber Essentials). Tracks remediation progress against audit timelines. |
| Month 6 | Next Pen Test Provider | Receives the previous report as part of the scoping process. Reviews what was found, what was fixed, and what persists. Focuses the new engagement on areas of remaining or emerging risk rather than re-covering old ground. |
| Month 9 | SOC / Detection Team | Uses the detection gap analysis to validate that new SIEM rules are catching the techniques that went undetected. Cross-references the attacker timeline with current telemetry coverage. |
| Year 1+ | Cyber Insurance Underwriter | Requests evidence of security testing as part of policy renewal. Reviews the report to assess the organisation's security maturity, remediation discipline, and risk posture. A well-evidenced report with demonstrated remediation progress can directly influence premiums. |
| Post-incident | Incident Response / Legal | If a breach occurs, the pen test report becomes evidence of due diligence — or lack thereof. "We identified this vulnerability in the pen test report eight months ago" is a defensible position. "We identified it and didn't remediate it" is a liability. The report's recommendations become a record of what the organisation knew and when. |
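The Month 9 use case — checking the detection gap analysis against current telemetry — can be sketched in a few lines. This is a hypothetical illustration, not a real tool: the technique IDs are MITRE ATT&CK references for techniques commonly exercised in internal tests, and the `detected` set stands in for whatever your SIEM actually alerted on during the engagement.

```python
# Hypothetical detection-gap check: compare the techniques the tester used
# (as ATT&CK technique IDs, taken from the report's detection gap appendix)
# against the techniques the SOC's telemetry caught during the test window.
findings_techniques = {
    "T1557.001": "LLMNR/NBT-NS poisoning",
    "T1558.003": "Kerberoasting",
    "T1003.001": "LSASS credential dumping",
}
detected = {"T1558.003"}  # illustrative: only Kerberoasting triggered an alert

# Anything the tester did that produced no alert is a detection gap —
# a candidate for a new SIEM rule.
gaps = {tid: name for tid, name in findings_techniques.items() if tid not in detected}
for tid, name in sorted(gaps.items()):
    print(f"no detection for {tid}: {name}")
```

Re-running the same comparison after the new rules are deployed is a simple way to demonstrate, in the next engagement's report, that the gap has closed.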

A report that's written for a single audience at a single point in time captures a fraction of its potential value. A report that's structured to serve multiple audiences across multiple time horizons is an asset that compounds — informing decisions, evidencing diligence, and driving improvement long after the tester has moved on.


You're buying a report. The testing is how it's produced.

Organisations commission penetration tests because they need to understand their risk. They don't need someone to compromise their domain controller — they need a document that explains what's wrong, why it matters, and how to fix it. The compromise is the evidence. The report is the product.

This reframing has practical consequences for how organisations should evaluate and procure pen testing services.

The Statement of Work Should Describe the Report
Most pen test SoWs describe the testing: duration, methodology, scope, tester qualifications. Very few describe the report: its structure, its sections, its intended audiences, the level of remediation detail, whether it will include an attack narrative, whether it will include a detection gap analysis. If the deliverable is the report, the SoW should specify what the report will contain — not just how the testing will be performed.
Report Quality Should Factor Into Price Evaluation
A 5-day pen test from Provider A costs £8,000 and produces a 40-page scanner-output PDF with generic remediations. A 5-day pen test from Provider B costs £12,000 and produces a structured report with an executive summary, attack narrative, chain analysis, specific remediations with GPO paths, a detection gap appendix, and a prioritised roadmap. Provider B is not 50% more expensive — they're delivering a fundamentally different product. The report quality is where the value difference lives.
Reporting Time Is Testing Time
Some providers allocate 80% of the engagement to testing and squeeze reporting into the final 20%. The report suffers — written hurriedly on the last day, with incomplete evidence, vague remediations, and a rushed executive summary. A high-quality report for a complex engagement requires significant dedicated time: writing, reviewing, structuring, evidencing, and refining. Providers who invest in reporting time produce better deliverables.
Writing Skill Matters as Much as Testing Skill
An exceptional tester who can't communicate their findings clearly produces less organisational value than a good tester who writes with precision, clarity, and audience awareness. The best penetration testing firms invest as heavily in their consultants' reporting skills as in their technical skills — because both determine the quality of the deliverable.

When the report isn't just a deliverable — it's evidence.

For many organisations, the pen test report serves a dual purpose: it drives remediation and it satisfies a compliance requirement. The report is the evidence that testing was performed, that findings were identified, and that remediation was undertaken. In this context, the quality of the report isn't just about operational value — it's about regulatory defensibility.

| Framework | What It Requires | What the Report Must Demonstrate |
| --- | --- | --- |
| PCI DSS 4.0 | Requirement 11.4 — penetration testing at least annually and after significant changes. Internal and external. Segmentation testing if applicable. | Scope and methodology. Evidence that both internal and external testing were performed. Findings with severity ratings. Evidence of remediation and retesting. Confirmation that segmentation controls were tested if present. |
| ISO 27001:2022 | Annex A control A.8.8 — management of technical vulnerabilities. Regular assessment of exposure through vulnerability assessment and penetration testing. | Evidence that testing was performed by competent testers. Findings documented with risk ratings. Evidence of remediation tracking. Integration with the organisation's risk treatment plan. |
| Cyber Essentials Plus | External vulnerability assessment performed by a certified assessor as part of the certification process. | External scan results. Evidence that critical and high vulnerabilities are remediated within the certification window. Clear pass/fail determination. |
| FCA / PRA (Financial Services) | Senior Managers and Certification Regime places personal accountability on senior individuals for the firm's cybersecurity posture. | The pen test report is evidence that the firm identified and addressed security risks. A report that identifies critical findings which remain unremediated creates personal liability for the accountable senior manager. |
| Cyber Insurance | Underwriters increasingly request evidence of penetration testing during application and renewal. Some policies require testing as a condition of coverage. | A well-structured report demonstrating regular testing, identified findings, and remediation progress strengthens the organisation's position during underwriting — and may reduce premiums. A poor report or no report at all may result in coverage exclusions. |

In each of these contexts, the report is the only artefact that matters. The auditor doesn't watch the tester work. The underwriter doesn't sit in the debrief. The regulator doesn't review the tester's notes. They review the report. If the report is vague, unstructured, or lacks evidence, the compliance requirement is arguably unmet — regardless of how thorough the testing was.


When the pen test report becomes legal evidence.

If the organisation suffers a breach, the pen test report enters a different context entirely. It becomes part of the incident response investigation, the regulatory notification, the insurance claim, and potentially the legal proceedings. What the report says — and what it recommended — matters enormously.

The Report as Due Diligence
"We commissioned a penetration test. It identified this vulnerability. We remediated it within 30 days as recommended." This is the statement of an organisation that exercised reasonable care. The report is the evidence. The remediation tracker is the proof. Together, they demonstrate that the organisation identified the risk and acted on it — which is the standard regulators and courts apply.
The Report as Liability
"The penetration test identified this vulnerability eight months ago. The report recommended immediate remediation. The organisation did not remediate. The breach exploited the exact vulnerability the report identified." This is the statement that creates personal liability for the CISO, legal exposure for the organisation, and potential grounds for the insurer to dispute the claim. The report's recommendations become a record of what the organisation knew and chose not to act on.
The Report as Evidence Quality
A well-structured report with clear findings, specific evidence, timestamped actions, and documented remediation recommendations stands up to scrutiny. A vague report with generic findings and "implement best practices" remediation is difficult to use as evidence of anything — either for or against the organisation. Report quality determines evidential value.

If the report is the product, treat it like one.

This article is as much a message to pen testing providers as it is to the organisations that commission them. If the report is the primary deliverable — and it is — then the industry's investment in reporting quality should match its investment in testing quality. In practice, it often doesn't.

| What Good Providers Do | What This Produces |
| --- | --- |
| Allocate dedicated reporting time — typically 20–30% of the engagement — separate from testing time. The report is not written in the final hour of the final day. | Reports that are structured, evidenced, reviewed, and refined. Remediations that are specific and actionable. Executive summaries that communicate business impact. |
| Employ a quality assurance process where a second consultant reviews every report before delivery — checking for technical accuracy, clarity, remediation specificity, and audience appropriateness. | Consistent quality across engagements and across consultants. Errors caught before delivery. Findings that have been validated by a second pair of eyes. |
| Invest in their consultants' communication skills — writing training, report templates, style guides, and peer review — with the same seriousness as technical training and certifications. | Consultants who can write for non-technical audiences, structure information logically, and produce prose that communicates rather than obscures. |
| Offer a verbal debrief as standard — not as an optional extra — where the tester presents the findings, demonstrates the attack path, and answers questions from all stakeholder groups. | Understanding that the written report alone cannot deliver. The debrief is the second communication channel that ensures the report's key messages are understood and internalised. |
| Provide remediation support after report delivery — answering questions about specific findings, clarifying remediation steps, and validating that fixes have been correctly implemented. | A relationship that extends beyond report delivery. The report is the starting point of a remediation conversation, not the end of the engagement. |

Treating the report as the asset it is.

Specify the Report in the SoW
When procuring a pen test, describe the report you expect: executive summary, attack narrative, chain analysis, specific remediations with verification steps, detection gap appendix, remediation roadmap. If it's not in the SoW, you can't hold the provider to it.
Archive Reports Systematically
Pen test reports are legal and compliance artefacts. Store them securely with version control, access logging, and retention policies. An auditor asking for your 2023 pen test report shouldn't trigger a search through email archives.
Track Findings Across Reports
Maintain a remediation tracker that maps every finding from every engagement. Track status: remediated, in progress, accepted risk, or deferred. When the next engagement is commissioned, provide the tracker — so the tester validates fixes rather than re-discovering known issues.
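A tracker of this kind needs very little machinery — the point is that every finding carries a stable reference, a severity, its source engagement, and a status. The sketch below is a minimal illustration with invented finding IDs and titles; a spreadsheet with the same columns works just as well.

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    REMEDIATED = "remediated"
    IN_PROGRESS = "in progress"
    ACCEPTED_RISK = "accepted risk"
    DEFERRED = "deferred"

@dataclass
class Finding:
    ref: str         # stable ID from the report, e.g. "2025-INT-07" (illustrative)
    title: str
    severity: str    # critical / high / medium / low
    engagement: str  # which report it came from
    status: Status

def open_findings(tracker):
    """Everything the next tester should validate or re-examine."""
    return [f for f in tracker if f.status is not Status.REMEDIATED]

# Illustrative entries — real IDs and titles come from your reports.
tracker = [
    Finding("2025-INT-01", "LLMNR poisoning possible", "high",
            "2025 internal", Status.REMEDIATED),
    Finding("2025-INT-07", "Kerberoastable service account", "high",
            "2025 internal", Status.IN_PROGRESS),
]
print([f.ref for f in open_findings(tracker)])  # → ['2025-INT-07']
```

Handing `open_findings(tracker)` to the next provider during scoping is exactly the "validate fixes rather than re-discover known issues" conversation: remediated items get retested for closure, and everything else gets fresh attention.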
Budget for Reporting Quality
If a provider's quote seems low, ask how much time is allocated to reporting. A 10-day engagement with 1 day of reporting will produce a different deliverable from a 10-day engagement with 2.5 days of reporting. You're buying the report. Invest accordingly.
Read the Report Before Filing It
This sounds obvious. It isn't. We've delivered reports that weren't opened for weeks. We've seen critical findings that weren't read for months. The report's value is zero until someone reads it, understands it, and acts on it. The fastest way to waste a pen test investment is to file the report unread.

The bottom line.

The penetration test is an event. The report is the asset. The event is temporary — the tester's access is revoked, their knowledge fades, the compromised systems are restored, and the engagement becomes a line item in last quarter's budget. The report is permanent — it drives remediation for months, satisfies auditors for years, informs insurance renewals, benchmarks against future engagements, and serves as legal evidence of due diligence (or the lack of it) indefinitely.

An exceptional test that produces a poor report is a poor engagement. A competent test that produces an excellent report is a good engagement. This isn't because testing quality doesn't matter — it does. It's because testing quality only reaches the organisation through the report. The report is the transmission mechanism. If the mechanism fails, the quality doesn't arrive.

When you commission a penetration test, you're not buying an event. You're buying a document that will drive decisions, evidence diligence, and shape your security posture for years. Treat it accordingly — in the procurement, in the SoW, in the evaluation of providers, and in what you do with it after it arrives.


Penetration test reports built for every audience, every use case, and the long term.

Our reports are structured for boards, CISOs, and engineering teams — with attack narratives, chain analysis, specific remediations, detection gap analysis, and remediation roadmaps. Because the report isn't a summary of the work. It is the work.