> operator@c2:~# while true; do echo "$(date) — callback received from $RANDOM_HOST"; sleep $((RANDOM%600)); done
There is a question that has been asked at security conferences for twenty years: if you drop a USB drive in a car park, will someone plug it in?
The answer, confirmed repeatedly by academic research and our own experience, is yes. Reliably, predictably, and quickly. Not because people are foolish. Not because security awareness training has failed. But because human curiosity is a deeper, more persistent drive than any policy document or e-learning module can override.
On this engagement, we distributed twenty prepared USB drives across the client's premises. Fifteen were connected to corporate workstations, and thirteen produced callbacks to our command and control infrastructure. The first callback arrived forty-seven minutes after the drives were placed. The last arrived three days later. In between, we watched as curiosity, helpfulness, and simple habit systematically bypassed the organisation's security controls.
The client was a legal services firm — a mid-sized practice with approximately four hundred staff across two offices. They handled commercially sensitive and legally privileged material daily. They had completed a security awareness programme six months prior, which included a module on removable media risks. They wanted to test whether the training had been effective.
We were engaged to conduct a controlled USB drop exercise as part of a broader physical and social engineering assessment. The scope authorised us to place prepared USB devices in common areas of the client's primary office — the reception, kitchen areas, meeting rooms, printer rooms, and the building lobby — and to monitor whether the devices were connected to corporate workstations.
The USB drives did not contain malware. They did not exploit any software vulnerability. They contained files designed to generate a network callback to our infrastructure when opened — allowing us to record when and where each drive was connected, without compromising any system or accessing any data. The exercise was designed to measure behaviour, not to demonstrate exploitation.
The client's IT security team and the managing partner were aware of the exercise. No other staff had knowledge of it.
The effectiveness of a USB drop exercise depends on the plausibility of the drives and the files they contain. A blank USB drive with a single executable named 'malware.exe' tests nothing except whether people are willing to run obvious malware. A carefully prepared drive that looks like it was lost by a colleague tests something far more realistic — the intersection of curiosity, helpfulness, and trust.
We prepared twenty USB drives across four configurations, each designed to test a different aspect of human behaviour and a different technical control.
| Configuration | Quantity | Appearance | Payload Mechanism |
|---|---|---|---|
| Type A — 'Lost Personal' | 5 | Branded consumer USB with keyring; label: 'Holiday Photos'. Placed in car park and reception. | HTML file disguised as photo index. Opens browser, loads tracking pixel from external server. Tests: will users open files from unknown USB? |
| Type B — 'Corporate Document' | 5 | Plain black USB; label: 'Q4 Salary Review — CONFIDENTIAL'. Placed in printer rooms and kitchen areas. | Word document with embedded macro. On enable, macro calls home to C2 server. Tests: will users enable macros on documents from unknown USB? |
| Type C — 'IT Department' | 5 | USB with printed label: 'IT Dept — Windows Update Tool'. Placed near IT service desk and meeting rooms. | Windows shortcut (.lnk) file disguised as application. Executes PowerShell callback on click. Tests: will users run executables from USB based on trusted branding? |
| Type D — 'Rubber Ducky' | 5 | USB HID device (keystroke injection). Visually identical to standard USB drive. Placed in hot-desking areas. | On insertion, types pre-programmed keystrokes: opens PowerShell, executes encoded callback. No user interaction required. Tests: device control and USB class filtering. |
Each drive was uniquely identified with an embedded serial number, allowing us to track which specific drive called back, from which network location, and at what time. The drives were placed during early morning — between 06:30 and 07:00 — before the majority of staff arrived, using the building access we had obtained during the social engineering phase of the assessment.
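The serial-tracking side of the infrastructure is simple in principle. As a minimal sketch — the URL scheme, the `id` parameter, and the `HS-A-03` serial format are all illustrative, not the actual engagement tooling — each callback request carries the drive's embedded serial, and the server records which drive called back, from where, and when:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from urllib.parse import urlparse, parse_qs

@dataclass
class Callback:
    serial: str      # embedded drive serial number
    source_ip: str   # network location the callback came from
    seen_at: str     # UTC timestamp of the callback

def record_callback(request_url: str, source_ip: str, clock=None) -> Callback:
    """Extract the drive serial from a callback URL and record who called, and when."""
    query = parse_qs(urlparse(request_url).query)
    serial = query.get("id", ["unknown"])[0]
    ts = (clock or datetime.now(timezone.utc)).isoformat()
    return Callback(serial=serial, source_ip=source_ip, seen_at=ts)

# Example: a tracking-pixel request from a hypothetical Type A drive
cb = record_callback("https://c2.example.net/px.gif?id=HS-A-03", "10.20.30.40")
print(cb.serial)  # HS-A-03
```

Note that no data leaves the client environment: the only information the request conveys is the serial, the source address, and the time.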
The placement of each drive was deliberate. USB drop research consistently shows that location significantly affects pickup rates. Drives placed in high-traffic areas with a plausible 'lost' narrative are picked up more frequently than drives placed in obviously staged locations.
We monitored our callback infrastructure continuously for five business days following the drop. The results were recorded in real time.
Fifteen of twenty drives were connected to corporate workstations — a seventy-five per cent connection rate — and thirteen of those connections produced callbacks to our infrastructure. Eleven callbacks occurred on the first day. The first callback arrived forty-seven minutes after placement. The fastest human-initiated callback (excluding the Rubber Ducky auto-execution) arrived sixty-five minutes after the drives were placed.
| Drive Type | Placed | Connected (Rate) | Callbacks | Key Observation |
|---|---|---|---|---|
| Type A — Holiday Photos | 5 | 4 (80%) | 4 | Highest human interaction rate. Curiosity about personal content. |
| Type B — Salary Review | 5 | 4 (80%) | 4 | Users enabled macros despite security warning. Confidential label increased engagement. |
| Type C — IT Update Tool | 5 | 2 (40%) | 2 | Lowest rate. Technical branding may have created suspicion in non-IT staff. |
| Type D — Rubber Ducky | 5 | 5 (100%) | 3 | All five were inserted; three auto-executed on connection, two were blocked by a USB device class policy. |
The results by type revealed distinct behavioural patterns. The 'personal' and 'confidential' drives (Types A and B) had the highest engagement — eighty per cent. The 'IT Department' drives (Type C) had the lowest human interaction — forty per cent — suggesting that the technical framing may have triggered more caution. The Rubber Ducky devices (Type D) achieved one hundred per cent insertion but only sixty per cent execution — two were blocked by a USB device class policy that filtered HID devices on some workstations.
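Reading the Rubber Ducky row as five insertions with three successful executions, the headline figures can be reproduced directly from the per-type data:

```python
# Per-type results from the exercise: (placed, connected, callbacks)
results = {
    "A - Holiday Photos": (5, 4, 4),
    "B - Salary Review":  (5, 4, 4),
    "C - IT Update Tool": (5, 2, 2),
    "D - Rubber Ducky":   (5, 5, 3),  # 2 insertions blocked by HID filtering
}

placed = sum(p for p, _, _ in results.values())
connected = sum(c for _, c, _ in results.values())
callbacks = sum(cb for _, _, cb in results.values())

print(f"connection rate: {connected}/{placed} = {connected/placed:.0%}")  # 15/20 = 75%
print(f"callbacks observed: {callbacks}")  # 13
```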
Fifteen of twenty USB drives placed across the client's premises were connected to corporate workstations within three business days. The security awareness programme completed six months prior included specific guidance on removable media risks. The training did not measurably reduce the rate of USB insertion.
The human behaviour results tell one story. The technical control assessment tells another — and in many ways, a more actionable one. Because whilst training people not to plug in USB drives has limited effectiveness, configuring systems to safely handle USB drives when they are inevitably plugged in is entirely achievable.
We assessed the client's endpoint controls against each payload type.
The assessment revealed a layered but inconsistent control environment. AutoRun was disabled — a positive baseline control. Office macro execution was blocked by default — but users overrode this by clicking 'Enable Content' on every occasion. PowerShell execution policy was set to Restricted — but was trivially bypassed using the -ExecutionPolicy Bypass flag. USB HID device filtering was deployed — but only on some workstations, not consistently across the fleet.
The EDR was the most effective control, detecting and blocking several payloads. However, its coverage was inconsistent — it caught the PowerShell callbacks on newer endpoints with updated signatures but missed them on older endpoints. And it could not detect the HTML tracking pixel at all, because opening an HTML file that loads an external image is not, by any technical definition, malicious.
The most significant gap was the absence of a USB device control policy. There was no restriction on connecting USB mass storage devices to corporate workstations. Any USB drive could be inserted, mounted, and its contents accessed without restriction. The organisation relied entirely on user behaviour — backed by awareness training — to prevent USB-based attacks.
The Type B (Salary Review) results deserve particular attention, because they illustrate a paradox that undermines one of the most widely deployed technical controls in modern endpoint security.
Microsoft Office's default macro policy blocks macros in documents from untrusted sources and displays a prominent yellow warning bar reading 'Macros have been disabled', followed by an 'Enable Content' button. This control exists specifically to prevent macro-based attacks. It is enabled by default. It works.
And on every single occasion in this exercise — four out of four — the user clicked 'Enable Content'.
This is not a training failure. It is a design failure. The security warning competes with the user's objective — I want to read this document — and the warning provides a one-click mechanism to dismiss it. The 'Enable Content' button is a self-destruct mechanism disguised as a convenience feature. Users have been conditioned, through years of encountering this warning on legitimate documents shared by colleagues, to click it reflexively. The warning has become wallpaper.
Microsoft has recognised this problem. Recent versions of Office block macros entirely in documents downloaded from the internet (Mark of the Web), with no user override option. However, this protection does not apply to documents opened from USB drives on some configurations — the document is treated as a local file, not an internet download, and the legacy 'Enable Content' workflow applies.
Understanding why the USB drop succeeded at seventy-five per cent is as important as understanding that it did. The motivations are not uniform — different people connected the drives for different reasons.
The client had invested in a comprehensive security awareness programme. The programme included a specific module on removable media threats. Staff had completed the training six months prior. The training materials explicitly stated: do not connect unknown USB drives to corporate devices.
Seventy-five per cent of the drives were connected anyway.
This does not mean awareness training is worthless. It means awareness training alone is insufficient. Training creates knowledge — people know that unknown USB drives are risky. But knowledge does not reliably change behaviour, because behaviour is governed by context, habit, and motivation in the moment, not by information retained from a training module completed six months ago.
The most effective defence against USB-based attacks is not training people to resist their curiosity. It is removing the technical conditions that allow curiosity to be exploited.
The most impactful single control is a USB device control policy that blocks USB mass storage devices by default. This can be deployed via Group Policy (Windows) or Intune (managed endpoints) and takes effect fleet-wide at the next policy refresh. Approved USB devices — encrypted corporate drives issued by IT — can be whitelisted by hardware ID. All other USB storage devices are blocked at the operating system level, before any file can be accessed.
USB HID device filtering must be deployed consistently across all endpoints. The Rubber Ducky devices succeeded because HID filtering was applied to only a subset of workstations. A consistent policy that blocks unknown HID devices — or that requires administrator approval before a new HID device is accepted — eliminates the keystroke injection vector entirely.
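A consistent HID policy can follow the same shape. USB interface class 0x03 identifies Human Interface Devices (keyboards, mice, and anything pretending to be one); the sketch below is an assumed policy model, not a specific product's behaviour:

```python
HID_CLASS = 0x03  # USB interface class code for Human Interface Devices

def hid_device_action(interface_class: int, known_device: bool) -> str:
    """Sketch of a consistent HID policy: previously approved keyboards and
    mice work normally; an unknown HID device (such as a keystroke injector
    posing as a keyboard) is held until an administrator approves it."""
    if interface_class != HID_CLASS:
        return "not a HID device, out of scope for this policy"
    return "allow" if known_device else "quarantine pending admin approval"

print(hid_device_action(0x03, known_device=False))  # a freshly inserted Rubber Ducky
```

Applied fleet-wide, this closes the gap the Type D devices exploited: the two workstations that blocked them were configured this way; the three that executed the payload were not.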
A secure USB scanning kiosk provides a safe alternative for staff who find USB drives. Rather than connecting the drive to their workstation, staff are directed to hand it to IT or scan it on a dedicated, isolated kiosk that examines the contents in a sandboxed environment. This accommodates the helpful impulse — the desire to identify the owner and return the drive — without exposing the corporate network.
PowerShell access for standard users should be restricted. The majority of the payloads in this exercise relied on PowerShell for their callback mechanism. Restricting PowerShell to Constrained Language Mode for non-administrative users — or blocking it entirely for roles that do not require it — eliminates the execution mechanism used by the macro, LNK, and Rubber Ducky payloads.
Finally, awareness training should be updated to include the results of this exercise — anonymised and presented constructively. Telling people 'don't plug in USB drives' is abstract. Showing them that seventy-five per cent of their colleagues did exactly that, despite the same training, makes the risk concrete and personal. The most effective training is not a module — it is a story.
USB drop exercises produce uncomfortable results. A seventy-five per cent success rate, six months after targeted awareness training, is difficult to present to a board as evidence that the security programme is working. But the discomfort is the point. The exercise does not measure whether the training was good — it measures whether the training changed behaviour. And the answer, consistently, is: not enough.
This is not a failure of the people. It is a failure of the strategy. A strategy that relies on four hundred individuals making the correct security decision every time, in every situation, under every emotional condition, is a strategy that will fail. A strategy that deploys technical controls to ensure that the incorrect decision has no consequences is a strategy that succeeds regardless of human behaviour.
Block USB storage by default. Filter HID devices. Restrict PowerShell. Deploy application whitelisting. Then train people as well — not because training is the primary control, but because it is a valuable layer in a defence that does not depend on it.
Until next time — stay sharp, stay curious, and if you find a USB drive labelled 'Holiday Photos' in your car park, hand it to IT. They are expecting it.
This article describes a controlled USB drop exercise conducted under formal engagement with full written authorisation from the client. No malware was deployed. No data was accessed, exfiltrated, or compromised. The USB payloads generated network callbacks to attacker-controlled infrastructure for measurement purposes only. All identifying details have been altered or omitted to preserve client confidentiality. No individual employees are identified in this article or in the assessment report. Unauthorised deployment of USB devices intended to compromise computer systems is a criminal offence under the Computer Misuse Act 1990. Do not attempt to replicate these techniques without proper authorisation.
Hedgehog Security conducts controlled USB drop exercises that measure both human behaviour and technical control effectiveness. We test whether your staff will connect unknown devices — and whether your endpoint controls will contain the impact when they do. The results are reported constructively, without naming individuals, and with actionable recommendations that focus on technical controls rather than blaming people for being human.