A week on from the Yahoo and eBay attacks and the picture is clearer. By the end of the week, Amazon, CNN, ZDNet, E*TRADE, Datek and several smaller commercial sites had been hit. The cumulative outage time across the targets adds up to several days of major-site unavailability. The press coverage has been continuous and global.
The Royal Canadian Mounted Police are now investigating leads suggesting a single individual — apparently a teenager — operating from Quebec, using compromised hosts at universities and other institutions. The handle being reported is Mafiaboy. If this identification holds up, the attacks will turn out to have been launched by a 14- or 15-year-old running standard tools from his bedroom.
Let me try to extract the structural lessons.
Lesson one: the asymmetry has tilted decisively
A single individual, with limited resources, has produced what is on any sensible reckoning the largest sustained denial-of-service campaign in internet history, against the best-resourced commercial sites on the network. The targets have, between them, enormous engineering capability, large security budgets, and direct access to the major carriers.
The defenders were, against this attack, helpless: outages of three to four hours at each target, with targets dropping back online when the attacker chose to switch to the next one, not because the defence had succeeded.
The asymmetry between attacker resources and defender capacity to respond has, in a single week, become front-page news in a way the technical community has been writing about for over a year. The actual cost ratio — what the attacker spent in time and effort versus what the targets lost in revenue and engineering response — is grotesque. This is not a sustainable equilibrium for commercial internet operations.
Lesson two: the toolkit is now packaged
From the technical reporting, the tools used appear to be variants of Stacheldraht and Tribe Flood Network, running across compromised hosts at North American universities and a smaller number of commercial sites.
The attacker did not write these tools. They obtained them — either from a public underground source, or from a private one — and used them. The skill required to use the toolkit is, on the available evidence, modest. The skill required to create the toolkit is significantly higher. The two skills are separated in the current ecosystem.
This separation is what has shifted the threat model. A determined attacker with high technical skill is rare. A teenager who has obtained someone else's tools and has the patience to use them is not rare. The barrier to entry for large-scale denial of service has effectively collapsed.
For the next decade or more, this is going to be the shape of the threat: tools written by a small number of skilled developers, used by a larger number of operators, with the operators' skill being mostly in target selection and patience rather than in technical depth.
Lesson three: capacity defence does not scale
The defensive response Yahoo and the other targets used appears to have been broadly: absorb the flood by having sufficient bandwidth and processing capacity, while working with upstream carriers to apply emergency filters.
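For a sense of what "emergency filters" means in practice, here is a sketch in Cisco IOS style. Everything in it is illustrative — the addresses are documentation prefixes, not the real victim networks, and the actual filters the carriers applied have not been published:

```
! Sketch of an emergency upstream filter, Cisco IOS style.
! Addresses and interface names are illustrative only.
! Drop the flood traffic (here, ICMP and UDP) aimed at the
! victim's address block, while letting everything else through.
access-list 150 deny   icmp any 192.0.2.0 0.0.0.255
access-list 150 deny   udp  any 192.0.2.0 0.0.0.255
access-list 150 permit ip   any any
!
interface Serial0/0
 ip access-group 150 in
```

The weakness is visible in the sketch itself: a filter this blunt drops legitimate traffic to the victim along with the flood, which is why it only ends the outage rather than preventing it.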
This worked in the sense that each target was eventually back online. It did not work in the sense of preventing the outages.
The capacity-based defence has economic limits. Doubling Yahoo's bandwidth doubles the cost; doubling the attacker's effort costs essentially nothing. The cost curve favours the attacker by orders of magnitude.
For the next iteration of this trajectory, capacity will not be enough. Other defences — coordinated upstream filtering, source-address validation, attack-traffic identification at carrier scale — have to be the answer. None of these are deployed at scale today.
Lesson four: BCP 38 enforcement is now actually possible
For over two years, source-address validation has been a known operational best practice that almost nobody has actually adopted. The reasons were structural — operator cost for someone else's benefit, no enforcement mechanism, no commercial pressure.
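Mechanically, BCP 38 filtering at a customer-facing edge is not complicated — which is part of why the non-deployment is so galling. A sketch in Cisco IOS style, with an illustrative customer prefix standing in for whatever block is actually assigned:

```
! Sketch of BCP 38 source-address validation at a customer edge,
! Cisco IOS style. The prefix is a placeholder, not a real assignment.
! Accept a packet only if its source address falls inside the block
! actually allocated to this customer; log and drop everything else.
access-list 110 permit ip 203.0.113.0 0.0.0.255 any
access-list 110 deny   ip any any log
!
interface Serial1/0
 description Customer uplink
 ip access-group 110 in
```

A network filtered this way can still be used to flood someone, but it cannot spoof source addresses, which makes tracing the attack back to the compromised hosts vastly easier.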
This week's events have changed the commercial pressure. Yahoo's outage cost them, by various estimates, several million dollars in direct revenue plus a substantial stock-price hit. Yahoo has the resources, the relationships, and now the motivation to demand BCP 38 enforcement from anyone they peer with.
Within weeks, I expect, the largest internet operators will be requiring source-address validation as a condition of peering. Smaller operators will follow. By year-end I would not be surprised to see a published industry-norm standard making egress filtering effectively mandatory for any network that wants to interoperate with the major commercial sites.
This is the first time I have seen the commercial pressure aligned with the operational best practice. It might actually shift the deployment.
Lesson five: the legal and regulatory response is now inevitable
Distributed denial of service was, until last week, a topic for technical lists and academic conferences. It is now a topic for Senate committees and MPs.
Within a year, I expect:
- Specific legislation in the US, the UK, and probably the EU criminalising DDoS attacks at higher penalty levels than general computer-misuse statutes.
- International cooperation frameworks specifically for cyber-attack investigation.
- Funding for law-enforcement training in this area, which is currently sparse.
- Some kind of national-CERT-style coordination capability in jurisdictions that do not yet have one.
The substantive work behind these will be the harder thing. Statutes are easy. International cooperation in real-time response is hard. We will see how much actually gets built.
Lesson six: the offensive engineering will continue
The most important lesson, and the one I am writing down most reluctantly. The attack toolkit will improve. The defences I have just listed — egress filtering, coordinated response, capacity defence — will be deployed against the current attack pattern. The next iteration of attack will be designed against the deployed defences.
We should expect, over the next 12 to 18 months:
- Tools that do not use spoofed source addresses, defeating BCP 38 as a single defence.
- Tools that use legitimate-looking traffic patterns (slower attacks, well-formed packets, traffic that mimics real users) to defeat statistical anomaly detection.
- Tools that exploit application-layer weaknesses rather than network-layer ones — burning CPU on the target rather than bandwidth on the link.
- Larger compromised host populations, drawn from the rapidly-growing pool of always-on home computers with cable and DSL connections.
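The second item on that list is worth making concrete. A sketch in Python, with all numbers illustrative, of why a naive rate threshold — the simplest statistical defence — fails against slower, distributed traffic:

```python
# A naive rate-threshold detector, sketched to show why slower,
# well-formed attack traffic evades it. All numbers are illustrative.

def flagged(packets_per_second, threshold=1000):
    """Flag a source only when it exceeds a fixed per-source packet rate."""
    return packets_per_second > threshold

# A crude flood from a single source trips the threshold immediately.
assert flagged(50_000)

# The same aggregate load, spread across 500 sources at 100 pps each,
# looks like ordinary users and passes the detector untouched.
assert not any(flagged(100) for _ in range(500))
```

Anything smarter has to reason about aggregate behaviour rather than per-source rates, and at that point the attacker's counter-move is to make the aggregate look like a flash crowd.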
The defenders' task is Sisyphean in the technical sense — every defence eventually becomes obsolete and the work continues at the new level. The same was true of memory-corruption defences, fingerprinting defences, mail-filtering defences. The shape of the discipline is to continually pay this cost.
What I am taking from this for my own work
A few things, written down for my own future use.
The honeypot project remains worth pursuing. The captures it has produced are giving me visibility into post-compromise behaviour I cannot get any other way. As the offensive landscape evolves, the honeypot is the likeliest source of intelligence I have at my scale.
Off-host logging remains the right discipline. Every time the attack ecosystem evolves, the on-host artefacts become less reliable. Off-host logging continues to be the only structurally sound way to capture what happened.
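The minimal version of that discipline, assuming a classic BSD syslogd and a dedicated loghost (the hostname here is a placeholder):

```
# /etc/syslog.conf on each monitored host: a sketch, not a full policy.
# Keep a local copy for day-to-day use, and duplicate everything to a
# separate loghost, so an intruder who edits the local files cannot
# rewrite the off-host record.
*.info                          /var/log/messages
*.*                             @loghost.example.com
```

The loghost itself then becomes the thing to harden: no services beyond syslog, no trust relationships back to the hosts it records.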
Egress filtering on my own networks remains correct. Whether or not the major operators end up deploying BCP 38 widely, my own contribution is to keep my networks from being part of the problem. This is small in itself; it is the right thing to do anyway.
The discipline of writing about it remains valuable. Two years of writing have given me a structured place to think through these incidents. The aggregate of those posts has, at this point, become a useful private archive for me — when I want to remember what I thought when something happened, the post is there.
More on this through the rest of the year. Whatever 2000 turns out to be, it is going to be more interesting than 1999. The first six weeks have already proved that.