

20 Biggest OPSEC Failures in Military History

Stolen code books. Fitness app disasters. Classified documents on Discord.
Scored by severity, preventability, and historical impact.

Scoring System

Severity (/10): How bad was the damage?

Preventability (/10): How avoidable was it?

Historical Impact (/10): Did it change history?

Total = Severity + Preventability + Historical Impact (max 30)

The Rankings

#1

29/30

The Zimmermann Telegram

1917
SEV 10 · PRV 9 · IMP 10

Germany sent a coded diplomatic telegram to Mexico proposing a military alliance against the United States, offering Texas, New Mexico, and Arizona as incentives. British intelligence intercepted and decrypted it. Germany had transmitted the message over a US State Department cable it had been granted permission to use for diplomatic traffic, a line that relayed through Britain. It broadcast its war plans against America through American infrastructure, within reach of British codebreakers.

Consequence: The telegram was published in American newspapers and became the single biggest catalyst for the US entering World War I. Germany lost the war. Mexico never got Texas.

OPSEC Lesson: Never send your most sensitive war plans through communications infrastructure controlled by your adversary. If you must, assume it will be read.

#2

28/30

The Battle of Midway Code Break

1942
SEV 10 · PRV 8 · IMP 10

US Navy cryptanalysts at Station HYPO cracked Japan's JN-25 naval code and discovered the target of Japan's next major offensive was "AF." To confirm AF meant Midway, they had Midway send a fake uncoded message about a water purification problem. Japan's encrypted traffic soon reported AF had a water shortage. The code was broken, and the ambush was set.

Consequence: The US Navy destroyed four Japanese aircraft carriers at Midway — the backbone of Japan's offensive naval capability. The battle turned the entire Pacific War. Japan never recovered the strategic initiative.

OPSEC Lesson: Rotating encryption keys and codes is not optional. Japan used JN-25 far too long without a full reset, giving cryptanalysts the volume of intercepts they needed to break it.
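The volume problem is easy to demonstrate with a toy cipher. JN-25 was a codebook-plus-additive system, not a simple shift, so this is purely illustrative: with enough intercepted traffic under one unchanged key, basic frequency statistics give the key away.

```python
from collections import Counter

# Standard English letter frequencies (percent), A through Z.
ENGLISH = [8.2, 1.5, 2.8, 4.3, 12.7, 2.2, 2.0, 6.1, 7.0, 0.15, 0.77, 4.0,
           2.4, 6.7, 7.5, 1.9, 0.095, 6.0, 6.3, 9.1, 2.8, 0.98, 2.4, 0.15,
           2.0, 0.074]

def encrypt(plaintext: str, shift: int) -> str:
    """Toy Caesar cipher: shift each letter, drop everything else."""
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65)
                   for c in plaintext.upper() if c.isalpha())

def best_shift(ciphertext: str) -> int:
    """Recover the shift with a chi-squared fit against English frequencies."""
    counts = Counter(ciphertext)
    n = len(ciphertext)

    def chi2(shift: int) -> float:
        # Compare observed counts to the counts expected if this shift were right.
        return sum((counts.get(chr((i + shift) % 26 + 65), 0) - n * f / 100) ** 2
                   / (n * f / 100)
                   for i, f in enumerate(ENGLISH))

    return min(range(26), key=chi2)

# Sustained traffic encrypted under one never-rotated key.
traffic = ("the enemy will attack the island at dawn send all available "
           "carriers and screening destroyers to the rendezvous point ") * 20

print(best_shift(encrypt(traffic, 7)))  # 7: enough volume recovers the key
```

With only a dozen intercepted characters, the same statistics are noise. Rotation limits how much traffic ever accumulates under a single key, which is exactly what Japan failed to do.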

#3

27/30

The Capture of Enigma Machines and Code Books

1940–1945
SEV 10 · PRV 7 · IMP 10

Over the course of WWII, the Allies captured multiple Enigma machines and code books from German U-boats and weather ships. The seizure of U-110 in 1941 gave Bletchley Park the materials to crack Naval Enigma, while the capture of U-559's short weather signal codebooks in 1942 broke the upgraded four-rotor Enigma (Shark) that had blinded Allied intelligence for months.

Consequence: Breaking Enigma allowed the Allies to reroute convoys away from U-boat wolf packs, saving millions of tons of shipping. Some historians estimate the intelligence shortened the war by as much as two years and saved as many as 14 million lives.

OPSEC Lesson: Physical security of encryption hardware is as important as the algorithm itself. Germany never fully accepted that Enigma had been compromised, attributing intelligence leaks to spies rather than systemic cryptographic failure.

#4

28/30

The Snowden Leaks

2013
SEV 10 · PRV 9 · IMP 9

Edward Snowden, an NSA contractor working for Booz Allen Hamilton, walked out of an NSA facility in Hawaii with an estimated 1.5 million classified documents on thumb drives. His system administrator privileges let him reach files far beyond any plausible need to know. The NSA's internal monitoring systems failed to detect the largest intelligence breach in American history.

Consequence: The leaks revealed the global scope of NSA surveillance programs including PRISM, XKeyscore, and the bulk collection of phone metadata. Major tech companies implemented end-to-end encryption. US-EU data sharing agreements were invalidated. The intelligence community's relationship with Silicon Valley was permanently damaged.

OPSEC Lesson: The insider threat is the most dangerous threat. System administrators should never have unchecked access to the entire classified network. Two-person integrity rules and proper access controls exist for a reason.
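The two-person integrity rule mentioned here can be modeled in a few lines. This is an illustrative sketch, not any agency's real control system: the point is simply that a sensitive action requires sign-off from a second, distinct human, so no lone administrator can quietly act.

```python
class TwoPersonControl:
    """Toy model of a two-person integrity rule for sensitive actions."""

    def __init__(self):
        self.pending = {}  # action id -> admin who requested it

    def request(self, action_id: str, admin: str) -> None:
        self.pending[action_id] = admin

    def approve(self, action_id: str, approver: str) -> bool:
        requester = self.pending.get(action_id)
        if requester is None:
            return False  # nothing pending under that id
        if approver == requester:
            return False  # self-approval would defeat the control
        del self.pending[action_id]
        return True       # two distinct people signed off

ctrl = TwoPersonControl()
ctrl.request("bulk-export-042", admin="sysadmin_a")
print(ctrl.approve("bulk-export-042", approver="sysadmin_a"))  # False
print(ctrl.approve("bulk-export-042", approver="sysadmin_b"))  # True
```

The action and user names are invented. The design choice worth noticing: the check is enforced in the control path itself, not in policy documents, which is the difference this list keeps illustrating.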

#5

28/30

Chelsea Manning and WikiLeaks

2010
SEV 9 · PRV 10 · IMP 9

Army intelligence analyst Chelsea Manning downloaded approximately 750,000 classified military and diplomatic documents from the Secret Internet Protocol Router Network (SIPRNet) and leaked them to WikiLeaks. Manning smuggled the data out of a SCIF on a rewritable CD labeled "Lady Gaga." The classified network had virtually no data loss prevention controls.

Consequence: The leaked documents included the Collateral Murder video, the Afghan and Iraq War Logs, and 250,000 diplomatic cables. The releases exposed civilian casualties, diplomatic back-channels, and intelligence sources. Multiple informants in Afghanistan were potentially compromised. The diplomatic fallout lasted years.

OPSEC Lesson: Classified networks need data loss prevention just like corporate networks. If a junior analyst can download three-quarters of a million documents to a CD labeled "Lady Gaga," your security architecture has fundamental design flaws.

#6

27/30

Strava Heatmap Exposes Global Military Bases

2018
SEV 9 · PRV 10 · IMP 8

Strava published its Global Heatmap showing the aggregated GPS activity of 27 million users. In populated areas, it was data art. In Afghanistan, Syria, Somalia, and other conflict zones, it lit up secret forward operating bases, black sites, and patrol routes in brilliant neon on an otherwise dark map. Classified bases that did not officially exist became visible to anyone with a web browser.

Consequence: The Pentagon banned GPS-enabled devices in deployed environments. Every branch of the US military rewrote electronic device policies. Foreign militaries scrambled to assess their own exposure. The incident became the defining case study in fitness tracker OPSEC failure.

OPSEC Lesson: Default settings are the enemy of security. Soldiers never changed Strava's default public settings because nobody told them their 5K around a classified base would be aggregated into a global intelligence product.
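The aggregation mechanism is worth seeing concretely. A minimal sketch with invented coordinates, not Strava's actual pipeline: each upload is individually unremarkable, but binning every default-public activity into a grid makes any repeatedly used route glow, and in an otherwise empty region a single bright cell is a finding.

```python
from collections import Counter

def heatmap(activities, cell=0.01):
    """Bin GPS points from public activities into lat/lon grid cells."""
    grid = Counter()
    for track in activities:
        for lat, lon in track:
            grid[(round(lat / cell), round(lon / cell))] += 1
    return grid

# Hypothetical data: one perimeter jog at a remote site, uploaded daily
# for 200 days with the app's default public visibility.
daily_jog = [(34.5201 + i * 0.0001, 65.3100) for i in range(10)]
grid = heatmap([daily_jog] * 200)

# In a city this cell would be one bright spot among thousands; out here
# it is the only one on the map.
hot = [c for c, hits in grid.items() if hits > 500]
print(len(hot), max(grid.values()))  # 1 2000
```

No single run leaks much; the intelligence product is the sum. That is why per-user privacy defaults, not just aggregate anonymization, were the failure point.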

#7

25/30

Tracking Bin Laden's Courier

2011
SEV 8 · PRV 7 · IMP 10

The CIA spent years tracking Abu Ahmed al-Kuwaiti, a courier they believed was in contact with Osama bin Laden. Al-Kuwaiti used a cell phone that the NSA was able to track to a compound in Abbottabad, Pakistan. Despite al-Qaeda's strict communications security, the courier's pattern of life — including his phone usage and driving routes — created a trail that led directly to the most wanted man in the world.

Consequence: Navy SEALs raided the Abbottabad compound on May 2, 2011, and killed Osama bin Laden. The operation also recovered a massive trove of intelligence documents and hard drives from the compound.

OPSEC Lesson: A security chain is only as strong as its weakest human link. Bin Laden himself used no electronic devices, but his courier's phone habits betrayed the entire operation. One person's OPSEC failure becomes everyone's OPSEC failure.

#8

23/30

Loose Lips Sink Ships — And They Literally Did

1942–1944
SEV 8 · PRV 6 · IMP 9

During WWII, careless talk by military personnel and civilians leaked shipping schedules, convoy routes, and troop movements. Bars near military ports were prime intelligence-gathering locations for Axis agents. Congressman Andrew J. May publicly revealed that Japanese depth charges were set too shallow, directly compromising a tactical advantage that was saving American submarines.

Consequence: German U-boats sank over 3,500 Allied merchant ships during the Battle of the Atlantic, killing over 36,000 sailors. While not all losses were due to loose talk, the problem was severe enough that the US government launched one of the most famous propaganda campaigns in history.

OPSEC Lesson: The original OPSEC lesson. Information shared casually in social settings reaches adversaries faster than you think. The "Loose Lips Sink Ships" campaign exists because people literally talked ships into torpedo range.

#9

26/30

Strava Reveals Russian Troop Positions in Ukraine

2022
SEV 9 · PRV 10 · IMP 7

Russian soldiers uploaded running and cycling routes to Strava from military bases, staging areas, and forward positions before and during the invasion of Ukraine. OSINT analysts used the GPS data to track unit movements, verify intelligence about the Russian buildup, and identify individual service members. Some profiles included real names and unit affiliations.

Consequence: Strava data was combined with satellite imagery, social media, and intercepted communications to create open-source intelligence products that rivaled classified assessments. The data confirmed troop concentrations before the invasion was officially acknowledged and helped track Russian movements throughout the war.

OPSEC Lesson: Four years after the 2018 heatmap scandal, soldiers from a major military were still broadcasting their positions via fitness apps. Policy changes without enforcement and cultural change are meaningless.

#10

26/30

Discord Leaks — Jack Teixeira

2023
SEV 9 · PRV 10 · IMP 7

Jack Teixeira, a 21-year-old Massachusetts Air National Guard member, leaked hundreds of pages of highly classified intelligence documents on a Discord gaming server to impress his online friends. The documents included assessments of the Ukraine war, intelligence on allied nations, and signals intelligence capabilities. He photographed classified documents and posted them to a server with about two dozen members.

Consequence: The leaks revealed US intelligence assessments of the Ukraine war, exposed surveillance of allied governments, and compromised signals intelligence sources and methods. Teixeira was arrested and sentenced to 15 years in prison. The incident triggered a complete overhaul of how the military grants access to classified intelligence systems.

OPSEC Lesson: Access controls based on clearance level alone are insufficient. A 21-year-old junior airman should not have had access to top-secret CIA and NSA assessments. Need-to-know must be enforced, not assumed.

#11

23/30

The Cicero Affair

1943
SEV 8 · PRV 8 · IMP 7

Elyesa Bazna, a valet to the British Ambassador in Ankara, Turkey, photographed top-secret documents from the ambassador's safe using a Leica camera while the ambassador was sleeping. Codenamed "Cicero" by German intelligence, Bazna sold the photographs — including plans for the D-Day invasion — to the Germans for 300,000 pounds (paid in counterfeit bills).

Consequence: The Germans received advance intelligence about Allied invasion planning but, in a twist of irony, dismissed much of it as a British deception operation. The intelligence could have compromised D-Day had it been fully trusted and acted upon.

OPSEC Lesson: Physical document security requires more than a locked safe. The ambassador's valet had access to his bedroom, his keys, and his schedule. The most sophisticated encryption in the world is irrelevant if someone can photograph the plaintext.

#12

24/30

Polar Flow Exposes Intelligence Personnel

2018
SEV 9 · PRV 9 · IMP 6

Researchers from Bellingcat and De Correspondent demonstrated that Polar Flow — a fitness app competing with Strava — was even worse for OPSEC. Its API allowed anyone to pull the complete exercise history of any user, including precise GPS locations. They identified military and intelligence personnel exercising near the NSA, MI6, the French DGSE, Guantanamo Bay, and nuclear weapons storage facilities.

Consequence: Individual intelligence officers could be tracked from their workplace to their homes. Polar suspended its Explore feature. Multiple intelligence agencies launched internal reviews. The incident proved the fitness tracker OPSEC problem was industry-wide, not just a Strava issue.

OPSEC Lesson: Every fitness platform with GPS tracking has the same fundamental vulnerability. Focusing OPSEC guidance on one app while ignoring competitors creates a false sense of security.
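The underlying flaw generalizes: an endpoint that returns any user's full history for any requested ID, with no authorization check. A hypothetical in-memory sketch of that pattern, not Polar's real API and with invented users and coordinates: once histories are keyed by enumerable IDs, an attacker simply walks the ID space and filters by location.

```python
# Simulated backend: user id -> exercise start points (lat, lon).
USER_ACTIVITIES = {
    101: [(48.8584, 2.2945)],    # tourist jogging near the Eiffel Tower
    102: [(39.1080, -76.7710)],  # user exercising near a sensitive site
}

# Toy bounding box around the sensitive site: (south-west, north-east).
SENSITIVE_ZONES = [((39.10, -76.78), (39.12, -76.76))]

def get_history(user_id):
    """The flaw in one line: any caller gets any user's data, no auth."""
    return USER_ACTIVITIES.get(user_id, [])

def in_sensitive_zone(lat, lon):
    return any(s_lat <= lat <= n_lat and w_lon <= lon <= e_lon
               for (s_lat, w_lon), (n_lat, e_lon) in SENSITIVE_ZONES)

# The attack is just a loop over the ID space.
flagged = [uid for uid in range(100, 110)
           if any(in_sensitive_zone(lat, lon) for lat, lon in get_history(uid))]
print(flagged)  # [102]
```

Rate limits and opaque, non-sequential IDs raise the cost of this attack, but only a per-request authorization check removes it.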

#13

24/30

The Walker Spy Ring

1985
SEV 9 · PRV 7 · IMP 8

John Walker, a US Navy Chief Warrant Officer and communications specialist, sold cryptographic key material and classified documents to the Soviet Union for 17 years (1968–1985). Walker recruited his son, his brother, and a friend into the spy ring. The Soviets could decrypt over one million encrypted Navy messages, including submarine movements and nuclear war plans.

Consequence: The Soviets had real-time access to US Navy communications for nearly two decades. They knew submarine patrol routes, fleet movements, and war plans. Former Secretary of Defense Caspar Weinberger called it the most damaging espionage case in US history. Walker was only caught when his ex-wife reported him to the FBI.

OPSEC Lesson: Insider threats that persist for years indicate systemic monitoring failures. Walker passed polygraphs, maintained his clearance, and was never flagged despite his conspicuous spending and lifestyle changes.

#14

24/30

French Aircraft Carrier Position Broadcast via Strava

2026
SEV 8 · PRV 10 · IMP 6

A sailor aboard the French aircraft carrier Charles de Gaulle posted a jog on Strava while the ship was at sea. The carrier's exact GPS coordinates appeared on the public activity map. A nuclear-powered aircraft carrier — one of the most strategically valuable military assets on Earth — had its position broadcast to anyone with a web browser because someone wanted to log their 5K.

Consequence: The French Navy launched an investigation. Eight years after the original 2018 heatmap scandal, the same fundamental problem remained completely unsolved. A three-billion-euro warship, defeated by a free fitness app.

OPSEC Lesson: Technology policies without cultural enforcement fail indefinitely. Eight years of memos, briefings, and policy updates did not prevent a sailor from broadcasting an aircraft carrier's position for a 5K time.

#15

23/30

US Secret Service Agents Exposed on Strava

2023
SEV 8 · PRV 9 · IMP 6

Strava activities uploaded by US Secret Service agents revealed their movements and locations, including details about presidential security perimeters. Agents' running routes near the White House and at travel locations showed protective detail patterns, advance team movements, and security staging areas. Their profiles were publicly visible.

Consequence: The Secret Service issued new personal device policies. Security researchers demonstrated that any adversary could use the data to study protective detail patterns, identify advance teams, and potentially target individual agents for surveillance or recruitment.

OPSEC Lesson: Protective security personnel are high-value intelligence targets. Their fitness data reveals not just their own locations but the security architecture around the people they protect.

#16

23/30

Pearl Harbor Intelligence Failures

1941
SEV 7 · PRV 7 · IMP 9

The US had broken Japan's diplomatic cipher (PURPLE) and intercepted messages indicating Japan was preparing for war. However, intelligence was compartmentalized, warnings were not properly disseminated, and the local commanders at Pearl Harbor were not given the context they needed. A radar operator who detected the incoming Japanese attack was told to ignore it.

Consequence: The attack on December 7, 1941 destroyed or damaged 19 ships and 328 aircraft, and killed 2,403 Americans. The US entered World War II. The intelligence failure led to the creation of centralized intelligence coordination that eventually became the CIA.

OPSEC Lesson: Having intelligence is useless without proper dissemination and action protocols. The US had the signals; it lacked the system to act on them. Information silos kill.

#17

23/30

French Sahel Patrol Routes Exposed on Strava

2020
SEV 8 · PRV 10 · IMP 5

French soldiers deployed in the Sahel region of Africa uploaded patrol routes to Strava. The GPS traces showed exact paths, timing, frequency, and rest points of military patrols in active conflict zones where French forces were fighting jihadist insurgents. Anyone monitoring these routes could predict patrol schedules.

Consequence: The French military reinforced its operational security directives. The incident demonstrated that two years after the 2018 heatmap scandal, soldiers in active combat zones were still broadcasting their movements via fitness apps.

OPSEC Lesson: Lessons not enforced are lessons not learned. Policy documents that soldiers do not read or follow provide zero security.

#18

22/30

USS Pueblo Capture

1968
SEV 8 · PRV 7 · IMP 7

North Korea captured the USS Pueblo, a US Navy intelligence-gathering ship, in international waters. The crew failed to destroy classified documents and cryptographic equipment before capture. The ship carried NSA code machines, encryption keys, and thousands of classified documents that were seized intact by North Korean forces.

Consequence: The Soviets gained access to US Navy encryption equipment and code material. Combined with intelligence from the Walker spy ring, they could decrypt US military communications. One sailor was killed in the attack; the surviving 82 crew members were held prisoner for 11 months. North Korea still has the ship, displayed as a museum.

OPSEC Lesson: Emergency destruction procedures for classified material must be rehearsed and executable under combat conditions. The Pueblo's destruction protocols failed because the crew did not have enough time or working equipment to destroy the volume of material aboard.

#19

22/30

Vault 7 — CIA Hacking Tools Leaked

2017
SEV 8 · PRV 8 · IMP 6

WikiLeaks published "Vault 7," a collection of 8,761 documents detailing the CIA's cyber weapons and hacking capabilities. The leak revealed tools for compromising smartphones, smart TVs, vehicles, and encrypted messaging apps. The documents came from a CIA network that hundreds of contractors and employees could access with minimal oversight.

Consequence: The CIA's offensive cyber capabilities were publicly exposed. Adversaries could study and defend against the tools. The leak was attributed to Joshua Schulte, a CIA software engineer who had been in a workplace dispute. The CIA's ability to conduct covert cyber operations was set back years.

OPSEC Lesson: Disgruntled employees with broad access are a predictable threat vector. Schulte's workplace disputes were documented but his access was never restricted. Motive plus access equals breach.

#20

22/30

Shadow Brokers Leak NSA Hacking Tools

2016
SEV 8 · PRV 7 · IMP 7

A group calling itself the Shadow Brokers leaked a cache of NSA hacking tools and exploits, including EternalBlue — a Windows exploit that targeted the SMB protocol. The tools were stolen from the NSA's Tailored Access Operations unit, the agency's elite hacking division. The exact method of exfiltration remains disputed.

Consequence: EternalBlue was weaponized into the WannaCry ransomware attack that hit 230,000 computers in 150 countries, causing billions in damage. It also powered the NotPetya attack that devastated Ukraine and caused over $10 billion in global damages. NSA-built weapons were turned against the world the NSA was supposed to protect.

OPSEC Lesson: Offensive cyber weapons are inherently dual-use. If they are stolen — and they will be — they become weapons against your own infrastructure. The NSA built a gun, lost it, and watched it shoot hospitals.

Glen's Take

The through-line across all twenty of these is depressingly simple: humans are the weakest link. We build billion-dollar encryption systems and then leave the code books on a captured submarine. We classify documents at the highest level and then hand system admin access to a contractor with a thumb drive. We ban fitness trackers in combat zones and then a sailor logs a 5K on an aircraft carrier.

Technology improves. Human nature does not. The Zimmermann Telegram was sent in 1917. The Strava carrier leak happened in 2026. The failure mode is identical: someone assumed their communication channel was secure when it wasn't.

The best OPSEC in the world is useless if one person decides the rules don't apply to them.

Frequently Asked Questions

What is OPSEC?

OPSEC — Operational Security — is the process of identifying, controlling, and protecting critical information that could be used by adversaries. It originated as a formal military discipline during the Vietnam War when a team codenamed PURPLE DRAGON investigated how the enemy was anticipating US operations despite encrypted communications. The answer was usually mundane: patrol patterns, supply orders, and troop movements were observable through non-classified channels.

What is the biggest OPSEC failure in history?

By historical impact, the Zimmermann Telegram (1917) and the Battle of Midway code break (1942) are the strongest candidates — both directly changed the outcome of world wars. By preventability, the Strava heatmap incident (2018) and Chelsea Manning leak (2010) stand out because basic security controls would have prevented them entirely.

Are fitness tracker OPSEC failures still happening?

Yes. The 2026 French aircraft carrier incident — eight years after the original Strava heatmap scandal — proved that the problem remains unsolved. Policy changes without cultural enforcement and technical controls have repeatedly failed to prevent military personnel from broadcasting their locations via fitness apps.
