The Department of Defense (DoD) defines cyberspace as "a global domain within the information environment consisting of the interdependent networks of information technology infrastructures and resident data, including the Internet, telecommunications networks, computer systems, and embedded processors and controllers."
We will also refer to the DoD’s cyberspace layer model, which consists of three interrelated layers: (1) physical network, (2) logical network, and (3) cyber-persona.
The physical network layer consists of the Information Technology (IT) devices and infrastructure in the physical domains that provide storage, transport, and processing of information within cyberspace, to include data repositories and the connections that transfer data between network components. The physical network components include the hardware and infrastructure (e.g., computing devices, storage devices, network devices, and wired and wireless links). Components of the physical network layer require physical security measures to protect them from physical damage or unauthorized physical access, which can sometimes be leveraged to gain logical access. While geopolitical boundaries can easily and quickly be crossed in cyberspace, crossing these boundaries involves the principle of territorial sovereignty tied to the physical domains. Every physical component of cyberspace is owned by a public or private entity, which can control or restrict access to its components.
The logical network layer consists of those elements of the network related to one another in a way that is abstracted from the physical network, based on the logic programming (code) that drives network components (i.e., the relationships are not necessarily tied to a specific physical link or node but to their ability to be addressed logically and exchange or process data). Individual links and nodes are represented in the logical layer, but various distributed elements of cyberspace, including data, applications, and network processes, are not tied to a single node. A website may exist on multiple servers in multiple locations in the physical domains but is represented as a single Uniform Resource Locator (URL) on the World Wide Web (WWW). There are things with intrinsic value that exist only in the logical layer, such as digital currency, non-fungible tokens, or a retirement savings account.
The cyber-persona layer is a view of cyberspace created by abstracting and combining data from the logical network layer to develop descriptions of digital representations of an actor or entity identity in cyberspace (cyber-persona). The cyber-persona layer consists of network or IT user accounts, whether human or automated, and their relationships to one another. Cyber-personas may relate directly to an actual person or entity, incorporating some personal or organizational data (e.g., email and IP addresses, websites, phone numbers, Web forum logins, or financial account passwords). One individual may create and maintain multiple cyber-personas through use of multiple identifiers in cyberspace, such as separate work and personal email addresses, and different identities on different Web forums, chat rooms, and social networking sites, which may vary in the degree to which they are factually accurate. Conversely, a single cyber-persona can have multiple users, such as multiple hackers using the same malicious software (malware) control alias, multiple extremists using a single bank account, or all members of the same organization using the same email address.
Tenets of Cybersecurity
For purposes of this class, cybersecurity refers to the protection of the following three tenets (or ‘pillars’): confidentiality, integrity, and availability.
Confidentiality:
Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information.
For instance, midshipmen should not be able to see each other's grades on the Midshipmen Information Database System (MIDS), only their own. If you can view your classmate's grades, then the concept of confidentiality has been violated.
Integrity:
Guarding against improper information modification or destruction, including ensuring information non-repudiation and authenticity.
Modifying your Order of Merit (OOM) in MIDS would violate the principle of data integrity. Integrity would also be violated if you access MIDS using someone else's credentials (authentication) or spoof an email from someone authorized to make changes, such as a professor, to a system administrator (non-repudiation). Note that data integrity is different from the honor concept of integrity.
Availability:
Ensuring timely and reliable access to and use of information.
Not being able to log into MIDS to register for classes that are in high demand next semester violates the principle of availability.
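The first two tenets can be made concrete with a short Python sketch. The record names and grades below are hypothetical, not drawn from MIDS: a permission check models confidentiality, and a cryptographic hash detects a loss of integrity.

```python
import hashlib

# Hypothetical grade records (illustration only; not real MIDS data).
grades = {"mids_001": "A", "mids_002": "B+"}

def view_grade(requester: str, student: str) -> str:
    """Confidentiality: a user may only view their own record."""
    if requester != student:
        raise PermissionError("access denied: not your record")
    return grades[student]

def checksum(record: str) -> str:
    """Integrity: a hash of the record lets us detect tampering."""
    return hashlib.sha256(record.encode()).hexdigest()

before = checksum(grades["mids_001"])
grades["mids_001"] = "A+"        # improper modification
after = checksum(grades["mids_001"])
print(before == after)            # False: the change is detectable
```

Availability has no single code analogue; it is typically addressed through redundancy, capacity planning, and protections against denial of service.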
People, Processes, and Technology
Cyberspace is about more than hardware, software, or technical solutions. Simply put, there is no silver bullet for achieving cybersecurity. Instead, security comes from a combination of people, processes, and technology.
People make decisions, spot anomalies, and respond to the unexpected. They shape security through their actions and awareness. Even the most advanced tools can fail without engaged, informed users and leaders.
Processes provide structure and repeatability. They link human behavior to technical controls, turning policies into consistent action. Good processes reduce chaos, support accountability, and make security scalable.
Technology enables enforcement, automation, and visibility. It can detect threats, block attacks, and manage complexity—but only when aligned with strong processes and used by the right people in the right way.
A Brief History of Cyber
Below is a very brief history of developments in cyberspace. Cyber history is often told through inventions and inventors, but at its core, it is a story of interaction—how people design, adapt, misuse, and secure technology. At every stage, the evolution of cyberspace reflects a dynamic relationship between human behavior and technical capability. Whether building machines, creating new uses for them, or exposing their flaws, human choices have always shaped how technology functions—and how it fails.
The Beginnings of Computers: 19th Century - Early 20th Century
In 1821, Cambridge mathematician Charles Babbage (1791–1871) set out to build a calculating machine using mechanical parts. His initial "Difference Engine" was a decimal-based calculator; the successor he envisioned, the "Analytical Engine", used punch cards to execute arithmetic tasks. Ada Lovelace (1815–1852), who assisted Babbage, recognized the Analytical Engine as more than an arithmetic device—it was a general-purpose symbol manipulator capable of algebraic and alphabetical operations. For this insight, she is often credited as the first computer programmer (Swade 2001).
In 1936, fellow Cambridge mathematician Alan Turing (1912–1954) proved that a hypothetical machine—a "Turing machine"—using binary symbols could perform any computation representable by an algorithm (Turing 1936). This laid the foundation for modern digital computing. Turing applied these ideas during WWII at Bletchley Park, helping crack the German Enigma code (Thimbleby 2016).
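Turing's idea can be illustrated in a few lines of code. The sketch below is a toy Turing machine; the rule table, which increments a binary number, is my own illustration rather than Turing's original construction. A finite set of state-transition rules reads and writes symbols on an unbounded tape, which is all the machinery an algorithm needs.

```python
# Transition table: (state, symbol) -> (symbol to write, head move, next state).
# This hypothetical machine adds 1 to a binary number, rightmost bit first.
RULES = {
    ("carry", "1"): ("0", -1, "carry"),   # 1 + carry = 0, carry continues left
    ("carry", "0"): ("1", 0, "halt"),     # 0 + carry = 1, done
    ("carry", "_"): ("1", 0, "halt"),     # blank cell: extend the number
}

def run(tape: str, state: str = "carry") -> str:
    cells = dict(enumerate(tape))         # sparse, "unbounded" tape
    head = max(cells)                     # start at the rightmost symbol
    while state != "halt":
        write, move, state = RULES[(state, cells.get(head, "_"))]
        cells[head] = write
        head += move
    out = "".join(cells.get(i, "_") for i in range(min(cells), max(cells) + 1))
    return out.lstrip("_")

print(run("1011"))   # binary 11 + 1 -> "1100"
```

Despite its simplicity, this rule-table-plus-tape model captures everything a modern computer can compute, which is precisely Turing's 1936 result.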
Post World Wars
By 1946, the University of Pennsylvania built the Electronic Numerical Integrator and Computer (ENIAC), arguably the first general-purpose electronic computer. It covered 1,000 square feet and was 10 feet tall. Programs, once written, took several people to load by setting dials, cable connections, and switches. Around 50 vacuum tubes per day had to be replaced simply to keep ENIAC running (Grudin 2005). Such early computer projects employed people in three critical roles: management, programming, and operation.
Managers oversaw the design, development, and operation of the machine.
Programmers wrote the programs.
Operators carried out the tasks required to implement the program.
Electronic Numerical Integrator and Computer (ENIAC).
By the early 1950s, Grace Hopper (1906–1992) was working on the Bureau of Ordnance Computation Project at Harvard, applying Mark I technology to Navy problems. She developed the first compiler, A-0, allowing software to run on different machines. She also contributed to the creation of the Universal Automatic Computer (UNIVAC), the first commercially produced electronic digital computer in the United States, and she famously popularized the term "bug" after her team removed a moth from the Harvard Mark II.
Log entry describing "first computer bug." Courtesy of the Computer History Museum.
By the mid-1950s, Hopper led the development of Flow-Matic, the first English-command language, and helped standardize the Common Business Oriented Language (COBOL), still used in modern mainframe systems. She later worked as a senior consultant at Digital Equipment Corporation into her 80s.
Grace Hopper in front of UNIVAC magnetic tape drives, holding a COBOL programming manual. Courtesy of the Computer History Museum.
Hopper retired as a Rear Admiral in 1986 at age 79, the oldest active-duty officer at the time. Her work earned numerous honors, including 40 honorary degrees, the National Medal of Technology, and the Presidential Medal of Freedom (posthumously). Hopper Hall, opened in 2020, was the first academic building named after a woman at any of the major service academies. She was also the namesake for the USS Hopper (DDG-70), commissioned in 1996, and in 2023, Nvidia released the GH200 Grace Hopper™ Superchip for AI and high-performance computing (HPC) applications.
Networking and Interconnecting Computers: 1960s through the 1980s
Early computers were standalone systems, so security was largely a localized concern. This changed in the late 1960s with the creation of the Advanced Research Projects Agency Network (ARPANET), the precursor to today's internet. By December 1969, ARPANET connected computers at UCLA, UCSB, Stanford, and the University of Utah. Within six months, it expanded to MIT, Harvard, and Bolt, Beranek and Newman (BBN) (IIS 2015). As networks grew, so did concerns about data loss, identity verification, and intercepted communications. Computer security pioneer Willis H. Ware warned that "[T]here are human vulnerabilities throughout; individual acts can accidentally or deliberately jeopardize the protection of information in a system" (Ware 1967, 11). NSA scientist Bernard Peters similarly cautioned in 1967 that systems with remote terminal access were inherently insecure (Peters 1967).
Public awareness of these vulnerabilities surged after President Reagan viewed the 1983 film WarGames, in which a teenager nearly triggers a nuclear war by hacking into a Pentagon computer. Concerned, Reagan asked his advisors whether such a scenario was plausible. Their answer—"unfortunately, yes"—led to National Security Decision Directive 145 (NSDD-145) in 1984, placing the National Security Agency (NSA) in charge of securing government telecommunications (Warner 2012).
1983 thriller WarGames.
A preview of such risks had already emerged in the telephone system, the most sophisticated network of its time. In the 1960s and ’70s, early hackers known as "phone phreaks" exploited their understanding of circuits, switches, and tonal signals to bypass fees and access restricted parts of the network (Orth 1971; Rosenbaum 1971). Beyond technical ingenuity, they also used "social engineering"—manipulating phone company staff into divulging confidential information (Draper 2001). This marked the emergence of the "hacker mentality", a mindset that valued technical exploration over compliance with rules (Hatfield 2018).
Steve Jobs and Steve Wozniak, both former phreakers, went on to co-found Apple Computers, helping spark the personal computing revolution of the 1980s (Lee 2001). That same decade saw the adoption of TCP/IP, the invention of the World Wide Web in 1989, and the rise of the client-server model—further merging computing and telephony systems (Denning & Denning 2016).
Phreakers and computer hackers shared a "hacker ethic" that still shapes the digital underground today (Levy 1984; Steinmetz & Gerber 2015). Its core principles include:
Universal access to computers
Skill over credentials or status
Freedom of information
Skepticism of authority and centralization
Protection through anonymity and decentralization
Sharing of innovations
This ethic endures in the open-source software movement and hacktivist groups like Anonymous, which has used decentralized tactics—such as Distributed Denial of Service (DDoS) attacks—to challenge governments, corporations, and institutions (Olson 2013).
Blue box and phone phreaker John "Cap'n Crunch" Draper. This device produced tonal outputs; when combined with phreakers' knowledge of how telephone calls were routed over network trunk lines, it could trick the network into bypassing normal toll collection and allow phreakers to route calls on their own (Rosenbaum 1971). Draper discovered that a small toy whistle offered in Cap'n Crunch cereal boxes at the time produced the 2600 Hz tone necessary to access the Operator Mode used to connect calls anywhere in the world (Orth 1971, 28).
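The 2600 Hz tone itself was nothing exotic: it was simply a sine wave at that frequency, which the network's trunk equipment interpreted as a control signal. The short sketch below synthesizes such a tone digitally and sanity-checks its frequency by counting zero crossings; the sample rate and duration are arbitrary choices for illustration.

```python
import math

SAMPLE_RATE = 44100   # samples per second (arbitrary choice for this sketch)
FREQ = 2600           # Hz: the trunk-signaling tone the whistle produced
DURATION = 0.5        # seconds

# Synthesize the tone as a list of amplitude samples in [-1, 1].
samples = [math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
           for n in range(int(SAMPLE_RATE * DURATION))]

# A sine wave crosses zero twice per cycle, so counting sign changes
# between consecutive samples recovers an estimate of the frequency.
crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
estimated_hz = crossings / (2 * DURATION)
print(round(estimated_hz))   # close to 2600
```

The vulnerability was not the tone but the design decision behind it: the network carried control signals in the same audio band as voice, so anyone who could whistle the right pitch could speak the network's own command language.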
Computing Into the 21st Century
The first website in the world was published and hosted in 1991 by Tim Berners-Lee, a British scientist working at CERN, who had proposed the World Wide Web (WWW) in 1989. Within a few years, the WWW software was made available in the public domain with an open license, allowing the Web to grow into what it is today.
Screenshot of the recreated page of the first website.
In 1991, Linus Torvalds created the Linux operating system (OS) out of a need for an alternative to proprietary systems such as Microsoft's Disk Operating System (MS-DOS). What set Linux apart was that Torvalds released its source code on the Internet for free, accessible to anyone anywhere in the world. His philosophy was that if the software was free, anyone with an interest in computer programming could modify the system and improve it over time. By 1999, an estimated seven million computers ran Linux (Himanen 2001).
Official Linux logo from the linux.org website in Feb 1998. Courtesy of the WayBackMachine.
As technology advanced, so did efforts to organize and operate DoD forces in cyberspace. In 2008, a piece of malware known as "agent.btz" infiltrated classified computer systems of U.S. Central Command, likely spreading through the use of a USB drive. The military's effort to contain and eliminate the malware, known as Operation Buckshot Yankee, identified critical gaps in the military's cybersecurity posture and contributed significantly to the creation of U.S. Cyber Command as the leading DoD entity in cyberspace, responsible for planning and executing global cyber operations. Cyber Command continues to evolve, advancing in 2018 from a sub-unified command under Strategic Command into its own four-star, unified combatant command, and changes continue even today.
A few years later, the Navy conducted its first officially named cyber operation, Operation Rolling Tide, in response to an Iranian intrusion into an unclassified Navy network in 2013. VADM Jan Tighe, the former commander of Fleet Cyber Command, said this operation "influenced pretty much every process I have and every investment that I have." Navy cyber policy, personnel, and operations continue to evolve; in 2023, for instance, the Department of the Navy released its first Cyber Strategy, created the Maritime Cyber Warfare Officer (MCWO) designator, and re-designated the Cryptologic Technician - Networks (CTN) rating as the Cyber Warfare Technician (CWT) rating.
Major Cybersecurity Incidents
Over the past few decades, an ever-growing number of high-profile incidents have caught the attention of governments, organizations, companies, and individuals around the world. Understanding these incidents and their implications will help Navy and Marine Corps officers understand the challenges they may confront in the fleet and how to avoid becoming the subject of a future cyber incident.
Recent Cyber Trends: COVID-19
Trends have changed considerably in this ever-evolving field, especially since COVID-19. The firm PurpleSec estimates that cybercrime increased 600% over the course of the pandemic.
Below is a (small) sampling of the most significant and impactful cyber incidents in recent years:
Morris Worm: While simple by today's standards, this 1988 program - named for its creator, 23-year-old Robert Morris - is considered the first major computer worm on the internet. The program replicated itself across the internet, inadvertently crashing roughly 10% of all online machines. Morris became the first person convicted under the Computer Fraud and Abuse Act, passed in 1986.
Stuxnet Worm: Uncovered in 2010, with development potentially going back as far as 2005, "Stuxnet" was a highly complex piece of malicious software (malware) targeting centrifuges used in Iran’s nuclear enrichment facility at Natanz. It was perhaps the first cyberweapon to physically destroy industrial infrastructure. Stuxnet was believed to be a joint US-Israeli effort, although neither country has ever publicly admitted to the operation.
Advanced Persistent Threat 1 (APT1): In 2013, cybersecurity firm Mandiant publicly reported a Chinese military unit - PLA Unit 61398, also known as APT1 - was responsible for over 140 cyber intrusions targeting U.S. companies and organizations. The report detailed a years-long campaign of economic espionage that focused on stealing trade secrets, blueprints, and other sensitive business information. Not only were the technical and operational details revealing, but the APT1 report was the first to publicly attribute such a campaign to a specific foreign military unit, influencing how cyber threats were discussed, increasing pressure on governments to respond to state-sponsored cyber operations more openly.
Sony Pictures Entertainment hack: In November 2014, hackers breached Sony Pictures Entertainment, stealing and releasing sensitive corporate data including emails, unreleased films, and employee records. The attackers also wiped company systems, locking out employees and crippling operations for weeks. The attack was retaliation for Sony’s planned release of The Interview, a comedy film depicting the assassination of North Korean leader Kim Jong-Un, and was publicly attributed by U.S. officials to North Korean hackers.
Office of Personnel Management (OPM) data breach: In April 2015, the Office of Personnel Management, the federal government's human resources department, announced a data breach affecting nearly 22 million government employees, many of them military members (potentially including your instructor). Most of those affected only had their security clearance file information stolen, but an estimated 4 million employees also had Social Security numbers, addresses, fingerprints, performance evaluations, and job assignment information stolen. The U.S. government attributed the attack to China.
Petya/NotPetya: Petya and NotPetya were pieces of malware that targeted Microsoft Windows systems in 2016 and 2017. Petya functioned as ransomware, locking users out of their data and demanding payment for access. NotPetya, while resembling ransomware, was actually destructive malware with no recovery mechanism, designed solely to cause disruption. These attacks caused billions in damage and highlighted how countries like Russia and North Korea use cyber operations—whether for financial gain or strategic disruption—especially under economic sanctions.
Equifax data breach: In May 2017, hackers used a known vulnerability to gain access to the personal data systems of Equifax, one of the three major U.S. credit reporting companies. The attackers stole the personal data of 143 million people, putting every one of those victims at risk of financial and identity fraud. The hack occurred because Equifax had failed to patch the vulnerability in the two months after it was publicly disclosed. The breach brought attention to the private sector's responsibility to protect consumer data and is still a topic of intense debate.
SolarWinds supply chain attack: In early 2020, suspected Russian hackers inserted malicious code into SolarWinds' network management software, "Orion". When SolarWinds sent out software updates for Orion to its 33,000 customers, the malicious code created backdoors that enabled the hackers to spy on many different organizations, including the National Nuclear Security Administration, which maintains the U.S. nuclear stockpile.
Colonial Pipeline ransomware attack: In the spring of 2021, the hacking group DarkSide was suspected of launching an attack against the U.S. energy distribution company Colonial Pipeline. The ransomware attack forced Colonial Pipeline, which supplies a majority of the gasoline to the U.S. Eastern Seaboard, to pause operations for several days until a Bitcoin ransom was paid. The disruption caused increased gas prices and fears about the security of critical infrastructure.
Viasat satellite network attack: Cyber attacks extend into the space domain. Moments before Russian troops crossed the border into Ukraine on February 24, 2022, tens of thousands of Viasat satellite communication modems were knocked offline throughout Ukraine and other European countries by a cyber attack. Over 30,000 replacement modems had to be distributed to restore service to customers.
Volt Typhoon advisory: In February 2024, the Cybersecurity and Infrastructure Security Agency (CISA), Federal Bureau of Investigation (FBI), and NSA released a joint advisory about the Chinese cyber group "Volt Typhoon". The advisory warned critical infrastructure firms - such as those involved in communications, energy, transportation, and water and waste management - that Volt Typhoon was positioning itself on IT networks, with the possibility of conducting disruptive or destructive attacks against the U.S. during a major crisis or conflict.
National Public Data breach: In mid-2024, a cybercriminal group calling itself "USDoD" breached National Public Data, a data broker specializing in background checks. The group stole 2.9 billion records, including names, addresses, Social Security numbers, and information on relatives going back at least three decades.
Review Questions:
How does the Department of Defense define the cyberspace domain?
What are the interrelated layers of the cyberspace model?
What is the CIA-triad, and why does it matter for cybersecurity?
Name the key historical figures in the early days of computers.
Name important technical developments in the history of cyberspace.
Which military command has overall responsibility for US cyber operations?
What are some of the impacts of major cybersecurity incidents in the past decade?
Supplemental Media:
Cyberspace Threat Map
Cyber Workforce
Cyberspace is taking the workforce by storm. Check out this infographic on where cyber jobs are and what a career in the cyberspace domain could look like.
D. E. Denning, "The United States vs. Craig Neidorf: A Debate on Electronic Publishing, Constitutional Rights and Hacking," Communications of the Association for Computing Machinery, vol. 34 no. 3, pp. 22-43, 1991.
P. J. Denning and D. E. Denning, "Cybersecurity is Harder than Building Bridges," American Scientist, vol. 104, no. 3, pp. 1-6, 2016.
L. Greiner, "Hacking your network’s weakest link – you," Network Magazine, vol. 12 no.1 pp. 9-12, 2008.
J. Grudin, "Three Faces of Human-Computer Interaction," IEEE Annals of the History of Computing, vol. 27 no.4, pp. 46-62, 2005.
J. Hatfield, "Social Engineering in Cybersecurity: The Evolution of a Concept," Computers & Security, vol. 73, pp. 102-113, 2018.
F. D. Kramer, S. H. Starr, L. K. Wentz, and National Defense University. Center For Technology And National Security Policy, Cyberpower and national security. Washington, D.C: Center For Technology And National Security Policy, 2009.
S. Levy, Hackers: Heroes of the Computer Revolution. New York: Doubleday, 1984.
W. J. Lynn, “Defending a New Domain: The Pentagon’s Cyberstrategy,” Foreign Affairs, vol. 89, no. 5, pp. 97–108, 2010, Available: https://www.jstor.org/stable/20788647
K. Steinmetz and J. Gerber, "'It Doesn't Have to be This Way': Hacker Perspectives on Privacy," Social Justice, vol. 41, no. 3, p. 29, 2015.
D. Swade, The Difference Engine: Charles Babbage and the Quest to Build the First Computer. New York: Viking-Penguin, 2001.
H. Thimbleby, "Human Factors and Missed Solutions to Enigma Design Weaknesses," Cryptologia, vol. 40, no. 2, pp. 177-202, 2016.
A. Turing, "On Computable Numbers, with an Application to the Entscheidungsproblem [Decision Problem]," Proceedings of the London Mathematical Society, vol. s2-42, no. 1, pp. 230-265, 1936.
M. Warner, "Cybersecurity: A Pre-History," Intelligence and National Security, vol. 27, no. 5, pp. 781-799, 2012.