SY110: Intro to the Cyberspace Domain


Intro to the Cyberspace Domain

Learning Outcomes

After completing these activities you should be able to:

  • Identify significant historical figures who have contributed to the development of cyberspace
  • Describe technologies that have paved the way for the advancement of the Cyberspace domain
  • Describe the importance of information and the President's 60-Day Cyberspace Policy Review
  • Define and describe the cyberspace domain
  • Recall recent examples of major cyberspace incidents
  • Describe the DIKW hierarchy and what it represents
  • Identify the CIA triad and how it applies to cybersecurity


A Brief History of Cyber

From the early 1960s through the 1990s, the realization gradually spread that computer networks were vulnerable to information breaches, whether through spillage or offensive cyber operations. Sometimes this realization came from scientists deep within the national security infrastructure, such as National Security Agency (NSA) scientist Bernard Peters, who, in 1967, warned that security cannot be guaranteed in any “Multi-Programmed” computer system with remote terminal access (Peters 1967). At other times it came from quite unexpected directions. For example, the 1983 Hollywood thriller WarGames, in which a technically proficient high-schooler nearly causes a US-Soviet nuclear exchange by hacking into a top secret Pentagon computer, so impressed President Ronald Reagan, a former actor who loved films and received a private screening of the movie in the White House, that he asked his top security advisors whether US military networks were vulnerable to such intrusion. His question surprised them. Their answer, “unfortunately yes,” which came a week later, surprised Reagan even more and in 1984 led to National Security Decision Directive 145 (NSDD-145), the first national policy of its kind, which made the National Security Agency responsible for protecting government telecommunications systems (Warner 2012).

1983 thriller WarGames.

Unfortunately, throughout this history cyber operations had too often been understood as a purely technical activity: clever hackers attacking networks defended by equally smart technical defenders. Within this framework it seemed that technical superiority, such as faster processing, stronger encryption, or more advanced protocols, ultimately determined the security of networks. This point of view still captures the imagination of the general public. Yet this picture is completely wrong, for it overlooks the human element. Almost every prominent contemporary cyber operation has targeted human vulnerabilities as its central exploit, whether by exploiting an individual’s trust (2015 SWIFT banking hack), tricking people through impersonation (2015 Office of Personnel Management), preying on the human proclivity to fall for phishing scams (2016 DNC hack), capitalizing on an inability or unwillingness to patch software (2017 Petya, 2017 WannaCry), or relying on the human choice to use outdated cryptographic tools (2018 Under Armour).

That human beings comprise the most potent attack-vector comes as no surprise to cybersecurity researchers, who have come to see humans as the “weakest link” in any computer network (Greiner 2008). This lesson traces the history of human-computer interaction, from the earliest days of the Computer Age to how we look at and define cyberspace today.

Early Beginnings in Computers: 19th Century

In 1821, the Cambridge mathematician Charles Babbage (1791-1871) lamented the fact that the only way to check to see if an arithmetical result was correct was to either consult a pre-calculated reference table, which were often full of errors, or to have several human beings (called “computers” at that time) do the computation by hand and compare their results. Referencing the greatest invention of his day, the steam engine, a frustrated Babbage exclaimed, “I wish the sum could be calculated by steam!” He began his quest to design a calculating machine that would do arithmetic utilizing purely mechanical properties. The result was his “Difference Engine,” a decimal-based mechanical calculator that reliably computed basic arithmetical functions simply by setting up initial conditions and turning a hand crank. Eventually, Babbage’s research led him, in 1837, to conceive of the “Analytical Engine,” a greatly improved design that was never fully built within Babbage’s lifetime but was documented meticulously in his notebooks. The Analytical Engine was much more complex than the Difference Engine and utilized a punch-card system to run Babbage’s arithmetic tasks. Ada Lovelace (1815-1852), a mathematician and daughter of Lord Byron (the English Poet), assisted Babbage’s work and was the first to recognize that the Analytical Engine should not be understood simply as an arithmetical device but rather as a generalized symbol manipulator capable of algebraic and even alphabetical results. That is, Lovelace correctly recognized Babbage’s invention to be the first universal computing machine (Swade 2001). Lovelace is often credited as the first computer programmer because of her contributions.

Babbage’s Difference Engine.
Video: https://www.youtube.com/watch?v=XSkGY6LchJs

Nearly a century later, in 1936, another Cambridge mathematician named Alan Turing (1912-1954) formally proved that a hypothetical machine (later referred to as a “Turing machine”) using a binary pair of symbols could perform any mathematical computation if it were representable by an algorithm (Turing 1936). The idea that a machine could perform such rule-based computations, previously only accomplished by humans, was revolutionary and laid the groundwork for modern digital computers. Turing later utilized this insight in his contributions to the work of WWII-era codebreakers at Bletchley Park, which led to the cracking of the Enigma code utilized by the German military.

Despite the fact that Babbage and Turing both succeeded in producing machines that could replace human cognitive activities, the human element remained essential to the design and implementation of their work. For Babbage, humans were required to conceive of the machines in the first place, manufacture their parts, write the programs, set up their initial conditions, provide the mechanical force for their operation, and interpret the results (Swade 2001). At each stage, errors or malicious actors could thwart the intended outcome. With physical access to Babbage’s device, a crafty enemy could introduce errant initial settings or even sabotage the physical machine itself, perhaps by grinding away a single cog-tooth, thereby undermining the reliability of its results. In the case of the Enigma codebreakers, despite the fact that Enigma settings allowed for over 1.5 x 10^19 possible setting permutations, Bletchley analysts were able to break the code because human implementation errors and human-induced cryptographic design weaknesses limited the number of actual permutations to a number solvable by the machine-based decryption techniques available to them (Thimbleby 2016).

Electronic Numerical Integrator and Computer (ENIAC).

Post World Wars

By 1946, the University of Pennsylvania had built the Electronic Numerical Integrator and Computer (ENIAC), arguably the first general-purpose electronic computer. It covered 1,000 square feet and was 10 feet tall. Programs, once written, took several people to load by setting dials, cable connections, and switches. Around 50 vacuum tubes per day had to be replaced simply to keep ENIAC running (Grudin 2005). Through the 1950s, the Soviet Union also made critical advances in computer technology, particularly through their BESM-1, BESM-2, BESM-6, and M-20 computers, which were then the fastest and most powerful computers in Europe (IIS 2015). Such early computer projects employed people in three critical roles (Grudin 2005): management, programming, and operation.

  • Managers oversaw the design, development, and operation of the machine.
  • Programmers wrote the programs.
  • Operators carried out the tasks required to implement the program.

Each of these roles was subject to vulnerabilities. Managers could mismanage, or be caused to do so by malevolent actors seeking to undermine their activities. Programmers could write unsecure software that introduced vulnerabilities into the system. Operators could make mistakes or be induced into doing so, thereby affecting the implementation of the program. As technical sophistication and complexity grew, human-related vulnerabilities seemed to grow in tandem. From the earliest days, humans were essential elements to the functioning and security of computers.

Log entry describing the "first computer bug." Courtesy of the Computer History Museum.
Leading up to the 1950s, Rear Admiral Grace Hopper (1906-1992) was assigned to the Bureau of Ordnance Computation Project (BOCP) at Harvard University, where she worked on applying Mark I technology to solving US Navy equations and problems. That work led her to publish the first compiler for a programming language, known as Arithmetic Language version 0 (A-0), paving the way from machine-dependent programming code to software that could be compiled and run on multiple machines. She was also involved in the creation of the Universal Automatic Computer (UNIVAC), the first commercial all-electronic digital computer and the brainchild of the creators of ENIAC. Credited with "debugging" the Harvard Mark II after removing a moth from its relay contacts, she noted in her logbook on September 9, 1947 at 1545, "Relay #70 Panel F (moth) in relay. First actual case of a bug being found." It is for this reason that she is credited with popularizing the use of the word "bug" to describe a computer glitch.
Grace Hopper in front of UNIVAC magnetic tape drives, holding a COBOL programming manual. Courtesy of the Computer History Museum.

By the mid-1950s, Hopper led the release of FLOW-MATIC (B-0), the first English-like data processing language. She later served as the director of the Navy Programming Languages Group in the Navy’s Office of Information Systems Planning and developed validation software for the Common Business Oriented Language (COBOL) and its compiler as part of a COBOL standardization program for the Navy. COBOL is a standardized business computer language still widely used today in mainframe applications run by banks and insurance companies. In her later years, she was a senior consultant for Digital Equipment Corporation (DEC), where she worked well into her 80s.

Her contributions to the Navy’s computing infrastructure made her an invaluable asset to the service. After retiring from the Navy Reserve in 1966, at age 60, with the rank of Commander, she was recalled and continued to serve until 1986, when she retired as a Rear Admiral. At the time of her retirement, at 79 years of age, she was the oldest active-duty commissioned officer in the United States Navy. With 40 honorary degrees, the National Medal of Technology, the Defense Distinguished Service Medal, and (posthumously) the Presidential Medal of Freedom, Hopper gained national attention for her work with computers and was recognized internationally.

Throughout her entire career Hopper had to overcome many challenges, not only as a woman in the areas of Science, Technology, Engineering, and Mathematics (STEM) but also in the Navy. Inaugurated in 2020, Hopper Hall is the first building across the service academies to be named after a woman. The guided-missile destroyer USS Hopper (DDG-70) was commissioned in 1996, the second US Navy warship to be named for a woman from the Navy's own ranks. Nvidia, a company that designs and manufactures computer graphics processors, released the GH200 Grace™ Hopper™ Superchip in May of 2023 for use in giant-scale AI and high-performance computing (HPC) applications (Nvidia, 2023).

USS Hopper (DDG-70). US Navy Photo.
Hopper Hall in April 2021. Photo by LCDR John Paramadilok.

The Information Age

The mid-20th century kicked off the Information Age. The World Wars drove the need for advanced methods of improving and speeding up mathematical computations, and the contributions of Babbage, Lovelace, Turing, and Hopper set the stage for modern-day computing. Bulky electronics that filled entire rooms began to condense and fit onto desks. Data was stored on physical media and carried from room to room and building to building, a practice now known as "sneakernet." Complex machine coding transitioned to computer languages that could be compiled across systems. The confluence of networking (interconnecting computers) and the Operating System (OS), the software layer that connects a machine's hardware to the programs that manage its resources, accelerated the spread of computing from academic institutions to personal computers in every home.

Networking and Interconnecting Computers: 1960's through the 1980's

Early computers weren’t connected to one another, which meant that security concerns largely remained a localized problem. This changed in the late 1960’s when the Advanced Research Projects Agency Network (ARPANET) was created. By December 1969, the ARPANET connected computers at the University of California, Los Angeles (UCLA), the University of California, Santa Barbara (UCSB), Stanford University, and the University of Utah. Six months later computers at the Massachusetts Institute of Technology (MIT), Harvard University, and a small Boston-based firm called Bolt, Beranek and Newman (BBN) were connected (IIS 2015). As computer networks grew, security concerns relating to lost information, intercepted communications, and the verification of identities began to emerge, albeit among only a small number of specialists (Warner 2012). Willis H. Ware, an early computer security pioneer, specifically emphasized the importance of humans to security, noting that “[T]here are human vulnerabilities throughout; individual acts can accidentally or deliberately jeopardize the protection of information in a system” (Ware 1967, 11).

If computers had not yet been connected to form a network, what had? In the 1960s-1970s, the most sophisticated networked technology was the telephone system. Early phone hackers, calling themselves “phone phreaks,” used their growing technical knowledge of the way phone networks operated (their circuits, switches, relays, tonal signaling, and network diagrams) to hijack the telephone system for their own purposes, whether to avoid fees, connect to foreign conference calls, or gain access to areas of the network considered off-limits under normal telephonic protocols (Orth 1971 & Rosenbaum 1971).

One of the early pioneers in phone phreaking, John Draper (aka “Cap’n Crunch”), discovered that a 2600 hertz tone would, when produced by a regular phone user, provide access to the Operator Mode used by phone operators to connect calls anywhere in the world. Draper’s nickname “Cap’n Crunch” originated in his discovery that a small toy whistle offered in Cap’n Crunch cereal boxes at the time produced exactly the 2600 Hz tone necessary to access this Operator Mode (Orth 1971, 28).
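
Although the original whistles and blue boxes were analog hardware, the idea of a pure control tone is easy to demonstrate in software. Below is a minimal sketch, assuming Python and only its standard library, that synthesizes two seconds of a 2600 Hz tone and writes it to a WAV file; the file name, sample rate, and duration are illustrative choices, not details of the original phreaking tools.

    import math
    import struct
    import wave

    SAMPLE_RATE = 8000   # samples per second (illustrative choice)
    DURATION = 2.0       # seconds of audio to generate
    FREQ = 2600.0        # the supervisory tone frequency exploited by phreakers

    # Build 16-bit signed PCM samples of a pure sine wave at 2600 Hz.
    frames = bytearray()
    for n in range(int(SAMPLE_RATE * DURATION)):
        sample = math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
        frames += struct.pack("<h", int(sample * 32767))

    # Write the samples out as a mono WAV file (hypothetical file name).
    with wave.open("tone_2600.wav", "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))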

John “Cap’n Crunch” Draper.

Another technique used by phone phreakers relied on what came to be known as a “Blue Box,” an electronic tone-generating device, constructed by phreakers themselves, with a keypad and a speaker for tonal output. By understanding how telephone calls were routed over network trunk lines, phreakers could trick the network into bypassing normal toll collection and route calls on their own (Rosenbaum 1971).

Phone phreakers didn’t confine themselves to technical hacks. Interviews with John Draper reveal that often he and his friend and fellow pioneer phreaker, Dennis Dan “Denny” Teresi, would use human manipulation techniques called “social engineering” to gain needed information from unsuspecting Bell Telephone employees. Draper described social engineering as “the ability of going in and talking to people on the inside of the phone company…making them believe you were working for the phone company" and acclaimed Teresi as its foremost expert of the day (Draper 2001). You will learn more about modern social engineering techniques later in the course.

Blue Box.

Before phone phreaking, the term “social engineering” had only been applied to the activities of powerful policy planners—individuals in business or government attempting to cure what they identified as “social ills” through the use of their superior technical knowledge of public policy and economics. Phone phreakers inverted this power structure, thereby inaugurating what would later be called the “hacker mentality.” Here were relatively powerless individuals—often teenagers—usurping the designs of powerful phone companies. The other inversion that took place under this new application was from the allegedly benign purposes of the powerful policy planners to the nefarious purposes of the phreakers themselves. Phreakers reversed the social hierarchy that had stood alongside the concept of social engineering and, at the same time, put this tactic to their own disreputable—typically illegal—uses (Hatfield 2018).

Two early phone phreakers, Steve Jobs and Steve “Woz” Wozniak, later founded Apple Computer, which helped usher in the era of the personal computer in the 1980s (Lee 2001). Throughout that decade, the adoption of the Transmission Control Protocol / Internet Protocol (TCP/IP) model, the invention of the World Wide Web in 1989, the proliferation of personal computers, and the client-server model for network services effectively united the computer community with the phone phreakers, particularly since early computer networks communicated over telephone lines (Denning and Denning 2016, 5).

Steve Jobs, right, with his friend and co-founder, Steve Wozniak.

In 1984, the earliest hacker magazine, 2600: The Hacker Quarterly, which took its name from the 2600 hertz tone used by phone phreakers, began publishing anonymous articles on how to manipulate and repurpose, or “hack,” technologies such as telephones and the newly available personal computers. Rival magazines soon appeared, such as Phrack (a portmanteau of “phreak” and “hack”), founded by Craig Neidorf (aka "Knight Lightning") alongside Randy Tischler (aka "Taran King"). Phrack began publishing in November 1985 and continues today online (Hatfield 2018). In 1990, Neidorf (alongside Robert Riggs) was arrested and charged with possession and distribution of a stolen BellSouth document, which BellSouth claimed was worth $80K. Facing up to 31 years in prison, Neidorf’s defense in United States v. Riggs was able to demonstrate that the information in the document could have been acquired for $13 (Denning 1991).

Early phone phreakers and computer hackers shared what scholars call the “hacker ethic,” a sensibility that developed over time and remains prevalent in the hacker underworld. Principles of the hacker ethic include (Levy 1984, Steinmetz and Gerber 2015):

  • Access to computers is a right
  • Hackers should be judged by their abilities rather than criteria such as degrees, age, sex/gender, race, or position
  • A do-it-yourself mentality of exploration and manipulation
  • General disregard for traditional rules and norms
  • An assumption that information should be open and available; the burden of proof is on those who want to maintain confidentiality (e.g. governments, corporations)
  • The use of anonymity (e.g. nicknames, anonymizing protocols) to protect against unjustified coercion by authorities
  • Distrust of authority—promote decentralization
  • The sharing of innovations among other like-minded individuals

The open-source software movement is an extension of this ethic. So too are the activities of hacker activist (“hacktivist”) groups such as Anonymous, a decentralized international hacktivist collective that emerged in 2004 and became widely known for its Distributed Denial of Service (DDoS) attacks against governments and government agencies, corporations, the Church of Scientology, and other targets (Olson 2013). Within the World Wide Web that emerged in the 1990s and remains the dominant technology today, such groups have found new ways to carry out rebellion and political activism while remaining relatively decentralized, anonymous, and free from counterattack, whether by law enforcement or their political enemies.

Computing Into the 21st Century

Rewinding to 1969, the same year ARPANET was being established, Ken Thompson, Dennis Ritchie, and others at Bell Labs began working on a DEC Programmed Data Processor 7 (PDP-7), writing the operating system that would later become known as UNIX (yes, the same DEC mentioned earlier, where Hopper eventually worked). The effort stemmed in part from the need to rewrite an operating system so that a space-travel game could run on a smaller machine with only 4K of memory. The result was a system that a colleague dubbed UNICS (UNiplexed Information and Computing Service), a play on Multics, the ambitious time-sharing operating system project that Bell Labs had recently abandoned. Although it cannot be confirmed how the spelling eventually became UNIX, the versions released in the early 1970s are not far removed from today's systems.

Tim Berners-Lee, a British scientist working at CERN, proposed the World Wide Web in 1989 and published the world's first website shortly thereafter, in 1991. Within a few years, the World Wide Web (WWW) software was made available in the public domain with an open license, allowing the web to expand into what it is known as today.

Tux, the official Linux logo, from the linux.org website in Feb 1998. Courtesy of the WayBackMachine.

In 1991, Linus Torvalds created the Linux OS out of a desire for a free alternative to commercial operating systems such as the Microsoft Disk Operating System (MS-DOS). The name is a combination of his first name and UNIX, and its mascot, a penguin named "Tux," is a recognizable symbol for the OS around the world. What set Linux apart was that Torvalds released its source code for free on the Internet, accessible to anyone anywhere in the world. His philosophy was that if the software was free, anyone with an interest in computer programming could modify and improve the system over time. Licensed under the GNU General Public License (GPL), Linux was running on an estimated seven million computers by 1999.

Torvalds describes his work with statements like "Linux hackers do something because they find it to be very interesting" (Himanen, 2001). For the hacker, "the computer itself is entertainment," meaning that the hacker programs because he finds programming intrinsically interesting, exciting, and joyous (Himanen, 2001). Companies like Oracle, Intel, Netscape, Corel, and IBM financially supported Linux development and made their hardware and software Linux-compatible. The Apache web server, which ran on Linux, led many companies to switch to the operating system and is also what is used at the Naval Academy.

President's 60-Day Cyberspace Policy Review

Twenty-first-century societies rely on information to make informed decisions about every aspect of life because of the value that information provides to its owners. However, the individuals and groups authorized to use information must maintain possession of it and keep it out of reach of those who would use it for their own advantage. Securing information has been important throughout history, and the widespread use of the public Internet has created an environment in which information can be compromised from any geographic location.

Internet-related vulnerabilities can be traced back to the 1980s and have grown exponentially more complex and numerous in the decades since. The United States is a wealthy and powerful nation, and maintaining an uninterrupted and uncompromised flow of information is vital to our continued success. In May of 2009, President Obama's 60-Day Cyberspace Policy Review was released; among a variety of recommendations, it included an action item to "expand and train the workforce, including ... cyber security expertise in the Federal government". In the fall semester of 2011, the Superintendent of the United States Naval Academy directed that an Introduction to Cyber Security course be added to the core curriculum for all Midshipmen to fulfill that action item and prepare future officers for the cyberspace challenges they will encounter in the Navy and Marine Corps.

Cyberspace Domain: What is it?

The term "Cyber" is relatively new (first mentioned in the 1970s) and professionals have not reached a consensus on what Cyber means. That being said, the term Cyber is so widely used that most people have a general idea of what it entails. For the purpose of this course we will use the following two definitions when thinking about Cyberspace:

  • Common: In Layman's terms the Cyberspace domain is "the domain characterized by the 'human' use of electronics and the electromagnetic spectrum to store, modify, and exchange data via networked systems and associated physical infrastructures."

  • Department of Defense: The Joint Publication 3-12 on Joint Cyberspace Operations (JP 3-12) defines the Cyberspace domain as "a global domain within the information environment consisting of the interdependent networks of information technology infrastructures and resident data, including the Internet, telecommunications networks, computer systems, and embedded processors and controllers."

The cyberspace model can be described in terms of three interrelated layers: (1) physical network, (2) logical network, and (3) cyber-persona.

The three interrelated layers of cyberspace. Courtesy of US Cyber Command, as depicted in JP 3-12, Joint Cyberspace Operations (Dec 2022).
  • The physical network layer consists of the Information Technology (IT) devices and infrastructure in the physical domains that provide storage, transport, and processing of information within cyberspace, to include data repositories and the connections that transfer data between network components. The physical network components include the hardware and infrastructure (e.g., computing devices, storage devices, network devices, and wired and wireless links). Components of the physical network layer require physical security measures to protect them from physical damage or unauthorized physical access, which can sometimes be leveraged to gain logical access. While geopolitical boundaries can easily and quickly be crossed in cyberspace, crossing these boundaries involves the principle of territorial sovereignty tied to the physical domains. Every physical component of cyberspace is owned by a public or private entity, which can control or restrict access to its components.
  • The logical network layer consists of those elements of the network related to one another in a way that is abstracted from the physical network, based on the logic programming (code) that drives network components (i.e., the relationships are not necessarily tied to a specific physical link or node but to their ability to be addressed logically and exchange or process data). Individual links and nodes are represented in the logical layer, but various distributed elements of cyberspace, including data, applications, and network processes, are not tied to a single node. A website may exist on multiple servers in multiple locations in the physical domains but is represented as a single Uniform Resource Locator (URL) on the World Wide Web (WWW). There are things with intrinsic value that exist only in the logical layer, such as digital currency, non-fungible tokens, or a retirement savings account.
  • The cyber-persona layer is a view of cyberspace created by abstracting and combining data from the logical network layer to develop descriptions of digital representations of an actor or entity identity in cyberspace (cyber-persona). The cyber-persona layer consists of network or IT user accounts, whether human or automated, and their relationships to one another. Cyber-personas may relate directly to an actual person or entity, incorporating some personal or organizational data (e.g., email and IP addresses, websites, phone numbers, Web forum logins, or financial account passwords). One individual may create and maintain multiple cyber-personas through use of multiple identifiers in cyberspace, such as separate work and personal email addresses, and different identities on different Web forums, chat rooms, and social networking sites, which may vary in the degree to which they are factually accurate. Conversely, a single cyber-persona can have multiple users, such as multiple hackers using the same malicious software (malware) control alias, multiple extremists using a single bank account, or all members of the same organization using the same email address.

As a midshipman, you have multiple physical devices (laptop, tablet, cell phone, Common Access Card) that may be secured in a wall locker at Bancroft Hall, forgotten at Nimitz Library, or stolen in Downtown Annapolis. Accessing the GNBA wireless network requires physical proximity or a network cable to connect you to network resources. Accessing records in the Midshipmen Information Database System (MIDS) will provide logical information that exists on servers, storage devices, or in the cloud. Logging into your email with your 'alpha' as your username is a cyber-persona that will be specifically associated with you over the next four years.

Cyberspace Domain: Where is it? (Aside from everywhere)

We will come to learn in this course that activity in the Cyberspace Domain happens all around us, all the time. Beyond day-to-day life interactions, cyberspace is taking the workforce by storm. Check out this infographic on where cyber jobs are and what a career in the cyberspace domain could look like.

Major Cybersecurity Incidents

Recent Cyber Trends: COVID-19
Trends have changed a lot in this ever-evolving field, especially since COVID-19. This article by PurpleSec discusses how workforces have had to permanently change the way they do business due to the pandemic and the major cybersecurity impacts the pandemic has caused. PurpleSec even states that "Cybercrime as a whole has increased by 600% since the start of the pandemic."

Over the past decade, an ever growing number of high profile incidents have caught the attention of governments, organizations, companies, and individuals around the world. Understanding these incidents and their implications will help Navy and Marine Corps Officers understand the challenges they may be confronted with in the fleet and how to avoid becoming the subject of a future Cyber incident. Below are some of the most significant and impactful Cyber incidents in recent years:

  • Operation Rolling Tide: The Navy's first officially named cyber operation, initiated in response to an Iranian intrusion into an unclassified Navy network. The amount and usefulness of the information taken is not publicly known, but the Navy recognized the importance of dedicated incident-response defense teams and stood up the Cyber Mission Force to defend networks and rapidly respond to breaches.

  • Democratic National Committee (DNC) Email Leak: In 2014, Russian intelligence, hacking, and influence groups, including the Internet Research Agency, began targeting organizations and audiences involved in the upcoming 2016 U.S. Presidential Election. In March of 2016, John Podesta, the Hillary Clinton campaign chairman, and others were targeted by spear-phishing emails, which we'll talk more about later in the course. One spear-phishing email successfully obtained the login credentials for John Podesta's email account, allowing hackers to steal over 50,000 emails. The DNC Email Leak is only a small part of a larger investigation into suspected Russian interference in the 2016 U.S. Presidential Election.
  • Effects of Data Breaches on People
    Data breaches can affect society in several ways. The victims are not only companies or nation-states. This article by USA Today discusses the effects of a major cybersecurity incident at T-Mobile and its impact on everyday people. Several million people had their PINs, account information, and payment info compromised. The trickle-down effects include companies paying out for insurance protection and lawsuits. We will investigate how these hacks happen and what you can do to safeguard yourself.

  • Office of Personnel Management (OPM): In April of 2015, the Office of Personnel Management, the federal government's human resources department, announced a breach and data theft that affected an estimated 22 million government employees, many of whom were military members (and possibly your instructor). U.S. government officials suspected Chinese hackers based on investigations. Most of those affected only had their security clearance file information stolen, but an estimated 4 million employees also had social security numbers, addresses, fingerprints, performance evaluations, and job assignment information stolen.

  • Petya/NotPetya: Malware that affected Microsoft Windows computers in 2016 and 2017. Petya is ransomware, a type of malware that prevents victims from accessing their data until a ransom is paid to regain access. NotPetya closely resembles Petya but is not true ransomware; it exists only to prevent users from accessing their data, with no hope of ever recovering it. Loss of data can cause billions of dollars in lost productivity, and countries like North Korea and Russia use cyber crime to boost their treasuries when their economies suffer from sanctions and other negative factors.

  • Equifax: In May of 2017, hackers used a known vulnerability to gain access to the personal data systems of Equifax, one of the three major U.S. credit reporting companies. The attackers stole the personal data of 143 million people and put every one of those victims at risk for financial and identity fraud. The hack occurred because of a failure to patch the vulnerability in the two months after it had been publicly disclosed. The hack brought attention to the private sector's responsibility to protect consumer data, which is still a topic of intense debate.

  • SolarWinds: In early 2020, suspected Russian hackers inserted malicious code into SolarWinds' software platform called "Orion". When SolarWinds sent out software updates for Orion to its 33,000 customers, the malicious code created backdoors that enabled hackers to spy on many different organizations, including the National Nuclear Security Administration, which maintains the U.S. nuclear stockpile.

  • Colonial Pipeline: In the spring of 2021, the hacking group DarkSide was suspected of launching an attack against the U.S. energy distribution company Colonial Pipeline. This ransomware attack caused Colonial Pipeline, which supplies a majority of the gasoline to the U.S. Eastern Seaboard, to pause operations for several days until a Bitcoin ransom was paid. The disruption caused increased gas prices and fears about the security of critical infrastructure.

  • Apache Log4j: Scoring the maximum of 10 out of 10 in the Common Vulnerability Scoring System (CVSS), the vulnerability in the Apache Software Foundation's Log4j Java-based logging library, uncovered at the end of 2021, was actively exploited and could be weaponized to execute malicious code and allow a complete takeover of systems. Log4j is present in software from many vendors, including Amazon, Apple iCloud, Cisco, Cloudflare, ElasticSearch, Red Hat, Steam, Tesla, and Twitter, as well as video games such as Minecraft. With the library integrated into millions of systems, Akamai assessed that approximately 57% of observed exploitation activity within the first week of public disclosure originated from known malicious actors (CSRB, 2022).

  • Viasat Satellite Network: Cyber attacks extend into the space domain. Moments before Russian troops crossed the border into Ukraine on February 24, 2022, tens of thousands of modems were knocked offline throughout Ukraine and other European countries. The deliberate attack targeted a misconfigured Virtual Private Network (VPN) appliance, providing remote access to a trusted management segment of Viasat's network. Moving laterally across the management network, the attacker executed commands that resulted in a simultaneous, targeted attack that overwrote data in the modems' flash memory, knocking them offline and rendering them unusable. Over 30,000 replacement modems were distributed to restore service to customers.

The DIKW Hierarchy

Computers store and transmit data in the form of 1's and 0's. Alone, this data doesn't have any value or significance, but when data is combined and given context it becomes information. As you'll learn over the next few classes, the same data can represent different forms of information: 10 can equal ten (decimal), 10 can equal two (binary), 10 can equal A (the value ten written in hexadecimal), or 10 can equal 49 48 (the ASCII codes for the characters '1' and '0').
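
The short sketch below, which assumes Python purely for illustration, makes these interpretations concrete by reinterpreting the same two characters "10" under different encodings.

    symbols = "10"

    print(int(symbols, 10))           # 10 -> ten, read as decimal
    print(int(symbols, 2))            # 2  -> two, read as binary
    print(format(10, "X"))            # A  -> the value ten written in hexadecimal
    print([ord(c) for c in symbols])  # [49, 48] -> ASCII codes for '1' and '0'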

Information may not be adequate in itself as well and can change depending on how it's viewed. Take the five numeric digits - 21402. What comes to mind?

If that sequence of numbers shows up in your bank account as a deposit of $21402, you might be excited; if you receive a tuition bill of $21402, the emotions will be very different. The $ symbol leading the numeric sequence takes the information and applies knowledge of a financial gain or burden based on its context. How about Annapolis, MD 21402? Now you've applied wisdom, knowing the same five-digit sequence is a zip code.

The DIKW hierarchy builds on itself, representing Data < Information < Knowledge < Wisdom. When considering cybersecurity, little bits of data can lead to the compromise of information based on the knowledge of how computer systems work. The main purpose of cybersecurity is to protect data and that is what our pursuit will be over the next 16 weeks in order for you to attain some wisdom!

Tenets of Cybersecurity

In all of the major cybersecurity incidents mentioned earlier, data compromise resulted in the violation of confidentiality, integrity, and availability - the CIA triad. The term first appeared in a NASA Information Security Plan document in 1989 but may have been coined as early as 1986 (Saylor Academy, 2023). You may notice the pillars outside of the classrooms, which define each part of the CIA triad. Let's look more closely at definitions and examples involving the tenets of cybersecurity.

Confidentiality:

Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information.
If other midshipmen are able to access the Midshipmen Information Database System (MIDS) to view your Midshipmen Academic Performance Reviews (MAPRs) written by your professors, then the concept of confidentiality has been violated.

Integrity:
Guarding against improper information modification or destruction, including ensuring information non-repudiation and authenticity.
Modifying your Order of Merit (OOM) in MIDS to something other than what the system should calculate for your graduating class violates the principle of data integrity. This includes accessing MIDS using someone else's credentials (authentication) or spoofing an email from someone authorized to make changes, such as a professor, to a system administrator (non-repudiation). Note that data integrity is different from the honor concept of integrity.
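
One common technical control for integrity, offered here only as a hedged illustration (hashing is not discussed elsewhere in this lesson), is a cryptographic hash: recompute the hash of a stored record and compare it with a known-good value to detect improper modification. The record contents below are hypothetical, a minimal sketch in Python.

    import hashlib

    # Hypothetical record; any sequence of bytes works the same way.
    record = b"Midshipman 260001 | OOM: 142"

    # Store a SHA-256 digest of the record while it is known to be good.
    baseline = hashlib.sha256(record).hexdigest()

    # Later, recompute the digest and compare it to the baseline.
    tampered = b"Midshipman 260001 | OOM: 1"
    print(hashlib.sha256(record).hexdigest() == baseline)    # True  -> integrity intact
    print(hashlib.sha256(tampered).hexdigest() == baseline)  # False -> modification detected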

Availability:
Ensuring timely and reliable access to and use of information.
Not being able to log into MIDS to register for classes that are in high demand next semester violates the principle of availability. One of the first Denial of Service (DoS) incidents was the Morris Worm, which knocked out availability for roughly 10 percent of the systems connected to the Internet in 1988.

More Information on Data Breaches

Here is a visualization of the world's biggest data breaches and hacks: Information is Beautiful Blog

An overview of Verizon's annual report on data breach investigations: Verizon's Data Breach Investigation Report


Supplemental Media:

Hacking Google - Operation Aurora


Review Questions:

  1. Who were the key historical figures in the early days of computers?
  2. What impacts did Grace Hopper have on the Navy and on STEM?
  3. What influenced the need to interconnect computer systems?
  4. How were computers connected before there were networks?
  5. Who invented the World Wide Web (WWW)?
  6. What did the President's 60-Day Cyberspace Policy Review accomplish?
  7. How does the Department of Defense define the Cyberspace domain?
  8. What are the interrelated layers of the cyberspace model?
  9. What are some of the impacts of major cybersecurity incidents in the past decade?
  10. What do the components of the DIKW hierarchy represent?
  11. How is the CIA-triad applied to cybersecurity?


References

  1. Cyber Safety Review Board (CSRB). (2022). "Review of the December 2021 Log4j Event." Department of Homeland Security, 5. [Online]. Available: https://www.cisa.gov/sites/default/files/publications/CSRB-Report-on-Log4-July-11-2022_508.pdf
  2. Cyberseek. (2021). "Cyber Heatmap." [Online]. Available: https://www.cyberseek.org/heatmap.html
  3. D. E. Denning, "The United States vs. Craig Neidorf: A Debate on Electronic Publishing, Constitutional Rights and Hacking," Communications of the Association for Computing Machinery, vol. 34 no. 3, pp. 22-43, 1991.
  4. P. J. Denning and D. E. Denning, "Cybersecurity is Harder than Building Bridges," American Scientist, vol. 104 no. 3, pp. 1-6, 2016.
  5. L. Greiner, “Hacking your network’s weakest link – you,” Network Magazine, vol. 12 no.1 pp. 9-12, 2008.
  6. J. Grudin, “Three Faces of Human-Computer Interaction,” IEEE Annals of the History of Computing, vol. 27 no.4, pp. 46-62, 2005.
  7. J. Hatfield, “Social Engineering in Cybersecurity: The Evolution of a Concept,” Computers & Security, vol. 73, pp. 102-113, 2018.
  8. S. Levy, Hackers: Heroes of the Computer Revolution. New York: Doubleday, 1984.
  9. Nvidia. (2023). "NVIDIA Grace Hopper Superchip: The breakthrough accelerated CPU for giant-scale AI and HPC applications." [Online]. Available: https://www.nvidia.com/en-us/data-center/grace-hopper-superchip/
  10. M. Orth, "For Whom Ma Bell Tolls Not," Los Angeles Times, Oct. 31, 1971, pp. 28-32.
  11. R. Rosenbaum, "Secrets of the Little Blue Box," Esquire Magazine, Oct. 1971, pp. 117-125, 222-226.
  12. K. Steinmetz and J. Gerber, “‘It Doesn’t Have to be This Way’: Hacker Perspectives on Privacy,” Social Justice, vol. 41, no 3, p 29, 2015.
  13. D. Swade, The Difference Engine: Charles Babbage and the Quest to Build the First Computer. New York: Viking-Penguin, 2001.
  14. H. Thimbleby, "Human Factors and Missed Solutions to Enigma Design Weaknesses," Cryptologia, vol. 40, no. 2, pp. 177-202, 2016.
  15. A. Turing, "On Computable Numbers, with an Application to the Entscheidungsproblem," Proceedings of the London Mathematical Society, vol. s2-42, no. 1, pp. 230-265, 1936.
  16. M. Warner, “Cybersecurity: A Pre-History,” Intelligence and National Security, vol. 27, no. 5, pp. 781-799, 2012.