Basic Notions Seminars
AY2025-26
The Basic Notions seminar at the US Naval Academy hosts talks from faculty and midshipmen about the research they're doing. The goal is to explain the research at a very "basic" level and to introduce the different types of problems you might think about in OR, applied math, or pure math.
For midshipmen: This is a chance to think about what you might do for your honors, Trident or Bowman project, who you might want to work with, or what type of electives you want to take. Or, if you just like thinking about cool problems and how to solve them, stop by.
For faculty: If you would like to speak in the seminar please let me know! We're always looking for more speakers.
All talks are from 12:40-1:15 p.m. in the Seminar room, unless otherwise specified.
-
Apr 24
-
Apr 06
-
Decentralized Splitting for Large Optimization Problems
CDR Peter Barkley (USNA-Math)
Time: 12:40 PM
Large-scale optimization problems over the sum of a set of convex functions often benefit from splitting techniques which allow us to use each function individually, either to accommodate some structure in each function, or to scale to larger problem sizes. This presentation provides an introduction to two popular proximal splitting methods, Douglas-Rachford Splitting and the Alternating Direction Method of Multipliers (ADMM), and then introduces a novel, generalized framework for designing new proximal splitting algorithms. By formulating and solving a specific semidefinite programming problem, we can systematically construct "frugal" proximal splitting algorithms (which use each proximal operator only once per iteration) that are both fast and customizable.
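The generalized framework from the talk isn't reproduced here, but the basic mechanic of a proximal splitting method can be sketched on a toy problem of my own choosing: Douglas-Rachford splitting applied to two indicator functions, whose proximal operators are just projections. The iteration uses each prox exactly once per step and converges to a point in the intersection.

```python
# Toy Douglas-Rachford splitting: find a point in the intersection of
# the intervals [0, 2] and [1, 3] using only the proximal operators of
# their indicator functions (here, projections). Illustrative sketch
# only -- not the speaker's framework.

def proj(v, lo, hi):
    """Projection onto [lo, hi]: the prox of the interval's indicator."""
    return max(lo, min(hi, v))

z = 5.0  # arbitrary starting point for the governing sequence
for _ in range(100):
    x = proj(z, 0.0, 2.0)          # prox of the first function
    y = proj(2 * x - z, 1.0, 3.0)  # prox of the second (reflected step)
    z = z + y - x                  # governing-sequence update

print(x)  # prints 2.0, a point in the intersection [1, 2]
```

Each iteration touches each proximal operator once, which is the "frugal" property mentioned in the abstract; ADMM can be derived from the same Douglas-Rachford iteration applied to a dual problem.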
-
Mar 20
-
Pasture-raised matroids
Max Wakefield (USNA-Math)
Time: 12:40 PM
Matroids are abstractions of linear independence structure. Consequently, one of the main questions in matroid theory is determining the conditions under which a matroid can be realized as a set of vectors (or a matrix) with coefficients in a fixed (or any) field. Furthermore, a realizable matroid generates multiple beautiful topological spaces: the hyperplane arrangement complement, the reciprocal plane, and matroid Schubert varieties. Unfortunately, it turns out that not all matroids are realizable; moreover, asymptotically, almost no matroids are realizable. However, in a recent series of papers, Baker and Lorscheid construct an algebraic structure called a pasture—a relaxation of a field—under which every matroid is realizable. In this talk, we will define matroids and their (possible) realizations over a field, and then define pastures. If time permits, we will examine matroid realizations over pastures and the universal pasture of a matroid.
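As a small self-contained illustration (my own toy example, not material from the talk): a finite set of vectors realizes a matroid whose independent sets are the linearly independent subsets, and for a tiny example we can brute-force-check the defining exchange axiom.

```python
# Three vectors in Q^2 realizing the uniform matroid U(2,3): every
# subset of size <= 2 is independent, the full set is not. We verify
# the matroid exchange axiom by brute force. Illustration only.
from fractions import Fraction
from itertools import combinations

VECS = {0: (1, 0), 1: (0, 1), 2: (1, 1)}

def rank(vectors):
    """Rank over Q via exact Gaussian elimination."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for c in range(2):
        piv = next((i for i in range(r, len(rows)) if rows[i][c]), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c]:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def independent(S):
    return rank([VECS[e] for e in S]) == len(S)

indep = [set(S) for k in range(4) for S in combinations(VECS, k) if independent(S)]

# Exchange axiom: if |I| < |J|, some e in J \ I keeps I independent.
for I in indep:
    for J in indep:
        if len(I) < len(J):
            assert any(independent(I | {e}) for e in J - I)

print(len(indep))  # prints 7: the empty set, 3 singletons, 3 pairs
```

The realizability question in the abstract runs the other way: given only the list of independent sets, when can vectors like `VECS` be found over some field (or, per Baker and Lorscheid, over a pasture)?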
-
Feb 06
-
The Art of Why: Basic principles of causal inference
Doug VanDerwerken (USNA-Math)
Time: 12:40 PM
Does Tylenol cause autism? Do smartphones make us dumber? Why are teen suicide rates so high compared to a generation ago? I won't answer any of these questions; I'm not even planning on discussing them. But each of these questions, at its core, is a question about causation. What does it mean for one thing to cause another? How can we know if one thing causes another? We'll tackle these questions.
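Not the talk's material, but a minimal simulation (all names and numbers my own) of the core difficulty: a confounder Z drives both a "treatment" T and an outcome Y, so the observed T-Y gap is large even though T has no causal effect at all, while randomizing T makes the gap vanish.

```python
# Confounding vs. randomization, simulated. Z raises both the chance
# of "treatment" T and the outcome Y; T itself has NO effect on Y.
import random

random.seed(0)
N = 100_000

def mean(xs):
    return sum(xs) / len(xs)

# Observational world: T depends on the confounder Z.
obs = []
for _ in range(N):
    z = random.random()
    t = 1 if random.random() < z else 0  # high Z -> likely treated
    y = z + random.gauss(0, 0.1)         # Y depends on Z only
    obs.append((t, y))
diff_obs = (mean([y for t, y in obs if t == 1])
            - mean([y for t, y in obs if t == 0]))

# Randomized world: T is a coin flip, independent of Z.
rct = []
for _ in range(N):
    z = random.random()
    t = random.randint(0, 1)
    y = z + random.gauss(0, 0.1)
    rct.append((t, y))
diff_rct = (mean([y for t, y in rct if t == 1])
            - mean([y for t, y in rct if t == 0]))

print(round(diff_obs, 2), round(diff_rct, 2))  # spurious gap vs. ~0
```

The observational gap is about 1/3 here purely because treated units have higher Z; randomization breaks the Z-T link and the gap collapses toward zero.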
-
Dec 02
-
Learning from data to make better predictions in dynamical systems
Josh Hudson (USNA-Math)
Time: 12:40 PM
Since Isaac Newton proposed that force equals mass times acceleration, differential equations have been the language of physics. We can often use first principles to model natural phenomena as a system of differential equations, but solving these systems requires specifics; for example, measurements of the initial state are needed to predict future values. However, measurements are only finitely accurate, and time can magnify errors in predictions (think of the butterfly effect). Fortunately, we often don't have to passively wait while a prediction unfolds; instead, we can continue to take measurements and see how our predictions go wrong. How best to learn from observed prediction error while accounting for the model dynamics is the subject of data assimilation. In this talk we will discuss differential equations, introduce some classical data assimilation techniques, and discuss how they allow us to make more accurate predictions.
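One classical technique in this area, sketched here as my own toy example (the talk may cover different methods), is the scalar Kalman filter: keep correcting a model forecast with incoming noisy measurements, weighting each by its uncertainty. The corrected estimate ends up more accurate than the raw measurements alone.

```python
# Scalar Kalman filter: blend the model forecast x' = a*x with noisy
# measurements of the true state. Illustrative sketch only.
import math
import random

random.seed(1)
a, q, r = 0.95, 0.01, 0.25  # dynamics, process var., measurement var.

x_true, x_est, P = 5.0, 0.0, 10.0  # deliberately wrong initial estimate
sq_filt, sq_meas = [], []
for _ in range(2000):
    # Truth evolves (with small unmodeled noise); we observe it noisily.
    x_true = a * x_true + random.gauss(0, math.sqrt(q))
    y = x_true + random.gauss(0, math.sqrt(r))

    # Forecast step: push the estimate and its uncertainty through the model.
    x_est, P = a * x_est, a * a * P + q

    # Analysis step: nudge the forecast toward the measurement.
    K = P / (P + r)  # Kalman gain: how much to trust the measurement
    x_est = x_est + K * (y - x_est)
    P = (1 - K) * P

    sq_filt.append((x_est - x_true) ** 2)
    sq_meas.append((y - x_true) ** 2)

rmse_filt = math.sqrt(sum(sq_filt) / len(sq_filt))
rmse_meas = math.sqrt(sum(sq_meas) / len(sq_meas))
print(rmse_filt < rmse_meas)  # assimilation beats raw measurements
```

Note how the filter recovers from the badly wrong initial estimate within a few updates: that is the "learning from prediction error" the abstract describes.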
-
Oct 21
-
The Basics of Model Simplification and Dimension Reduction in Differential Equations
Irina Popovici (USNA-Math)
Time: 12:40 PM
TBD
-
Sep 30
-
Sequential Analysis and Applications
Austin Warner (USNA-Math)
Time: 12:40 PM
Traditionally in statistical methodology, all of the data are available from the start. In this short informational talk, I will discuss problems in statistics where the data become available sequentially (over time) rather than all at once. Attendees will learn about historical motivations (spoiler: WWII) and modern applications. I will pitch general research ideas for students and faculty to promote discussion.
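The best-known sequential procedure, offered here as my own illustration rather than the talk's content, is Wald's sequential probability ratio test (SPRT), developed during WWII: observations arrive one at a time, a log-likelihood ratio accumulates, and the test stops the moment the evidence crosses either of two thresholds.

```python
# Wald's SPRT for Bernoulli data: H0: p = 0.5 vs. H1: p = 0.7.
# Stop as soon as the log-likelihood ratio leaves (B, A).
# Illustrative sketch only.
import math
import random

P0, P1 = 0.5, 0.7
ALPHA = BETA = 0.05
A = math.log((1 - BETA) / ALPHA)  # upper boundary -> accept H1
B = math.log(BETA / (1 - ALPHA))  # lower boundary -> accept H0

def sprt(rng, true_p, max_n=10_000):
    """Run one SPRT on a stream of Bernoulli(true_p) observations."""
    llr, n = 0.0, 0
    while B < llr < A and n < max_n:
        x = 1 if rng.random() < true_p else 0
        llr += math.log(P1 / P0) if x else math.log((1 - P1) / (1 - P0))
        n += 1
    return ("H1" if llr >= A else "H0"), n

# When the data really come from p = 0.7, the test should accept H1
# about 95% of the time, typically after only a few dozen observations.
wins = sum(sprt(random.Random(seed), 0.7)[0] == "H1" for seed in range(100))
print(wins)  # close to 100 (miss rate ~ BETA)
```

The appeal is the stopping rule: on average the SPRT needs far fewer observations than a fixed-sample test with the same error rates, which is exactly what mattered for wartime munitions inspection.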
-
Sep 16
-
A New Integral Transform
Mike Hoffman (USNA-Math)
Time: 12:40 PM
While integral transforms are most often used to solve differential equations, they have other uses. I will talk about a transform introduced by Chavan, Kobayashi, and Layja to obtain an integral formula for a particular type of series. In fact, the transform leads to many other series formulas as well, and also reveals remarkable relations among special functions.
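The Chavan-Kobayashi-Layja transform itself is not reproduced here, but the flavor of "an integral formula for a series" can be seen in a classical stand-in: the Mellin-type identity integral_0^infinity t/(e^t - 1) dt = sum_{n>=1} 1/n^2 = pi^2/6, which we can check numerically.

```python
# Numerical check of a classical series-to-integral identity:
#   integral_0^inf t/(e^t - 1) dt = sum_{n>=1} 1/n^2 = pi^2/6.
# A stand-in example; the talk's transform is not reproduced here.
import math

def f(t):
    # t/(e^t - 1), with the removable singularity filled in at t = 0.
    return 1.0 if t == 0 else t / math.expm1(t)

# Composite Simpson's rule on [0, 40]; the tail beyond 40 is ~1e-16.
n, b = 4000, 40.0  # n must be even
h = b / n
integral = f(0) + f(b) + sum((4 if k % 2 else 2) * f(k * h) for k in range(1, n))
integral *= h / 3

series = sum(1 / k**2 for k in range(1, 200_000))  # partial sum of zeta(2)

print(abs(integral - math.pi**2 / 6) < 1e-6)  # prints True
```

This is the s = 2 case of Gamma(s) * zeta(s) = integral_0^infinity t^(s-1)/(e^t - 1) dt, a standard example of how an integral transform packages a whole series into one formula.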