SMIMIC Schedule

The San Marcos Informal Mathematics In-person Colloquium will take place every other Thursday, 12–1 PM, in Commons 206.
9/8 Hanson Smith, Richard Guy's Strong Law of Small Numbers and How Not to Make Friends

This talk will be a guided tour through Richard Guy's wonderful paper The Strong Law of Small Numbers. The main theorem is, "You can't tell by looking." Following Guy, we will prove this by intimidation. Audience participation, questions, and guesses are highly encouraged. We'll conclude the talk with an exciting application to web comics!

9/22 Shahed Sharif, SIKE Attack: How to Break Post-Quantum Crypto

Over the past few years, the National Institute of Standards and Technology has been running a competition to standardize new cryptographic protocols which are safe from quantum computers. This past July, one of the top contenders, SIKE, was spectacularly broken. I will give an overview of the attack and explain the consequences.

10/6 Mikaela Meitz, Hamiltonian Neural Network Exploration for Electron Particle Tracking

In the field of accelerator physics, there is burgeoning interest in using machine learning methods to aid the design and optimization of charged particle accelerators. The Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory is a periodic circular accelerator called a synchrotron, which emits ultraviolet and soft x-ray beams by accelerating electron bunches to nearly the speed of light. These accelerators are prone to beam instability, which results in particle loss and consequently reduces x-ray brightness. The stability of an electron over thousands of revolutions is important to the performance of the accelerator. During a machine's design or upgrade process, electron particle tracking is needed to ensure the particle dynamics are sufficient for the intended scientific use, but such tracking can be computationally expensive. If the dynamic aperture (the stability region of phase space in the synchrotron) is too small, then adjustments are made and the process is repeated until the desired result is achieved. Optimizing the dynamic aperture can require repeating this tracking many times while iterating the accelerator design. Machine learning methods may alleviate some of the need for these expensive computations by making particle integration faster and easier to parallelize. This research explores electron particle tracking using Hamiltonian Neural Networks (HNNs), which constrain model learning to obey Hamiltonian mechanics so that the neural network can learn conservation laws from data. We compare the performance of HNNs to other machine-learning-based models.
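For the curious, here is a drastically simplified sketch of the Hamiltonian-constrained learning idea the abstract describes. It is not the speaker's method: the quadratic ansatz \(H(q,p) = \tfrac{1}{2}(aq^2 + bp^2)\), the synthetic harmonic-oscillator data, and all variable names are illustrative assumptions, and real HNNs use neural networks with automatic differentiation rather than a two-parameter least-squares fit.

```python
import numpy as np

# Synthetic training data from a harmonic oscillator with true
# Hamiltonian H(q, p) = (1/2) * (k*q^2 + p^2/m), with k = 2, m = 1.
k, m = 2.0, 1.0
rng = np.random.default_rng(0)
q = rng.uniform(-1, 1, 200)
p = rng.uniform(-1, 1, 200)
q_dot = p / m        # Hamilton's equation: dq/dt =  dH/dp
p_dot = -k * q       # Hamilton's equation: dp/dt = -dH/dq

# Model ansatz: H(q, p) = (1/2) * (a*q^2 + b*p^2), so dH/dp = b*p and
# dH/dq = a*q. The Hamiltonian constraint q_dot = b*p, p_dot = -a*q
# turns "learning H from data" into two least-squares fits.
a = -np.dot(p_dot, q) / np.dot(q, q)
b = np.dot(q_dot, p) / np.dot(p, p)
print(a, b)  # recovers k = 2.0 and 1/m = 1.0

# Because we learned a Hamiltonian (not just a vector field), the
# learned energy stays conserved when we integrate the learned
# dynamics with a symplectic (structure-preserving) integrator.
def learned_H(q, p):
    return 0.5 * (a * q**2 + b * p**2)

qt, pt = 1.0, 0.0
E0 = learned_H(qt, pt)
dt = 1e-3
for _ in range(10_000):
    pt -= dt * a * qt   # symplectic Euler step for dp/dt = -dH/dq
    qt += dt * b * pt   # symplectic Euler step for dq/dt =  dH/dp
assert abs(learned_H(qt, pt) - E0) < 1e-2  # energy drift stays tiny
```

The conservation check at the end illustrates the selling point mentioned in the abstract: constraining the model to be Hamiltonian buys long-term stability that a generic regression on trajectories does not.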

10/20 Kimberly Ayers, Baby Sharkovskii doo doo doo

It's time to get sharky—Sharkovskii, that is! Have you ever looked at the usual ordering on \(\mathbb{N}\), and thought, "I bet I can do better"? Well, Oleksandr Mykolaiovych Sharkovskii did just that in 1964 when he proved what is now called Sharkovskii's Theorem. This is Dr. Ayers's favorite theorem in the entire world. Come hear her talk about discrete dynamics, bifurcations, chaos theory, weird orderings on the natural numbers, graph theory, and even applications to candy making.
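As a teaser (stated here from the standard form of the theorem, not from the speaker), Sharkovskii's ordering rearranges \(\mathbb{N}\) as

\[3 \succ 5 \succ 7 \succ \cdots \succ 2\cdot 3 \succ 2\cdot 5 \succ \cdots \succ 2^2\cdot 3 \succ 2^2\cdot 5 \succ \cdots \succ 2^3 \succ 2^2 \succ 2 \succ 1,\]

and the theorem says that if a continuous map of an interval into itself has a periodic point of period \(m\), then it has periodic points of every period that comes after \(m\) in this ordering. In particular, period 3 implies periodic points of every period.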

11/3 Sandie Hansen, Math on the Brain: Using Neuroscience to Better Understand How People Learn Math

A human’s natural number sense first developed as a basic survival skill, but the abstract mathematics we teach in classrooms today did not. Using neuroscience and cognitive psychology research, we shall discuss why math is so hard for so many people and whether there is really such a thing as a "math person". The evolution of mathematics has exponentially outpaced that of the human brain. By studying the cognitive mechanisms involved in processing mathematical operations, we will build a better understanding of how students learn math and consider how to adapt the math classroom using more brain-friendly methods.

11/17 Wayne Aitken, Filters and Ultrafilters

In calculus and especially in analysis we learn about a plethora of types of limits: one-sided limits, two-sided limits, limits of sequences, limits of functions, limits with finite values, limits with infinite values, not to mention limsups and liminfs and limits of "nets". What do these definitions of limits have in common?

In 1937 the celebrated French mathematician Henri Cartan developed a theory of "filters" to provide a common framework for the theory of limits. Since then, filters have proved to be a valuable tool in topology and set theory.

Even more revolutionary than a filter is an "ultrafilter". These are filters on steroids. These magical devices can be used to build objects with amazing properties. For example, we can build a nonstandard model of the real numbers called the "hyperreals", which contains infinitesimals. Infinitesimals are positive numbers that are "infinitely small": smaller than any positive rational number. These were used in the early days of calculus to develop the theory of derivatives and integrals, and are still used informally in physics and engineering. In the 1800s mathematicians banned these troublesome and possibly inconsistent infinitesimals from the real numbers, and replaced them with a precise theory of limits (using epsilon-delta quantification, for example). However, around 1960 Abraham Robinson realized that ultrafilters can be used to build a nonstandard field of real numbers that includes infinitesimals. So the use of infinitesimals was proved to be consistent after all, launching a new field called "nonstandard analysis".
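As a one-line preview of Robinson's construction (stated here in standard notation, not taken from the abstract): given a nonprincipal ultrafilter \(\mathcal{U}\) on \(\mathbb{N}\), the hyperreals arise as the ultrapower

\[{}^{*}\mathbb{R} = \mathbb{R}^{\mathbb{N}} / \mathcal{U},\]

where two sequences of real numbers are identified when the set of indices on which they agree belongs to \(\mathcal{U}\). The sequence \((1, \tfrac{1}{2}, \tfrac{1}{3}, \ldots)\) then represents a positive infinitesimal, since for every positive real \(\varepsilon\) it is below \(\varepsilon\) on a cofinite set of indices, and every cofinite set belongs to \(\mathcal{U}\).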

Soon after this, mathematicians began to use ultrafilters to prove results outside of nonstandard analysis. For example, James Ax and Simon Kochen used ultrafilters to prove the important Ax-Kochen theorem on Diophantine equations. More recently, mathematicians such as Terence Tao have used ultrafilters in combinatorics (additive and multiplicative combinatorics) to convert finite "approximate groups" into infinite structures with a topological structure (locally compact spaces related to Lie groups), and then used these new structures to prove powerful results about finite approximate groups. Tao calls this use of ultrafilters "a bridge between discrete and continuous analysis".

In my talk I will introduce filters and show how they can be used to unify the theory of limits. Time permitting, I will explain how ultrafilters are made and how they can be used to construct infinitesimals. Other applications may be mentioned very briefly to give a sense of what is possible with ultrafilters. For example, the amusing Ax-Grothendieck theorem, which asserts that every injective polynomial function from \(\mathbb{C}^n\) to \(\mathbb{C}^n\) is actually surjective, can be proved using ultrafilters. The proof starts by establishing the result for finite fields using the pigeonhole principle, and then uses ultrafilters to transfer it to infinite fields of "characteristic zero".