1. mindfuckmath:

    No, really, pi is wrong: The Tau Manifesto

    Happy Tau Day, everyone!  In case you don’t know, Tau = 2 Pi, or the ratio of a circle’s circumference to its radius, which works out to 6.283185307179586…

    Compared to Pi, Tau is a much more natural and logical choice for the circle constant. Think about the radians on a circle. For example, a quarter of a circle is a quarter of Tau radians, as opposed to the confusing half Pi radians.  Half of a circle is half Tau radians, as opposed to the bewildering Pi radians.  A full circle is Tau radians, as opposed to the inelegant and ridiculous 2 Pi radians.
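
    If you want to poke at this yourself, Python has shipped the constant in its standard library since 3.6 (the example itself is just my own toy):

    ```python
    import math

    # math.tau is 2 * math.pi, built into Python 3.6+
    quarter_turn = math.tau / 4  # a quarter of a circle is a quarter of Tau radians
    half_turn = math.tau / 2     # half of a circle is half of Tau radians

    assert quarter_turn == math.pi / 2  # the "confusing" half Pi
    assert half_turn == math.pi         # the "bewildering" Pi
    print(math.tau)  # 6.283185307179586
    ```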

    Make sure to click on the link above for videos and all sorts of other good stuff on Tau, and read Michael Hartl’s brilliant The Tau Manifesto. I also recommend reading the wonderful paper that started this debate, Pi is Wrong!, by Bob Palais.

    Happy Tau Day 2014!

    Have you accepted the circle constant into your heart and given your life over to geometric beauty and reason? We just want you to stop living in the darkness of π and let the light of τ illuminate your understanding. Perhaps we could just leave you with a copy of the Tau Manifesto. The truth will set you free!

    [with apologies to the Goose.]

     
  2. Measure Theory. Measure theory is the study of an abstract generalization of natural “measures” of geometric objects, like length, area, and volume. In 1904, Lebesgue discovered how to tease out a notion of generalized addition (or integration) from these objects. His integral would shatter the limits of the prior theory laid out by Riemann (yes, that Riemann) and Stieltjes.

    I could write a long block of text explaining what inverse functions are, but I think I will take the picture-worth-1000-words route here.

    In fairness, the picture is not totally accurate, but it’s pretty darn close. Those of you with formal familiarity with inverse functions might enjoy finding the flaws :)
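
    For those who want something more concrete than the picture, here is a tiny sketch of my own (a toy function, not the one in the image): an inverse undoes a function in both directions.

    ```python
    # a simple invertible function on the reals, and its inverse (toy example)
    def f(x):
        return 2 * x + 3

    def f_inv(y):
        return (y - 3) / 2

    # composing in either order gives back whatever you started with
    for x in [-2, 0, 3.5, 10]:
        assert f_inv(f(x)) == x
        assert f(f_inv(x)) == x
    ```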

     
  3. Linear Algebra. Linear algebra is the study of spaces which are in some sense “flat”, like points, lines, planes, and their higher-dimensional analogues. The tools of linear algebra often help to formalize geometric intuitions and give a clean picture of the behavior of spaces which may have literally millions of dimensions (or more!).

    The proposition statement references “the action” of Gamma (the half-T thing). Gamma is a group, whose elements can be thought of as symmetries; the action is the mechanism by which those elements are “decoded” from their abstract forms into physical symmetries such as reflections and rotations.

    To finish the rest of that clause: a point in space is called a fixed point of Gamma if the action of every element leaves that point alone. For example, in two dimensions a reflection leaves an entire line alone, and so if Gamma consists of two reflections whose lines are not parallel* then Gamma has only one fixed point — where the two lines meet.

    ( * and the identity and their products, technically. )
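
    Here is a quick sketch of that example, with the two reflection lines chosen (by me, for convenience) to be the coordinate axes:

    ```python
    # reflections of the plane across the x-axis and the y-axis
    def reflect_x(p):
        return (p[0], -p[1])

    def reflect_y(p):
        return (-p[0], p[1])

    # each reflection leaves an entire line alone...
    assert reflect_x((5, 0)) == (5, 0)    # the x-axis, pointwise
    assert reflect_y((0, -2)) == (0, -2)  # the y-axis, pointwise

    # ...but only the intersection of the two lines is fixed by both
    candidates = [(x, y) for x in range(-3, 4) for y in range(-3, 4)]
    fixed = [p for p in candidates if reflect_x(p) == p and reflect_y(p) == p]
    assert fixed == [(0, 0)]
    ```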

     
  4. Convex Geometry. Geometry began life as the study of properties which are invariant under rigid transformations and/or dilations, and it gives rise naturally to the intuitive idea of an object’s “shape”. Convex geometry investigates shapes which do not “curve inward”, meaning that every line segment whose endpoints lie in the shape is entirely contained in the shape as well.

    The ≤ symbol at the beginning of the proof is not the usual less-than-or-equal-to relation on real numbers. Instead, it signifies that Gamma (the half-T on the left side of ≤) and S_n are groups, which are objects that encode symmetries, and that Gamma is a subgroup of S_n (so it encodes fewer symmetries than S_n).
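
    To make the ≤ concrete, here is a little sketch of my own with S_3 written out as permutations; a subset containing the identity and closed under composition is a subgroup.

    ```python
    from itertools import permutations

    # S_3: all permutations of {0, 1, 2}, written as tuples
    S3 = set(permutations(range(3)))

    def compose(p, q):
        # (p ∘ q)(i) = p[q[i]]
        return tuple(p[q[i]] for i in range(len(q)))

    identity = (0, 1, 2)
    swap = (1, 0, 2)  # the transposition exchanging 0 and 1

    # Gamma encodes fewer symmetries than S_3, and is closed under composition
    Gamma = {identity, swap}
    assert Gamma <= S3  # a subset, and in fact a subgroup: Gamma ≤ S_3
    assert all(compose(a, b) in Gamma for a in Gamma for b in Gamma)
    ```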

    This theorem is a launching point for a certain problem in operations research. There is a very well-understood object called a linear program, and by considering variants of linear programs we arrive at an object which is much more difficult to handle. However, a recent paper* suggests that if this object exhibits symmetry in a group-theoretic sense, we might be able to rein it in.

    * Herr, Rehn, Schürmann. Exploiting Symmetry in Integer Convex Optimization using Core Points. Online at http://arxiv.org/abs/1202.0435

     
  5. Convexity. A set is considered convex if any line segment whose endpoints are in the set is itself entirely contained in the set. This property guarantees that we can consider the “local” geometry of the set without being concerned with what happens outside of it.
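
    The definition is easy to spot-check numerically; here is a throwaway sketch of mine using the unit disk, which is convex:

    ```python
    import random

    # the closed unit disk in the plane
    def in_disk(p):
        return p[0] ** 2 + p[1] ** 2 <= 1.0

    random.seed(0)  # deterministic sampling
    for _ in range(1000):
        a = (random.uniform(-1, 1), random.uniform(-1, 1))
        b = (random.uniform(-1, 1), random.uniform(-1, 1))
        if in_disk(a) and in_disk(b):
            # pick a point along the segment from a to b
            t = random.random()
            p = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
            assert in_disk(p)  # convexity: the segment stays inside
    ```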

    I am doing research again this summer and preparing for my senior thesis, which means that there should be a pretty steady stream of content (hopefully).

    This theorem is just a derpy little thing I came up with about a week ago for the project; it is the converse to a theorem with much more ambition, but which turned out to be false. But it’s cute and I haven’t posted a bite-sized proof in a while, so here you go :)

     
  6. Combinatorics. Combinatorics began as the systematic study of counting the number of elements in particularly common sets.  Nowadays, this definition has broadened, and you can use “combinatorics” to refer to pretty much all of finite mathematics. The original field is now more descriptively called “enumerative combinatorics”.

    One of the most surprisingly powerful techniques in combinatorics is that of a generating function. Given a sequence, for example (n^2)=(0,1,4,9,16,25,…), we construct a formal power series by just putting powers of x next to all the numbers:

    0x^0 + 1x^1 + 4x^2 + 9x^3 + 16x^4 + 25x^5 + …

    It turns out that there is a lot of information tied up in this object, and it can even be used to give approximate forms to sequences which are resistant to the usual methods of attack.
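
    You don’t have to take that on faith for the example above: the standard closed form for this particular series is x(1+x)/(1-x)^3 (a well-known identity, not something proved in this post), and its coefficients can be checked with hand-rolled power-series arithmetic:

    ```python
    from math import comb

    N = 10
    # 1/(1-x)^3 has coefficients C(n+2, 2), by the standard binomial series
    inv_cube = [comb(n + 2, 2) for n in range(N)]

    # multiplying by (x + x^2) shifts the coefficients by 1 and by 2, then adds
    coeffs = [0] * N
    for n in range(N):
        if n >= 1:
            coeffs[n] += inv_cube[n - 1]
        if n >= 2:
            coeffs[n] += inv_cube[n - 2]

    # the coefficients of x(1+x)/(1-x)^3 really are 0, 1, 4, 9, 16, ...
    assert coeffs == [n * n for n in range(N)]
    ```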

    This proof produces an exponential generating function, which is a slight technical refinement that is more useful for getting the information when the sequence grows very rapidly.

     
  7. Combinatorics. Combinatorics was initially conceived as a systematic method of counting objects which arise often in other areas of mathematics or highly-symmetrized versions of objects that occur in real life. As mathematics has grown, the word “combinatorics” has come to describe a large, loosely related set of disciplines; the original field is more descriptively called “enumerative combinatorics”.

    The ∑ symbol at the top of the second image is a capital Greek letter called sigma; it makes the soft ‘s’ sound. It is used to write sums [addition] more compactly and precisely when the number of summands is itself variable. In fact, the equality in the proof in which it appears is no more than a definition.
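
    In code, a ∑ is just a loop; a throwaway example of mine:

    ```python
    # Σ_{k=1}^{n} k, where the number of summands is the variable n
    def sigma_sum(n):
        total = 0
        for k in range(1, n + 1):  # k runs over the summands, 1 through n
            total += k
        return total

    assert sigma_sum(100) == 5050  # Gauss's famous classroom sum
    ```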

    (I’ve probably done a symbol explanation of ∑ before, but it bears repeating since in my experience it tends to be the media’s visual shortcut for ‘this is math’.)

    I have rules for when I do and don’t credit things on this blog, and unfortunately this is one that is pretty unambiguously on the “do credit” side. I say unfortunately because the crediting is pretty complicated, so I’m going to explain a little more why I chose those names at the top.

    It’s probably a very well-known fact, but I don’t know that it is, and I first heard it from Prof. Sellers, who asked our class for a combinatorial proof. Sellers does not teach our class, and so I credit him for the question. The proof is mine, but the idea is essentially equivalent to that of several of my classmates who submitted answers before I did. Bandes-Storch was the first responder, so I credit him for the answer.

     
  8. Quantifiers. Quantifier is one of those words which is more grandiose than the concept it represents: there are precisely two kinds of quantifiers in ordinary mathematics. ∀ means “For all” and ∃ means “There exists”. These are basic objects in logic, and are used in all the branches of math that you can imagine [and also all the other ones].

    One of the things that confuses new students about quantifiers is that the order of quantifiers is usually extremely important. When I was coming to grips with them I ended up with a model of quantified statements as a game: there are quantifiers, and then there is a condition. Going from left to right on the quantifiers, you get to choose an object when there is a ∃ and your opponent gets to choose an object when there is a ∀; all of the choices are public at the instant they are made. In the example of supercontinuity, your opponent picks the x, then you pick the delta, then your opponent picks the epsilon and the y. Now, you win if the condition is satisfied. The statement is true if (and only if) you have a winning strategy: a way to win regardless of your opponent’s choices. Explain why the strategy works, and you have a proof of the statement.

    From this model a lot of general trends become more palatable: you can rearrange the order of two quantifiers if they are the same kind. If a ∃ is pulled closer to the beginning, it produces a stronger condition (because you get less information and so it is harder to win). Also, each ∃ represents some kind of construction in the proof, and you can only use a variable in that construction if it comes earlier in the chain. And you have to assume that ∀ variables are going to be chosen to make the proof as hard as possible.

    This theorem arose from a tutoring exercise in which we were swapping the order of quantifiers in some multi-quantifier definitions and theorems to see what happens. Actually, if you are a tutor of a student who is struggling with proofs, this is a great exercise. There are problems like ∀/∃ x ∀/∃ y, x-y=0 that can build a lot of confidence, and then you can basically just go pull out the most quantifier-laden definitions that they’ve seen in class, work through a couple of examples, and then mess around with them. It helps to do this in advance so that you can pick out good examples, though.
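
    On a finite domain you can even brute-force these games; here is a sketch of mine for the x-y=0 example, with both orderings:

    ```python
    domain = range(5)

    # ∀x ∃y, x - y = 0: whatever x your opponent picks, you can answer y = x
    forall_exists = all(any(x - y == 0 for y in domain) for x in domain)

    # ∃y ∀x, x - y = 0: you must commit to y before seeing x; no single y works
    exists_forall = any(all(x - y == 0 for x in domain) for y in domain)

    assert forall_exists is True
    assert exists_forall is False
    ```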

    Longer text than usual, because I love teaching quantifiers. Every statement of mathematics feels their influence, but often people are not taught to appreciate it. This can make things very confusing. But at the same time, they’re very simple to understand, once you put some work into it. And when you do, you start to see their influence, and are a lot closer to being able to do mathematics than you were when you were blind to them.

     
  9. Set Theory. I don’t really know a lot about real set theory, but I know that it was founded by Georg Cantor in an effort to understand various magnitudes of infinity. One of the curious things about infinity is that by subtracting off small numbers, you still get infinity, and moreover the infinity you are left with is the same size as the one you started with. However, to actually sit down and do this can sometimes give rise to tremendous complications. This proof explores one of them.

     
  10. Graph Theory. Graph theory is the study of graphs, which are tools for studying the interactions between objects at a very abstract level. It is arguably the second-most applicable field of mathematics, after linear algebra, because its central objects lend themselves well to modeling many kinds of phenomena. It has been gainfully applied to networks, traffic flow, electrical circuits, decision analysis, and game theory. It is also extremely applicable inside of mathematics because of the deep relationship between graphs and certain algebraic objects.

    Ironically, graph theory proofs tend to be pretty dull to read. They come from these amazing pictures, and then they get translated into linear writing. People who work with graphs have (collectively) gotten very good at doing this, and there is a very rich language with which to do it; so rich that it is not even very hard to skip over the pictures entirely. You can read the text and follow its validity, and the proof is over. But the proofs are always much more satisfying if you have a pencil in hand.

    I tried to put in a couple of pictures that came up while I was thinking about this proof, but there were many, many more.

     
  11. A selection of the pictures which produced this theorem.

     
  12. Measure Theory. Measure theory is a subsection of analysis which was invented by Lebesgue in order to provide an appropriate setting in which to generalize the concept of an integral. Its central objects of study are (unsurprisingly) called measures, which I mentioned briefly when I was going on about the Cauchy-Schwarz inequality.

    Basically, a measure is a rule that assigns numbers to sets in a way that captures many of our intuitive ideas about “size”. Area and volume are the measures that we are used to, and it turns out to be (frightfully) difficult to construct almost any measure — including those two. However, there is at least one simple measure: the counting measure, which simply assigns to a finite set the number of elements in that set, and assigns ∞ to any infinite set.
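
    The counting measure is simple enough to sketch in a couple of lines (finite sets only here; the real definition also sends infinite sets to ∞):

    ```python
    def counting_measure(s):
        # on a finite set, the counting measure is the number of elements
        return len(s)

    A = {1, 2, 3}
    B = {10, 20}
    assert A.isdisjoint(B)
    # additivity on disjoint sets, the basic property a measure must have
    assert counting_measure(A | B) == counting_measure(A) + counting_measure(B)
    assert counting_measure(set()) == 0  # the empty set has measure zero
    ```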

    This theorem is particularly interesting in that it is entirely unconcerned with convergence of these sums; if one diverges, the other does as well. If only sequences were as well-behaved as series…
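
    For finitely many nonnegative terms the interchange is easy to watch happen; the theorem’s content is that this survives the jump to infinite sums. A toy check of my own:

    ```python
    # a finite grid of nonnegative terms a[i][j] = (i+1)(j+1)
    grid = [[(i + 1) * (j + 1) for j in range(4)] for i in range(3)]

    row_first = sum(sum(row) for row in grid)  # sum over i, then over j
    col_first = sum(sum(grid[i][j] for i in range(3)) for j in range(4))  # j, then i

    assert row_first == col_first == 60
    ```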

     
  13. Failure. In the attempt to solve problems, everyone will at some point fall short of success. In mathematics, as in life, a failure is often an opportunity to reassess a situation. As I’ve mentioned before on the blog, failure is not bad, but it is important that a mathematician is able to communicate the attempts E made, explain precisely the difficulties which stymied them, and hopefully recover partial results.

    This is the less glamorous side of the work I do (if you can imagine that). It would be a bit blasé to say that this is a more typical homework writeup, not to mention entirely false. But what is true is that there have been many occasions when if I didn’t write up “solutions” like this I wouldn’t have much to write up at all.

    This wasn’t what I was planning on putting up today, but I think it’s important to share.

    (Though, the more I think about it, the harder I find it to articulate why… but I’d bet that probably has less to do with it being inappropriate and more to do with that I’ve been up for the last *coughcough* hours hacking away at this assignment. )

     
  14. Topology. Topology is the study of spaces that come defined with an inherent notion of nearness. It is often described as “rubber sheet geometry”, as it is the study of properties that are preserved under continuous deformations, which we might imagine as being like stretching a sheet of rubber.

    As far as I know, an H-S space is an object of my own invention; it is certainly my own name. I created this problem for an analysis student who was unable to meet with me for tutoring this week. E has a hard time getting concepts to interact, so I thought I would see how E reacts to something entirely “out there”. Maybe it will help to have something with fewer strings attached, and hopefully the practice will be good. I don’t know if E follows the blog, so I don’t want to talk about it anymore yet :)

    I intended to have a different piece of content today. I also had a different idea about when “today” was. Both situations arose from trying to prove a theorem and then just going entirely down the wrong path.

     
  15. Harmonic analysis. Harmonic analysis is a subfield of functional analysis (which is itself a subfield of linear algebra). It is a generalization of Joseph Fourier’s insight that periodic functions — those which eventually repeat themselves — can be broken down into easily-understood waves in a relatively simple way.
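
    A numerical sketch of that insight (my own toy example, using the classical square-wave series rather than anything from the post): the odd-harmonic sine series (4/π) Σ sin((2k+1)x)/(2k+1) converges to the square wave sgn(sin x) away from its jumps.

    ```python
    import math

    def square_wave_partial(x, terms=200):
        # partial Fourier sum for the square wave sgn(sin x)
        return (4 / math.pi) * sum(
            math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(terms)
        )

    # away from the jump points, the partial sums hug the values ±1
    assert abs(square_wave_partial(1.0) - 1.0) < 0.05
    assert abs(square_wave_partial(math.pi + 1.0) + 1.0) < 0.05
    ```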

    The Hilbert transform is a mechanism for turning one function into another; it is the usual example of a singular integral operator, which is a very technical thing. It is not at all unreasonable to devote an entire career to studying singular integral operators.

    Despite falling under the purview of analysis, the heavy analytic techniques are hiding in Theorem 2.6.1 and Theorem 2.5.4. This proof really has more of an algebraic flavor.

    In my opinion, mathematics education as it is most commonly structured has certain “black belt” courses, where after understanding the material in them, you become able to access a supply of knowledge that is much more rich than anything you had access to before. Functional analysis is definitely one of those courses. It’s kicking my ass, but the professor is sympathetic and I’m the only one who goes to office hours :P