(no subject)
One of the beautiful things about mathematics is that there are often relationships between seemingly disparate "objects" in maths that end up tying everything together. Sometimes this is apparent when something shows up in an unexpected place. For example, Stirling's approximation for factorials involves π, which is surprising if you only know π in relation to circles and don't know that π shows up all over the place.
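To see that π really does show up in a formula about factorials, here's a quick numerical sketch (just an illustration using Python's standard math module, nothing from the text itself):

```python
import math

# Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n
# Note pi appearing in a formula that is, on its face, about counting.
def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20):
    exact = math.factorial(n)
    approx = stirling(n)
    print(f"n={n}: exact={exact}, approx={approx:.1f}, ratio={approx / exact:.4f}")
```

The ratio creeps toward 1 as n grows, which is the sense in which the approximation (π and all) captures the factorial.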

The other common way the interconnectedness happens can be somewhat annoying or difficult for a student trying to understand what is going on. It is very common for a mathematical structure which has certain nice properties to have other nice properties as well. Not only is it common for a mathematical structure which has property A to also have properties B, C, and D, but it is very often the case that only structures with property A have B, C, and D.

I'll use an example from linear algebra. Those reading this who speak math will understand what I'm saying is true. Those who don't will read this as gibberish but should see what I mean anyway.

There's a <s>class</s> <s>set</s> <s>group</s> type of n by n matrix which has the following properties (assuming A is a matrix of this type):

  1. All the rows of A are linearly independent.
  2. All the columns of A are linearly independent.
  3. The rank of A is n.
  4. There exists a unique n by n matrix B such that BA = AB = I (the n by n identity matrix).
  5. The determinant of A is nonzero: det(A) ≠ 0.
  6. The equation Ax = b has a unique solution x for any b.
  7. The equation Ax = 0 has the unique solution x = 0.
  8. The reduced row-echelon form of A is the identity: rref(A) = I.

If any one of those properties is true for a given n by n matrix then they all hold. And that's beautiful because they each give a different view of the matrix, what you can do with it, and you can show further interconnections. You can, for example, use the property of determinants det(A)det(B)=det(AB) (for n by n matrices A and B) to show that if A and B are both of the type we are discussing then their product AB is as well. Wikipedia lists 16 properties of this type of matrix, double what I have here. So this type of matrix is generally considered "nice" (although that's not a formal name for it).
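To make the "they all hold together" point concrete, here is a small sketch (my own illustration, assuming NumPy is available) that checks several of the properties above on one particular 2 by 2 matrix, plus the det(A)det(B) = det(AB) fact:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])  # one nonsingular 2x2 matrix
B = np.array([[0.0, 1.0], [1.0, 3.0]])  # another nonsingular 2x2 matrix

# Property 5: nonzero determinant
assert abs(np.linalg.det(A)) > 1e-12
# Property 3: full rank (n = 2 here)
assert np.linalg.matrix_rank(A) == 2
# Property 4: an inverse exists, and A times its inverse is I
assert np.allclose(A @ np.linalg.inv(A), np.eye(2))
# det(A)det(B) = det(AB), so the product of two nonsingular
# matrices has a nonzero determinant and is itself nonsingular
assert np.isclose(np.linalg.det(A) * np.linalg.det(B), np.linalg.det(A @ B))
print("all checks passed")
```

Of course a numerical check on one matrix proves nothing in general; the point is just that the properties travel together.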

That's the beauty -- everything is connected and has unexpected properties you can use elsewhere. The annoyance comes when you want to name the thing and define it. Any of the above properties would work fine to define this type of matrix. And a couple are used. Matrices with determinants of zero are called "singular", so the matrices we are talking about are called "nonsingular". The B such that BA = AB = I is called the "inverse of A", so matrices of our type are called "invertible". So there are two different names for A, both of which are based on seemingly different properties of A. In this example, looking up "invertible matrix", "nonsingular matrix", "singular matrix", and "degenerate matrix" (a synonym of singular matrix) will all take you to the same page listing all 16 equivalent defining properties of this type of matrix, so it isn't so bad.

But that isn't always the case. You can't always easily find all the properties of a mathematical object simply by looking up its name. You can't always find all the names of a mathematical object by looking up one name. You might know all about something under one name (invertible, for instance) and have no idea what someone is talking about when they use a different name (nonsingular, for instance).

It's not obvious which properties you are supposed to know about, especially if you don't know about all of them. Quantum mechanics uses something called "Hermitian matrices" (named after the French mathematician Charles Hermite), which are defined variously as self-adjoint matrices or as matrices over C which equal their own conjugate transpose. You can do (and I have done) a lot of searching to find out what makes the self-adjoint property useful before finding out that the real utility of Hermitian matrices in quantum mechanics is that they have real eigenvalues and orthogonal eigenvectors. Which of course starts another search to find out exactly what eigenwhoozits are and why they are useful.
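Those two facts about Hermitian matrices are easy to see numerically. A sketch (again my own illustration, assuming NumPy; the matrix H is just an arbitrary example):

```python
import numpy as np

# A Hermitian matrix: equal to its own conjugate transpose
H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(H, H.conj().T)

# eigh is NumPy's eigensolver for Hermitian matrices
vals, vecs = np.linalg.eigh(H)

# Real eigenvalues: eigh returns them as a real-valued array
print("eigenvalues:", vals)

# Orthogonal (unitary) eigenvectors: the columns of vecs satisfy V*V = I
assert np.allclose(vecs.conj().T @ vecs, np.eye(2))
```

For this H the eigenvalues come out real even though the matrix has complex entries, and the eigenvectors are mutually orthogonal; that combination is what makes Hermitian operators correspond to measurable quantities.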

It is difficult, in general, as a student, to follow higher mathematical discussions not just because the concepts are hard but in part because you are expected to be familiar with all these beautiful relationships under whatever name the authors choose to use. I don't think there is any shortcut except practice, and it can be hard to know where to start.

(As an aside, another issue is that all the good words are taken. Above I have "class", "set", and "group" struck out before settling on "type" to describe the collection of nonsingular matrices. It turns out that mathematicians have already given strict formal mathematical meanings to the words "class", "set", and "group", and those meanings are all different, not necessarily obvious from their non-mathematical meanings. It so happens that, in this case, the nonsingular matrices happen to meet the definitions of sets, classes, and (depending on what is chosen as the "operation") groups, so any would have worked.)