0:46

>> Very specific.

>> Chuck.

>> Determinants plus one.

>> Determinants, right. Every orthogonal matrix has a determinant of plus or minus one; we proved that earlier.

But a proper orthogonal matrix has a determinant of plus one, which means it follows the right-hand rule.

The mirroring operations in mathematics also give you orthogonal matrices, but those have a determinant of minus one, and so we need one with a plus one determinant.

That's what we need here.

I is simply the identity, and this holds for n by n matrices, not just three by three.

And then Q is the skew-symmetric matrix, which we're very familiar with; skew-symmetric just means Q transposed is minus Q, right?

It has that mathematical property by definition.

Then we define this equation.

Â 1:42

the DCM, or an orthogonal matrix.

For three dimensional rotations, this is our attitude description: that is the DCM matrix.

But this actually also works for n dimensional spaces.

Now, where does this appear?

Structures is a big field where we often do eigendecompositions.

You end up with these symmetric positive definite matrices, and you can decompose them.

There's this huge n by n matrix you have to invert, and there are whole theories on how to do that better and extract modes.

The eigenvectors of a symmetric positive definite matrix form an orthogonal matrix.

You might have a 100 by 100 orthogonal matrix.

100 times 100, that's 100 squared elements you have to track and invert.

So with attitude, instead of a three by three, which is nine elements, we know we can get away with three coordinates.

That's the minimal description.

It turns out with an n by n, it's always n times n minus one over two.

So for a three by three it's three times two, which is six, over two, which is three.

Three degrees of freedom for attitude; if it's a 100 by 100, it's 100 times 99 divided by two, which is way less than 100 squared.
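The counting argument above fits in a couple of lines; the function name here is just for illustration:

```python
def min_coords(n):
    # An n x n proper orthogonal matrix has n*(n-1)/2 independent parameters,
    # matching the independent entries of an n x n skew-symmetric matrix.
    return n * (n - 1) // 2

print(min_coords(3))    # 3 (attitude: e.g. the three CRPs)
print(min_coords(100))  # 4950, far fewer than 100**2 = 10000 matrix elements
```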

So that's why people are interested in these things, and in different ways to decompose them: we can use attitude descriptions anytime we have to describe the time behavior of an orthogonal matrix.

So we're just going to spend a few minutes on this to highlight that the concepts you're learning actually extend to much higher dimensional spaces as well.

And there's ongoing cool research in this area.

Now this equation has some interesting properties.

The skew-symmetry we defined, we had to define the way we did earlier with that x tilde.

You know, there were zeros on the diagonal, minus x three, x two in the first row, and so forth.

Then this works.

But look at the other side.

I have identity minus Q times identity plus Q inverted, or we switch the order.

Â 3:34

With matrix math, a times b, is that the same thing as b times a?

>> Not generally, no.

>> No.

That's a big no-no, right.

You can't just flip the order of matrix math.

You break all kinds of laws right there.

Here it turns out this Cayley transform doesn't care.

So this particular version allows you to switch the order.

I can compute one ordering and the other, and they give the same result.

It's just kind of cool.

So this is a math thing that you can do, and you can easily verify it: code this in MATLAB, plug in numbers, and you will see that you get back the same matrix either way.
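As a quick sketch of that check, here in Python with NumPy rather than MATLAB (the numbers are made up for illustration):

```python
import numpy as np

# An arbitrary 3x3 skew-symmetric matrix Q (Q^T = -Q); values are illustrative
q1, q2, q3 = 0.1, -0.2, 0.3
Q = np.array([[0.0, -q3,  q2],
              [ q3, 0.0, -q1],
              [-q2,  q1, 0.0]])
I = np.eye(3)

C1 = (I - Q) @ np.linalg.inv(I + Q)   # one ordering
C2 = np.linalg.inv(I + Q) @ (I - Q)   # the other ordering

print(np.allclose(C1, C2))                 # True: the order doesn't matter here
print(np.allclose(C1 @ C1.T, I))           # True: the result is orthogonal
print(np.isclose(np.linalg.det(C1), 1.0))  # True: proper orthogonal, det +1
```

The order can be swapped because (I - Q) and (I + Q) commute with each other, so their inverses commute as well; that is special to this form, not a general matrix property.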

Now, the inverse mapping: if you have an orthogonal matrix and you're trying to decompose it back into a skew-symmetric matrix, this is the math you have to do.

I take identity minus that matrix, times identity plus the matrix inverted.

The order, again, doesn't matter.

Â 4:56

>> This is not an eye test.

>> [LAUGH] No, there's nothing different, right?

So this is kind of curious.

How often does that happen?

Not only can we interchange the matrix order in this particular mapping; in fact, you can write one subroutine to do the forward mapping and the inverse mapping.

You just have to give it an n by n matrix.

This matrix needs to either be an orthogonal matrix, in which case the output will be a skew-symmetric matrix, or you give it a skew-symmetric matrix, do the same math, and out comes an orthogonal matrix.

How cool is that?

You don't see that very often.
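That "one subroutine for both directions" idea can be sketched like this; `cayley` is a made-up name, and it assumes (I + M) is invertible, which excludes 180 degree rotations:

```python
import numpy as np

def cayley(M):
    # Same math both ways: feed it a skew-symmetric matrix and an orthogonal
    # matrix comes out; feed it that orthogonal matrix and the same
    # skew-symmetric matrix comes back. Assumes (I + M) is invertible.
    I = np.eye(M.shape[0])
    return (I - M) @ np.linalg.inv(I + M)

Q = np.array([[ 0.0, -0.3,  0.2],
              [ 0.3,  0.0, -0.1],
              [-0.2,  0.1,  0.0]])
C = cayley(Q)                  # forward: out comes an orthogonal matrix
Q_back = cayley(C)             # inverse: same routine recovers Q
print(np.allclose(Q, Q_back))  # True
```

The reason the same routine works both ways is that the Cayley transform is its own inverse: applying it twice gets you back where you started.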

Â 5:29

We'll see the Cayley transform again after we do attitude estimation; that will be one application where this form is used to decompose.

An estimation method came out of this here a few years ago as well.

So I'll show you that, but that's it: forwards and backwards it's the same math, and that's a really cool property.

We get really excited about this stuff, but it is kind of neat.

Here's a simple example, now for 3D space: you plug it in, and this is where this definition of skew-symmetry is important.

These q one, two, and threes are the CRPs.

So this is another way to get to the CRPs, besides quaternions or the stereographic mapping from them geometrically.

You can also say, from this Cayley math: I put in the DCM, I do this math, and out comes this skew-symmetric matrix whose components, with the right signs, are my classical Rodrigues parameters.

But this also works more generally: it allows us to generalize the idea of classical Rodrigues parameters to n dimensional manifolds, not just three dimensional spaces.
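A sketch of that extraction in Python with NumPy; the function name is made up, and the sign convention follows the x-tilde definition above:

```python
import numpy as np

def crp_from_dcm(C):
    # Inverse Cayley transform: Q = (I - C)(I + C)^{-1} is skew-symmetric,
    # and its off-diagonal entries are the classical Rodrigues parameters.
    # Singular at 180 degrees, where (I + C) is not invertible.
    I = np.eye(3)
    Q = (I - C) @ np.linalg.inv(I + C)
    return np.array([Q[2, 1], Q[0, 2], Q[1, 0]])  # q1, q2, q3 from [q~]

# Round trip: build a DCM from known CRPs via the forward Cayley transform
q = np.array([0.1, 0.2, -0.3])
Qt = np.array([[ 0.0, -q[2],  q[1]],
               [ q[2],  0.0, -q[0]],
               [-q[1],  q[0],  0.0]])
C = (np.eye(3) - Qt) @ np.linalg.inv(np.eye(3) + Qt)

print(np.allclose(crp_from_dcm(C), q))  # True: the CRPs are recovered
```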

Â 6:30

And so this is just some numbers; you can plug it in, do something similar on a computer coming up, and that will give you this.

If you took these numbers and plugged them into our earlier formulas on how to go from CRP to DCM, you would get back this one exactly.

Or you can put it into the Cayley transform function again, and you get back the exact same matrix, the exact same values.

So it's kind of an elegant mapping that you have.

Higher dimensional, I'm just showing you one: here's an orthogonal matrix.

You decompose it, and you can see that instead of tracking 16 numbers in your code, you could have it decomposed into six numbers.

That's more than a factor of two better as far as compactness, right?

And knowing what these six elements represent versus these 16 elements, there's a mapping to and from them.
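A quick higher dimensional sketch in Python with NumPy, using a random 4 by 4 proper orthogonal matrix (the QR construction is just one convenient way to get one):

```python
import numpy as np

# Build a random 4x4 proper orthogonal matrix
rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))
if np.linalg.det(A) < 0:
    A[:, 0] = -A[:, 0]              # flip one column to make det = +1

I = np.eye(4)
Q = (I - A) @ np.linalg.inv(I + A)  # skew-symmetric decomposition
six = Q[np.triu_indices(4, k=1)]    # the 4*3/2 = 6 numbers to track

print(np.allclose(Q, -Q.T))         # True: Q is skew-symmetric
A_back = (I - Q) @ np.linalg.inv(I + Q)
print(np.allclose(A, A_back))       # True: 6 numbers reproduce all 16
```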

We still need differential equations on how to integrate those.

So what's that differential equation?

And this one actually, if you remember, we proved this differential equation had to hold for general orthogonal matrices.

This wasn't just good for three dimensional orthogonal matrices; you could prove this formula works regardless of the dimension.

And you can use this, though I'm not going to do it in the class, to actually derive Q dots, right.

That's like your CRP rates, but in matrix form, and how this relates to the omegas; or if you have CRP rates you can go the other way.

So people have used these kinds of things to integrate these differential equations and come up with the evolution of this eigenvector matrix, versus dealing with the eigenvector matrix directly.
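For reference, the two equations being discussed here can be written out; the 3D CRP rate equation is stated without derivation, following the course's earlier notation:

```latex
% Kinematic differential equation for a general orthogonal matrix [C]:
\dot{[C]} = -[\tilde{\omega}]\,[C]

% For the 3D case, substituting the Cayley form and solving for the
% classical Rodrigues parameter rates gives
\dot{\boldsymbol{q}} = \tfrac{1}{2}\left([I_{3\times 3}] + [\tilde{\boldsymbol{q}}]
  + \boldsymbol{q}\,\boldsymbol{q}^{T}\right)\boldsymbol{\omega}
```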

Â 8:04

So different things can be done there.

I'm not going to spend much time on this; this is just where else you could go, what else is happening in this kind of research, and how we describe it.

But this is a physical example.

If you do mechanics, you would see a system mass matrix of order n, times the accelerations, equal to some forcing function.

This can be decomposed, as these matrices are symmetric positive definite.

These are orthogonal, and they satisfy these properties, so you can replace these V and V transpose with a subset of coordinates, and that's basically how this is to be applied.

Â 8:52

Okay, that's classical Rodrigues parameters.

What I want you to remember is that they're singular, but they move the singularity as far out as you can go without wrapping back onto yourself.

So 180 degrees: much better than Euler angles, but they're still singular.

There are some nice properties back and forth, like the linearization of that mapping; those are all cool things, but that's kind of the concept.

Â