For the linear system with constant coefficients, say X' = AX, we now consider the following case: the coefficient matrix A has some eigenvalue with multiplicity. We assume that A has an eigenvalue Lambda_1 of multiplicity m, where m is an integer with 2 <= m <= n. People sometimes call this number m the algebraic multiplicity of the eigenvalue Lambda_1.

Let me ask you the following question. By the definition of an eigenvalue, for any eigenvalue there is at least one corresponding eigenvector, that is, at least one nontrivial solution K of (A - Lambda_1 I)K = 0. How many eigenvectors correspond to this eigenvalue Lambda_1? Can you count them? In fact, there are infinitely many distinct eigenvectors corresponding to this single eigenvalue Lambda_1, and the reason is trivial: if the vector K_1 is an eigenvector for the eigenvalue Lambda_1, then any nonzero constant multiple of K_1, say cK_1 for any nonzero constant c, is also an eigenvector for the same eigenvalue Lambda_1. So if we simply count the number of eigenvectors corresponding to this eigenvalue, there are always infinitely many distinct eigenvectors for the given eigenvalue Lambda_1, because the coefficient c can be any nonzero real number. That count is trivial, so simply counting the number of eigenvectors is not meaningful.

Let's change the question in the following way: how many linearly independent eigenvectors are there for the eigenvalue Lambda_1? If K_1 is an eigenvector, then any nonzero constant multiple of it is also an eigenvector, but all of those eigenvectors are linearly dependent on each other, because each one is simply a nonzero constant multiple of K_1.
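The point above can be checked directly on a small example. The following sketch uses a hypothetical 2x2 matrix (not one from the lecture) whose only eigenvalue is Lambda = -3 with algebraic multiplicity 2, and verifies that every nonzero multiple of an eigenvector K_1 again satisfies AK = Lambda K:

```python
import math  # not strictly needed here, but kept for consistency with later checks

def matvec(A, v):
    """Multiply matrix A (given as a list of rows) by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# Hypothetical example: eigenvalue lam = -3 with algebraic multiplicity 2,
# since det(A - lam I) = lam^2 + 6 lam + 9 = (lam + 3)^2.
A = [[3.0, -18.0],
     [2.0,  -9.0]]
lam = -3.0
K1 = [3.0, 1.0]          # one eigenvector: (A - lam I) K1 = 0

for c in (1.0, -2.0, 0.5, 100.0):
    K = [c * k for k in K1]              # arbitrary nonzero multiple of K1
    lhs = matvec(A, K)                   # A K
    rhs = [lam * k for k in K]           # lam K
    assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
print("every nonzero multiple c*K1 satisfies A K = lam K")
```

So the eigenvectors for a single eigenvalue form a whole line (or subspace) through the origin, which is why the interesting count is the number of linearly independent ones.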
So we should ask: how many linearly independent eigenvectors are there for a given eigenvalue Lambda_1? Again, by definition, corresponding to any eigenvalue there must be at least one linearly independent eigenvector. The answer to this question is: there are k linearly independent eigenvectors for an eigenvalue Lambda_1 of multiplicity m, where the number k is at least 1 and at most m. That is a well-known fact from linear algebra. People sometimes call this number k, the number of linearly independent eigenvectors corresponding to a given eigenvalue, the geometric multiplicity. So for any given eigenvalue of multiplicity m, there are k linearly independent eigenvectors, with 1 <= k <= m.

Here we are going to consider the following situation: the coefficient matrix A has an eigenvalue Lambda_1 of multiplicity m, with 2 <= m <= n, where n is the dimension of the problem. In that case, how can you find the corresponding linearly independent solutions? We will consider the two extreme cases.

Case 2.1. I am still assuming that Lambda_1 is an eigenvalue of A of multiplicity m, with 2 <= m <= n. First, assume that corresponding to this Lambda_1 there is the maximum possible number, that is, m linearly independent eigenvectors, say K_1, K_2, ..., K_m, for the given eigenvalue Lambda_1 of multiplicity m. This is the largest possible set of linearly independent eigenvectors corresponding to an eigenvalue of multiplicity m. Then it is easy to conclude that, in this case, X_1 = K_1 e^{Lambda_1 t}, X_2 = K_2 e^{Lambda_1 t}, ..., X_m = K_m e^{Lambda_1 t}
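The geometric multiplicity can be computed as k = n - rank(A - Lambda_1 I), since the eigenvectors for Lambda_1 are exactly the nonzero vectors in the null space of A - Lambda_1 I. A minimal sketch, using hypothetical matrices (not from the lecture) and a simple Gaussian-elimination rank routine:

```python
def rank(M, tol=1e-10):
    """Rank of matrix M (list of rows) via Gaussian elimination."""
    M = [row[:] for row in M]                  # work on a copy
    n_rows, n_cols = len(M), len(M[0])
    r = 0                                      # number of pivots found so far
    for col in range(n_cols):
        pivot = next((i for i in range(r, n_rows) if abs(M[i][col]) > tol), None)
        if pivot is None:
            continue                           # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, n_rows):
            f = M[i][col] / M[r][col]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def geometric_multiplicity(A, lam):
    """k = n - rank(A - lam I) = number of independent eigenvectors for lam."""
    n = len(A)
    B = [[A[i][j] - (lam if i == j else 0.0) for j in range(n)] for i in range(n)]
    return n - rank(B)

# lam = -3 has algebraic multiplicity 2 in both hypothetical matrices below:
A_defective = [[3.0, -18.0], [2.0, -9.0]]    # only k = 1 independent eigenvector
A_diagonal  = [[-3.0, 0.0], [0.0, -3.0]]     # full set, k = 2

print(geometric_multiplicity(A_defective, -3.0))   # -> 1
print(geometric_multiplicity(A_diagonal, -3.0))    # -> 2
```

The two matrices illustrate the two extremes for m = 2: the same algebraic multiplicity can come with k = 1 or with k = m independent eigenvectors.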
form m solutions of the system: each eigenvector K_1, K_2, ..., K_m multiplied by e^{Lambda_1 t}. Because those eigenvectors are linearly independent, X_1, X_2, ..., X_m are m linearly independent solutions of the original problem X' = AX. That's the conclusion. Now let's think about one concrete example of this situation. Here, let's consider the following problem.
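Before the example, here is a quick numerical sanity check of the Case 2.1 construction, using a hypothetical 3x3 matrix (not the lecture's example) for which Lambda = -1 has multiplicity m = 2 with two independent eigenvectors. Since d/dt [K_i e^{Lambda t}] = Lambda K_i e^{Lambda t} and A K_i = Lambda K_i, each X_i = K_i e^{Lambda t} satisfies X' = AX:

```python
import math

def matvec(A, v):
    """Multiply matrix A (list of rows) by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# Hypothetical matrix: lam = -1 has algebraic multiplicity 2 and
# geometric multiplicity 2; the remaining eigenvalue, 5, is simple.
A = [[ 1.0, -2.0,  2.0],
     [-2.0,  1.0, -2.0],
     [ 2.0, -2.0,  1.0]]
lam = -1.0
Ks = [[1.0, 1.0, 0.0],
      [0.0, 1.0, 1.0]]        # m = 2 linearly independent eigenvectors for lam

for Ki in Ks:
    for t in (0.0, 0.5, 1.0):
        X = [k * math.exp(lam * t) for k in Ki]   # X_i(t) = K_i e^{lam t}
        Xprime = [lam * x for x in X]             # derivative of K_i e^{lam t}
        AX = matvec(A, X)                         # right-hand side A X_i(t)
        assert all(abs(a - b) < 1e-12 for a, b in zip(Xprime, AX))
print("each X_i = K_i e^{lam t} satisfies X' = A X")
```

The check confirms that the verification reduces entirely to the eigenvector relation A K_i = Lambda K_i; the exponential factor cancels from both sides.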