
Tensors Explained Intuitively: Covariant, Contravariant, Rank

5099 ratings | 249745 views
Tensors of rank 1, 2, and 3 visualized with covariant and contravariant components. My Patreon page is at https://www.patreon.com/EugeneK
Text Comments (586)
柴士童 (22 minutes ago)
physics is likely to be closely related to math
Three Camels (17 hours ago)
No such thing as gravity or "space/time".    Einstein was a hack and a fraud.
Tommy 77 (21 hours ago)
I just know that a vector can also be described using its dot product with the bases, but how do I show that such description will give a unique representation of that vector?
Rachel Ginsberg (2 days ago)
Also, I liked the music :) It matched the excitement I felt at finally understanding this!
Rachel Ginsberg (2 days ago)
Thank you so much. I've been trying to get some sort of intuition for what a tensor is, and this is definitely the best video I've found to help me with that.
MrDeep414 (4 days ago)
I dislike this video because I can't concentrate on the text. Please remove the music!
Dan Woodall (6 days ago)
The music is distracting
陈绍伍 (6 days ago)
Watch this video https://www.youtube.com/watch?v=f5liqUk0ZTw together with it, and you will have a better understanding.
Kevin Byrne (7 days ago)
For DECADES I've searched for an explanation of tensors that's as simple as the one that you've presented here in less than 12 minutes. Thank you, thank you, thank you ! I am in your debt.
Glad my video was helpful. Thanks.
Mark Masterburg (7 days ago)
Great video, but no music needed... it's horrible.
Claudio Saspinski (8 days ago)
I have followed Leonard Susskind's excellent videos, and these concepts were used a lot in general relativity. But today, for the first time, I see a definition and the reason for the names contravariant and covariant. It is amazing how good this video is. Thanks a lot. I think that if these videos had been available when Prof. Susskind recorded his lectures, he would have told the students to watch this video before the next class!
aka izo (11 days ago)
about 8:00 i am not sure understand it well, we have basis (5 yellow, 2 red, 3 green) so i can use T^11= (4 yellow, 1 red and 2 green), and (5 yellow, 0 red 0 green). what is the exact definition of "every possible combination of these two basis vector?? can you give some clear example of 2 rank tensor has same basis.
Rodrigo E Toobe (4 days ago)
No, we "associate" a number with each basis vector; this number is the so-called component of the vector/tensor. Suppose we have a force vector F and a displacement vector x, and we introduce a coordinate system, say Cartesian or polar/cylindrical. We express these vectors in contravariant form. [[In index notation, within YouTube comment limitations, F_i means F sub i and Fi means F super i.]] That is: Fi and xi (contravariant components, with the associated covariant basis).

If we form the tensor (Fi xj), with each combination as shown in the video, we get a fully contravariant tensor of rank 2. You know what force dot product displacement is: it is the work done by the force, and it is a number (a tensor of rank 0) which can be calculated from the tensor as the trace of the matrix (the sum over i of the products Fi*xi). But this way of doing it has a problem: in Cartesian coordinates it is fine, but in any other coordinates the work computed this way differs from the Cartesian result. How do we make it so that the work done is the same in every coordinate system? The "trick" is to make one of the two sets of vector *components*, Fi or xi, covariant (you need the metric tensor to transform from one type to the other). So we form a new tensor, Fi*x_j; this is called a mixed, or contra-covariant, tensor, and it is great because its trace is a number which is the same in every coordinate system. The determinant of the mixed tensor is invariant as well.

The old tensor (Fi*xj, with fully contravariant components) has another property: its determinant ||Fi xj|| in one coordinate system (say Cartesian) is related, through the determinant of the Jacobian ||J||, to the determinant in any other (primed) system: ||Fi xj|| = ||J|| ||Fi' xj'||. This last property is more advanced and stated incompletely here (you need to be careful which Jacobian to use).

The exponent of the Jacobian is called the weight of the tensor. The metric tensor in fully covariant or fully contravariant form transforms with the Jacobian to the power 2 or -2, so the metric (expressed in fully covariant or contravariant components) is called a *relative tensor* of rank 2 and weight 2. For a mixed (contra-covariant) tensor the weight is 0, so a mixed tensor is a relative tensor of weight 0. You can play with Cartesian and cylindrical coordinates to get used to tensors: the fully covariant metric of Cartesian coordinates is the identity matrix; the fully covariant metric of cylindrical coordinates is almost the identity matrix, but with a factor of r^2 in the middle, (1 0 0, 0 r^2 0, 0 0 1); and the mixed metric in both systems is the identity matrix.
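The invariance of the mixed tensor's trace described above can be checked numerically. Here is an illustrative pure-Python sketch in 2D; the basis vectors and the vectors F and x are made-up example values, not taken from the video:

```python
# Sketch: the trace of the mixed tensor F^i x_j recovers the
# basis-independent dot product F.x, while the trace of the fully
# contravariant tensor F^i x^j does not. (Example values only.)

def solve2(b1, b2, v):
    """Contravariant components of v in the basis (b1, b2), via Cramer's rule."""
    det = b1[0]*b2[1] - b2[0]*b1[1]
    a = (v[0]*b2[1] - b2[0]*v[1]) / det
    b = (b1[0]*v[1] - v[0]*b1[1]) / det
    return (a, b)

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

e1, e2 = (2.0, 0.0), (1.0, 1.0)   # a non-orthonormal basis
F, x = (3.0, 1.0), (2.0, 2.0)     # two vectors, in Cartesian coordinates

F_contra = solve2(e1, e2, F)          # F^i (expansion coefficients)
x_contra = solve2(e1, e2, x)          # x^i
x_cov = (dot(x, e1), dot(x, e2))      # x_i = x . e_i (dot products)

# Trace of the mixed tensor T^i_j = F^i x_j: the invariant work F . x
trace_mixed = sum(F_contra[i] * x_cov[i] for i in range(2))
# Trace of the fully contravariant tensor F^i x^j: not invariant
trace_contra = sum(F_contra[i] * x_contra[i] for i in range(2))

print(trace_mixed)    # 8.0, the same as dot(F, x) in Cartesian coordinates
print(dot(F, x))      # 8.0
print(trace_contra)   # 2.0, which does not match
```

Running it shows the mixed trace agreeing with the Cartesian dot product even though the components were taken in a skewed basis, while the fully contravariant trace does not.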
M6BrokeMe (12 days ago)
That was some of the most intense mental masturbation I have experienced in a very long time. Thanks!
Cindy Tsai (16 days ago)
Thank you !!
You are welcome and thanks.
Jeff Wiebe (18 days ago)
Appreciate the helpful instruction. Thank you.
Thanks.
Jayarava Attwood (20 days ago)
By the end I am slightly seasick from the swinging motion of the figures. It adds nothing but distraction. And the William Tell Overture is also just distracting. I'm trying to concentrate, but you keep interrupting me. Any animation effects or music should contribute something to the learning process, not add random bits of non-essential information. This is poor pedagogy.
Jayarava Attwood (20 days ago)
At 5:05 "Suppose we multiply one of the contra-variant of V with one of the contra-variants of P". It's not clear why we would do this or what it achieves.
Iadved Czahnaett (22 days ago)
Explanation of rank 3 tensor *William Tell overture ensues* ayy lmao
The Vegg (27 days ago)
The problem with your nice video is where you increase the dot product to show covariance. Sure, the increase in length of the basis vectors increases the dot product, BUT you have then entirely broken the connection whereby the dot product length, projected perpendicularly, actually reaches the tip of the vector. Following that logic, I could say that increasing the basis vectors INCREASES the contravariant components, as long as I'm not constrained to having the projection of the length reach the tip of the vector. What am I missing, please?
Jorge Guzmán (29 days ago)
What if I describe a vector in terms of its dot product with each of the basis vectors?
pendalink (1 month ago)
I watched this when it came out but now i actually need it for class, i love this channel!
Glad my video was helpful and I am glad you like my channel. Thanks.
Jacob011 (1 month ago)
This is awesome! I FINALLY understand all that co-variant and contra-variant business. I've never seen it explained so well.
Thanks.
Nathaniel Neubert (1 month ago)
This is making me sick...
Carlos Cardenas (1 month ago)
What a great idea to explain as you do it. Thanks!!!
Thanks.
N VDL (1 month ago)
The music is well matched. Thank you
The Vegg (1 month ago)
Finally, I know of two definitions of covariant that seem not to say the same thing. https://www.google.com/search?q=how+do+you+determine+the+covariant+components+of+a+vector%3F&safe=active&source=lnms&tbm=isch&sa=X&ved=0ahUKEwjW3oKzmr3dAhUJvFkKHXzjA9QQ_AUIDigB&biw=1821&bih=868#imgrc=qUwQ6AUw4nFf5M: 1. The above image states that covariant components are projections perpendicular to the x axis, while contravariant components are parallel to the y axis. 2. Covariant components are found by taking the dot product of the vector and the basis vector. In 2, I can see how increasing the basis vector increases the scalar result (covariant). In 1, increasing the basis vector has NO effect on the projection, since the light source perpendicular to the x axis does not change and the vector length does not change (non-variant).
The Vegg (1 month ago)
Can you please relate the dot product to the explanation in this image? https://www.google.com/search?q=how+do+you+determine+the+covariant+components+of+a+vector%3F&safe=active&source=lnms&tbm=isch&sa=X&ved=0ahUKEwjW3oKzmr3dAhUJvFkKHXzjA9QQ_AUIDigB&biw=1821&bih=868#imgrc=qUwQ6AUw4nFf5M:
The Vegg (1 month ago)
In keeping with my previous question: I notice that when the covariant components double because the basis vectors are doubled, you make the 90-degree notation and the line pointing to the tip of the vector disappear. Is that because it is impossible to keep it at 90 degrees if you change the length of the covariant basis vector? Does changing the length of a basis vector from 1 to any other length (in the covariant measurement) change the angle from 90 degrees to something else? If so, what is the significance of the 90 degrees at the magic length of 1 for a basis vector? I assumed that the dot product was the projection of the vector onto the basis vector, but if that were true then changing the length of the basis vector would cause no change to the projection, and hence no change to the value of the component in the covariant measurement.
The Vegg (1 month ago)
I agree with Jesus. Also, if the dot product of a vector is tantamount to the projection of the vector on the basis, why would increasing or decreasing the basis increase or decrease the number. The projection is the same no matter the length of the basis. Thanks for your great animations.
The dot product is more than just the projection. I explain how to visualize dot products in my video at https://youtu.be/h0NJK4mEIJU Thanks.
Sorry to sound harsh, but I didn't find this very intuitive. Suppose we multiply the covariant components with the contravariant components to get this matrix. Sure... you CAN do that, but WHY would you want to do that? That is where the intuition lies, in my opinion.
Rodrigo E Toobe (4 days ago)
If you do that you get the mixed metric, which is always the identity matrix. In the video at 8:00, the components of a rank-2 tensor are numbers associated with each combination of the basis vectors (I don't know if you fell into that confusion, but I wrote this just in case). If you multiply the covariant components with the contravariant ones you get a nice matrix with two special properties (and probably more): 1) the trace of that matrix is the same in every coordinate system (it is invariant); you can check this with Cartesian and polar coordinates; 2) the determinant is also the same in every coordinate system (the determinant is the area enclosed by the two vectors). With the other tensor variants (fully contravariant or fully covariant) you can't do that.
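As a quick check of the claim that the mixed metric is the identity, here is an illustrative pure-Python sketch at a sample radius (the value of r is my own choice, not from the comment or the video):

```python
# In 2D polar coordinates (r, theta), the fully covariant metric is
# diag(1, r^2), the fully contravariant metric is its inverse
# diag(1, 1/r^2), and the mixed metric g^i_j (their product) is the
# identity matrix at every point.

def matmul2(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

r = 2.0                                   # sample point (any r > 0 works)
g_cov = [[1.0, 0.0], [0.0, r**2]]         # g_ij in polar coordinates
g_contra = [[1.0, 0.0], [0.0, 1.0/r**2]]  # g^ij, the inverse of g_ij

g_mixed = matmul2(g_contra, g_cov)        # the mixed metric g^i_j
print(g_mixed)                            # [[1.0, 0.0], [0.0, 1.0]]
```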
I love your vids
Thanks.
m33LLS (1 month ago)
Great video! Could you do a video about stress (or strain) tensors?
Zes (1 month ago)
no such thing as way or usual or not, say anyx
Cat22 (1 month ago)
Eugene Khutoryansky: Can you do 2 videos - 1. on the vacuum energy and virtual particles (quantum foam) what are they, do they have charge? can energy be extracted from the particles? etc 2. on quantum entanglement 3. Bonus: How about one on 'dark flow' where it appears galaxies are moving towards a specific point in space. Your video's are very logical and easy to follow - excellent work!
Swarna Kumar S Naik (1 month ago)
I'm seriously saying this: all your videos are tremendous.
Thanks for the compliment.
ibrahim lakhdar (1 month ago)
Contravariant and covariant components are equal in an orthonormal basis, am I wrong?
You are correct.
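A small pure-Python illustration of this point, with example values of my own choosing: in an orthonormal basis the dot-product (covariant) components equal the expansion-coefficient (contravariant) components, and in a merely orthogonal, non-unit basis they differ.

```python
# Covariant components are dot products with the basis vectors;
# contravariant components are the expansion coefficients of the vector.
# They coincide exactly when the basis is orthonormal.

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

v = (3.0, 4.0)

# Orthonormal basis: the two kinds of components agree.
e1, e2 = (1.0, 0.0), (0.0, 1.0)
cov = (dot(v, e1), dot(v, e2))       # (3.0, 4.0)
contra = (3.0, 4.0)                  # since v = 3*e1 + 4*e2
print(cov == contra)                 # True

# Orthogonal but doubled basis: the components no longer agree.
f1, f2 = (2.0, 0.0), (0.0, 2.0)
cov2 = (dot(v, f1), dot(v, f2))      # (6.0, 8.0): covariant doubles
contra2 = (1.5, 2.0)                 # since v = 1.5*f1 + 2*f2: halves
print(cov2, contra2)
```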
Philippe durrer (2 months ago)
Absolutely phenomenal video, I really wish we had had these to study from 8 years ago. I finally understood the difference between co- and contravariant... before, I just knew the definition.
keyur chodvadiya (2 months ago)
Ma'am, may I ask which software you use to make your videos?
I make my 3D animations with "Poser."
Lyman Paul (2 months ago)
very cool. great job!
Thanks for the compliment.
Kunal Garg (2 months ago)
Awesome graphics and explanation. Nice work. Keep it up.
Thanks for the compliment.
Steve S (2 months ago)
At last a simple illustration on the difference between co variant and contravariant components along with associated indexing. Brilliant.
Thanks.
Marco Furlan (2 months ago)
Did you notice the cat at 9:10?
Dimi S (2 months ago)
The narrator obviously couldn't care less about what she is talking about...
Adrián Ramírez (2 months ago)
Compared with the abstract learning in math books and engineering courses, I believe this kind of YouTube video presents a different perspective on math learning, and is very helpful for clarifying concepts and equations. Thank you.
Thanks.
Ana Maria Quintal (2 months ago)
Sometimes the music collides with the spoken reasoning. Here we have it.
Anyone know how he got the values of 1.826, 5.055 and 3.567 for the dot products at 1:04? Is there enough information on the screen at 1:04 for us viewers to be able to calculate these values? I thought we would at least need to know the angles between each basis vector (red, green, yellow) and the white vector? Also, is the length (magnitude?) of the white vector = square root of (2^2 + 4^2 + 6^2) = square root of 56? Thanks to anyone who can help!
Ken Jenks (2 months ago)
A very clear explanation of tensors. Nice, simple graphics.
Thanks.
rewtnode (3 months ago)
The oral explanations were very good. I found the fancy graphics of barely any use, though. The explanation of covariant versus contravariant was great; the graphics accompanying that explanation, however, were mostly a distraction. But when you later explain rank-2 and rank-3 tensors, I found the graphics quite useful.
Reliable Beast (3 months ago)
The spinning diagram gave me motion sickness.
Frazer Kirkman (3 months ago)
After 2 minutes you no longer explain anything; you are just making definitions. I wish you had explained why you would multiply those different basis vectors together.
RAVI SHANKER (3 months ago)
Tensors for Beginners: https://www.youtube.com/playlist?list=PLJHszsWbB6hrkmmq57lX8BV-o-YIOFsiG
Nick (4 months ago)
All very well. But dot product - well of course everyone knows what that means!
I cover dot products in my video at https://youtu.be/h0NJK4mEIJU
umeng2002 (4 months ago)
Having a good instructor makes a night and day difference when learning more advanced subjects. Great video. Making the jump from just dealing with vectors to tensors trips up a good number of people.
-/ChaNNel713/- (4 months ago)
I beg you, make a Russian translation. Translate into Russian, pleeeease.
Akshay Sunil Bhadage (4 months ago)
3:53 Avengers:Infinity War vibes for a moment because of music
Ian Haggerty (4 months ago)
Yessss! Finally an explanation behind the terminology "covariant" and "contravariant". It's alien language like this that can really throw me off when learning new topics in physics & math. MASSIVE props to you.
Thanks.
Cees Timmerman (4 months ago)
So a tensor (always 3-dimensional?) has rank n, where n is the number of vectors it combines?
Mj B (4 months ago)
Great video - just for future videos, keep in mind that sound/music is counterproductive for some types of learners, like me. VERY distracting. And I like the music ;-)
foreverseethe (5 months ago)
I don't find this intuitive in the least. Misleading title. The term intuitive should be reserved only for concepts that laymen can generalize from everyday experience.
Robert Collins (5 months ago)
The video seemed to do a good job, but the background music was so distracting that I gave up about halfway through. (I liked the music, and I liked the video, but I could not understand the narrative because of the music.) Remove the music and you will have a much better video.
carmatic (5 months ago)
can you explain what a dot product is? what does it have to do with 90 degree angles?
carmatic (5 months ago)
super helpful, thank you!
I cover dot products in my video at https://youtu.be/h0NJK4mEIJU
Swapnil Udgirkar (5 months ago)
I found the background music a little distracting. Maybe tone it down a bit? The content was superb though. Great visuals and explanation! Thanks a lot :)
Daniel Ribas Tandeitnik (5 months ago)
Nice video, but the classical music was very odd... thank you very much!!
Thomas Lehner (5 months ago)
Good video but distracting music
Andrew Brown (5 months ago)
FINALLY A HELPFUL VISUAL REPRESENTATION!! I’ve been stuck on intuiting covariant vectors for YEARS! I think I get it now, it’s the *components* of the vector that are really covariant or contravariant, not the invariant/intrinsic vector itself
Faw Bri (6 months ago)
Hi, thanks for the video and the explanations. In the beginning of the video you say "if we double the length of the basis vectors, the dot product doubles". If V = (2, 0) in the basis e1 = (1, 0), e2 = (0, 1), then V.e1 = 2. But if e1' = (2, 0), then V in the new basis would be V = (1, 0), and V.e1' = 2. So why didn't you express V in the new basis for the dot product, when you did so for the normal components of V?
rudolf gelpke (5 months ago)
(First I thought "what a sensible explanation" ... then I realized I didn't get the covariant case, having the impression it played out just like the contravariant case ... but days later ...) (As of now, edited, my comment doesn't fit here as a reply to Faw Bri.) I believe I understand now. Before, I was wrong on two points: 1) I did not fully understand the dot product. It goes like (V dot E = |V| |Ê| cos(angle V-Ê)). Having learned the dot product in the context of coordinate systems with orthonormal basis vectors (all basis vectors at right angles to each other and of UNIT length), I IGNORED the basis vector's magnitude as a factor (it used to always be 1, because of the unit basis vectors). 2) Even though it is explicitly stated in the video, I still did not realize that the new component equals the dot product itself. Instead, I wrongly assumed the new component to be that multiple of the basis vector length which equals the length of the projection of vector V onto that basis vector Ê (as in the contravariant case, where the component is a multiple of the pertaining basis vector).
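The realization above, that the dot product carries the basis vector's magnitude as a factor, can be checked with a short pure-Python sketch (the vector and basis values below are illustrative choices of my own):

```python
# Doubling a basis vector doubles the covariant component (the dot
# product picks up the basis vector's magnitude), while the
# contravariant component is halved: half as many copies of the
# longer basis vector reach the same tip. The vector itself is fixed.
import math

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

v = (3.0, 0.0)           # the vector itself never changes
e = (1.0, 0.0)           # unit basis vector along x
e_doubled = (2.0, 0.0)   # same direction, twice the length

cov, cov_doubled = dot(v, e), dot(v, e_doubled)            # 3.0 -> 6.0
contra, contra_doubled = v[0] / e[0], v[0] / e_doubled[0]  # 3.0 -> 1.5

# V dot E = |V| |E| cos(angle): the basis magnitude is an explicit factor
check = math.hypot(*v) * math.hypot(*e_doubled) * math.cos(0.0)
print(cov_doubled, check)   # both 6.0
```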
Paul (6 months ago)
The graphics are a brilliant way of explaining multi-dimensional components, but a generic comment: Dear Educators, please don't put music of any kind on presentations. It's very distracting (annoying, even). Conversely, imagine going to a concert and having a voice-over throughout!
Antonio Alvarez (6 months ago)
Thank you!! A true hero for science :D
Thor Palmer (6 months ago)
Brilliant channel
Thanks.
Anand Nataraj (6 months ago)
How do you make these animations?
I make my 3D animations with "Poser."
h (6 months ago)
Thanks for using jokerman
Dave Humphreys (7 months ago)
Another excellent video! But it's not clear to me, when you create the matrices of the 'products' of the various components of the two vectors, whether the 'product' is a dot product or a cross product. Also, no explanation is given as to WHY you would want to mix the covariant and contravariant components of the vectors in that way. Lastly, what does the final matrix describe? Is it the addition of the two vectors, or the multiplication of them, or what?
Мики Ricky (7 months ago)
V and P for the Tensors, Yes yes, I can sense their relationship, subliminally they will become one.
Shanid haneef (7 months ago)
where was you all this time dear <3 hahaha
Ken Ess (7 months ago)
The background music is too loud and disturbs concentration.
Concepts Made Easy (7 months ago)
This is a much better and more intuitive description. God bless.
Peter Haggstrom (7 months ago)
The distinction between covariant and contravariant components is frequently obscured. Indeed in the classic treatment of tensors that Einstein studied (Tullio Levi-Civita's works) it is all about linear transformations, Jacobians, Cramer's rule and so on so that the basic point made in this video  gets lost.  The point Eugene makes in this video is sort of made by some educators but not as clearly as this as far as I can recall.  Well done !  Lot of work in those animations too.
Thanks for the compliment.
Bishal Adhikary (8 months ago)
If you don't mind, may I ask which software do you use for these animations?
Poser.
Steve Taylor (8 months ago)
I've zoned out, listening instead to the TROMBONES!
Carlos Eduardo Nuñez (8 months ago)
Clear explanation. Nice. Congratulations!
Thanks.
Andrea Calaon (8 months ago)
Excellent video! Many ignore the origin of the co- and contra- prefixes. Maybe a reference to the reciprocal basis would help visualize the two dual representations of vectors in non-orthogonal reference frames even more.
Tempestas Praefert (8 months ago)
Information density is a bit low, even when on 2x speed. The constant movement of the "3d objects" is a bit unnecessary. I still hit that like button, because the matter discussed is quite abstract and the explanation splendid! Well done ;-)
Physicsnerd1 (8 months ago)
Excellent Eugene. Great explanation and visual of co-variant, contra-variant, and sub/super scripts. Nice to grasp the concepts and rules of the game. I have had two different physics instructors who couldn't explain what you have put so succinctly. I have also read many texts that convoluted such simple material. I look forward to watching more of your videos. Thank you so very much!
Thanks. I am glad you liked my video and I hope you enjoy my other videos too.
fayaz ahmad (8 months ago)
l luv u very much
Francesco Bolleri (8 months ago)
Dear Eugene, you are truly a great teacher! I'm spreading your channel to all the people who, like me, are interested in these topics. I wanted to ask: are you planning to make a video about quaternions? It would be great to finally have one as clear as what you did for tensors. Thanks for all your work.
I will add quaternions to my list of topics for future videos. Thanks for the compliment and thanks for encouraging people to visit my channel.
Amar Samar (8 months ago)
abbabbbabbabbababbabbabababbabaaaa....!!!! Awesome video
kareszt (8 months ago)
1 week - internal thought. Google provided.
Amy de Buitléir (8 months ago)
Top-notch video! I did find the music a little distracting, though.
Cool Tae (8 months ago)
This was really helpful. Thank you for uploading. 😊
Thanks.
giordano bruno (8 months ago)
The video is good; it would be better with subtitles in Spanish.
M P (8 months ago)
Too quiet
Arman Tavakoli (8 months ago)
Very nice explanations; I love them. Thanks a lot!
M M (8 months ago)
Excellent video!!
Thanks.
Prenom Nom (8 months ago)
Simply EXCELLENT. I never post comments on YouTube but this deserves to be the TOP video in any search on the topic.
Thanks. I am glad you liked my video.
Please start a series on Deep Learning. Would be really helpful for aspiring AI engineers. It is after all the future of the industry. One more humble suggestion as your student. For the kind of clarity your videos have, this channel can fill in the gap in the learner’s needs in the research. One such gap is the complexity in understanding the research papers of interesting AI projects. If you could make videos explaining the latest innovative ideas available as research papers, it will be very helpful.
Matthew Pharr (9 months ago)
The music when you got to rank 3 made me laugh
Ole Olee (9 months ago)
Wow, awesome work. Very nice visualisation, congrats! I would appreciate such a great explanation even more for the Riemann tensor! Which program are you using for the visualisations?
Thanks. I make my 3D animations with Poser. Also, the Riemann tensor is covered in my video on Einstein's Field Equations at https://youtu.be/UfThVvBWZxM
Mauro Cruz (9 months ago)
I simply can't understand why this topic is so entangled in the books, when you just made it so easy!
Pedro Novak (9 months ago)
One of the best math and physics channels out there!
Thanks for the compliment.
yogran1 (9 months ago)
If you increase the basis vectors and the dot product increases, i.e. the lengths of the lines are increasing, then won't the vector itself also get bigger?
No, the vector itself stays the same length.
