The article I'm linking here was written back in 2001. It is "old" in technology terms, yet it is still completely accurate and relevant!
Java, C#, .NET, database systems like MySQL, and the like all benefit from a developer with a nice, solid understanding of how "lower level" technology works. The linked article delves into a bit of C code of the sort that these higher-level languages are ultimately built on.
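To make that concrete, here is a minimal sketch (my own illustration, not code from the article) of roughly what a high-level string concatenation like `a + b` has to do underneath: measure both operands, allocate a fresh buffer, copy, and terminate. The function name `concat` is mine.

```c
#include <stdlib.h>
#include <string.h>

/* Roughly what a high-level string "a + b" does under the hood:
   allocate a new buffer big enough for both, copy each operand in,
   and append the terminating NUL. Caller must free() the result. */
char *concat(const char *a, const char *b) {
    size_t la = strlen(a), lb = strlen(b);
    char *out = malloc(la + lb + 1);   /* +1 for the trailing '\0' */
    if (!out) return NULL;
    memcpy(out, a, la);
    memcpy(out + la, b, lb);
    out[la + lb] = '\0';
    return out;
}
```

None of this is visible in Java or C#, but every allocation and copy still happens; that hidden cost is exactly what the article is about.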
I will be forthcoming on this: even today, I do not agree that universities should remove a C programming class from any Computer Science curriculum. I believe it teaches a vital "low level" understanding of how software fundamentally works. Sure, one level further down we get to assembly and all that good stuff, and while I think that should be taught as well to some degree, maybe as an elective, C is probably the "best" starting point.
I also agree with Joel, the article's author, on the essence of this quote:
I am actually physically disgusted that so many computer science programs think that Java is a good introductory language, because it's "easy" and you don't get confused with all that boring string/malloc stuff but you can learn cool OOP stuff which will make your big programs ever so modular. This is a pedagogical disaster waiting to happen. Generations of graduates are descending on us and creating Shlemiel The Painter algorithms right and left and they don't even realize it, since they fundamentally have no idea that strings are, at a very deep level, difficult, even if you can't quite see that in your perl script. If you want to teach somebody something well, you have to start at the very lowest level.
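A Shlemiel The Painter algorithm, in Joel's telling, is one that redoes all its earlier work on every step. The classic string example is calling `strcat` in a loop: because C strings carry no length, `strcat` must walk the whole destination from the start on every call, turning an O(n) job into O(n²). A minimal sketch of the bad pattern next to the fix (my own illustration, with function names of my choosing):

```c
#include <string.h>

/* Shlemiel version: strcat rescans dest from index 0 on every
   iteration, so joining n words costs O(n^2) character reads. */
void join_shlemiel(char *dest, const char *words[], int n) {
    dest[0] = '\0';
    for (int i = 0; i < n; i++)
        strcat(dest, words[i]);   /* walks the whole string each time */
}

/* Fixed version: remember where the string currently ends and
   append there, so the total work is O(n). */
void join_fast(char *dest, const char *words[], int n) {
    char *end = dest;
    for (int i = 0; i < n; i++) {
        size_t len = strlen(words[i]);
        memcpy(end, words[i], len);
        end += len;
    }
    *end = '\0';
}
```

Both produce the same string; only the second stops repainting the fence from the start each time. A programmer who has never touched C can write the first version in Java or Perl without ever knowing why it is slow, which is precisely Joel's point.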