I learn something new every day. I have to learn constantly to keep up with the relentlessly fast-changing IT world. I also forget most things I’ve learned – to conserve the finite amount of developer memory I have – but that is not the point. I find that I have only really understood something if I’ve gone through the “understanding dip”. Most of the time, when I believe I’ve understood something initially, I find out that I actually haven’t. Still, my subjective feeling of having understood it is initially quite strong. This is sufficient most of the time, because most of the time you’re not entirely wrong, just not really correct (and most of the time your peers are none the wiser, because we’re talking about non-trivial subject matters here). Upon more thorough learning, I tend to go through a “dip”, which is the feeling that you understand less than before. Eventually, through intensive study that allows time for reflection, you can come out of the dip with a better understanding than before – wiser than before. If I’m studying something and have not had this “dip” feeling, I don’t trust that I’ve fully understood the subject.
Most of the time, IT practitioners today don’t have the time to ride the full learning curve all the way to the end. By the time you’ve learned something (gone some way up that curve), the next thing is knocking on the door and you start on the next curve. The learning curve is more often related to developer “productivity” than to “understanding”. If you cheat and skip the “learning curve”, your productivity suffers. If you don’t ride the “understanding dip” to the end, you’re likely to be simply wrong about a lot of the things you think you know – which is potentially a lot worse.