Tuesday, November 8, 2016

The Fundamentals

"What changed me from an amateur into a professional was getting a really firm grip on the fundamentals." -- Toshiro Kageyama (7-dan professional Go player)

It was pretty much the only advice I remember my dad giving me repeatedly, both before I started undergrad and before I started my master's just a few months ago: focus on the fundamentals. He may have said other things, but "focus on the fundamentals" is the only phrase that really stuck.

You might have heard the story of Da Vinci and the egg, which illustrates this idea:
It is said that when Da Vinci first came to Verrocchio's workshop, he was told to draw eggs. Day after day, he drew eggs and nothing else. Eventually he grew tired of it and went to his master to complain. Verrocchio explained the deeper significance: "Drawing eggs is not a simple thing. Even with the same egg, if you change the angle of observation, the light changes too, and you will find its shape is different." Da Vinci suddenly understood his master's purpose. From then on, he drew eggs with an open mind, and this practice laid the foundation for his later achievements.
The fundamentals are often less interesting than whatever is new and shiny. Mastering them requires patience and an honest self-assessment of how much one actually understands. It's always so tempting to "move fast and break things". But a firm grasp of the fundamentals is necessary to intuit deep connections and do meaningful work.

There's a second part to the argument for focusing on the fundamentals. Popular research will move on, and whatever is new and shiny now will cease to be important in a few years. The skills that stay relevant in the long run are the fundamentals, the things that won't change or go out of favour. This matters even more in a field like machine learning that moves at lightning speed.

But what exactly are the fundamentals of machine learning? There are the obvious tools like linear algebra and multivariate calculus. There's regression and its generalizations, gradient descent and its second-order extensions, back-propagation and the like. What about all the types of neural networks with their many acronyms: CNN, RNN, LSTM, ...? At what point do we break away from the fundamentals and find ourselves in the arena of popular research, the kind of thing that will cease to be important in a few years?
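
To make one of those candidates concrete, here's a minimal sketch of gradient descent fitting a least-squares regression. It's my own toy example, not anything from a particular textbook; the data, learning rate, and iteration count are arbitrary choices:

```python
import numpy as np

# Toy data: y = 3x + 2 plus a little noise (values chosen arbitrarily)
np.random.seed(0)
x = np.random.uniform(-1, 1, 100)
y = 3 * x + 2 + 0.1 * np.random.randn(100)

# Fit y ~ w*x + b by gradient descent on the mean squared error
w, b = 0.0, 0.0
lr = 0.1  # learning rate: a step size we have to pick by hand
for _ in range(1000):
    pred = w * x + b
    # Gradients of mean((pred - y)^2) with respect to w and b
    grad_w = 2 * np.mean((pred - y) * x)
    grad_b = 2 * np.mean(pred - y)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # converges to roughly 3 and 2
```

The same loop of "compute the gradient, take a small step" underlies everything from logistic regression to training deep networks; back-propagation is just an efficient way to compute those gradients.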

What has felt most like a fundamental to me over the last few months is variational inference, along with variational autoencoders (VAEs). They come up everywhere in recent research, yet VAEs were only introduced in 2014!
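
For context, the central object in variational inference is the evidence lower bound (ELBO). As a quick reminder of the standard statement (the notation loosely follows Kingma and Welling's 2014 paper): a VAE learns an encoder q_phi(z|x) and a decoder p_theta(x|z), and training maximizes

```latex
\log p_\theta(x) \ge \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big]
                   - \mathrm{KL}\big(q_\phi(z \mid x) \,\|\, p(z)\big)
```

The first term rewards faithful reconstructions of x from the latent code z; the KL term keeps the approximate posterior close to the prior.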

It's possible that something else will take their place in another few years. If so, were they really fundamentals? Perhaps machine learning is just such a young field that the fundamentals are still being built?