I’m in Zurich doing these things:

  • Research in Transfer Learning: I am doing an AI Residency (research in industry) at Google Brain in Zurich.
    • I am researching transfer learning and learning good representations of images.
• We recently showed that you can get really good performance cheaply on new tasks when you pre-train the original model on a big dataset (among other key components we describe in our paper). Our model is called Big Transfer (BiT). You can see our paper here.
      • As of today, our model is the best on ImageNet, CIFAR-10, CIFAR-100 and other vision datasets. 🙂 And it does very well on new datasets even if you give it only a few examples per class.
• Another interesting finding was that if your model isn’t big enough, training on a larger dataset can actually make it worse (smallest dot in the rightmost column of each plot), but if your model is big enough (and you train using the simple method we describe in the paper), you benefit a lot from the larger dataset (the other dots in the rightmost column of each plot).
      • We plan to release the models trained on ImageNet-21k so other researchers and practitioners can use them – am really excited about that!
• Let me know if you want to know more. Will likely write a longer post on this later. 🙂 (A rough sketch of the general transfer-learning recipe is right after this list.)
  • Blogging (hoping to continue!)
    • Blogging about learning machine learning and programming (Python specifics and algorithm design).
    • I blog to better understand what I’m learning and to make learning easier for other people.
    • Hoping to continue blogging once I have settled down and ironed out the details with Google.
• Learning French 🙂 and ein bisschen Deutsch (a little German).
• Music: Am singing and learning to play the bass guitar (!!) at C3 Zurich (a church).
  • Reading the Bible and other books.
  • Occasionally bouldering (climbing) indoors.
• Pet Projects: Improving HelloMotions.com, an online database of debating topics.
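
To make the transfer-learning idea above a bit more concrete, here is a minimal sketch of the general recipe (this is not our BiT code): take a network pre-trained on a large dataset, swap its classification head for one that matches the new task, and fine-tune on the few examples you have. The ResNet-50 backbone, learning rate, and placeholder dataset below are illustrative assumptions, not the setup from the paper.

```python
# Minimal transfer-learning sketch (NOT the BiT code): fine-tune a network
# pre-trained on a large dataset on a small new task. Assumes TensorFlow 2.x;
# the dataset shapes and hyperparameters are placeholders.
import tensorflow as tf

NUM_CLASSES = 10  # e.g. a new task with only a few examples per class

# Backbone pre-trained on a big dataset (here: ImageNet, via Keras).
backbone = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, pooling="avg",
    input_shape=(224, 224, 3))

# Replace the original classification head with one for the new task.
model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.Dense(NUM_CLASSES),
])

# Fine-tune the whole network with a small learning rate.
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=3e-3, momentum=0.9),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"])

# few_shot_images: (N, 224, 224, 3) floats preprocessed with
# tf.keras.applications.resnet50.preprocess_input; few_shot_labels: (N,) ints.
# model.fit(few_shot_images, few_shot_labels, epochs=20, batch_size=16)
```

The point of the sketch is only the structure of the recipe: pre-trained backbone, new head, gentle fine-tuning. In the paper we show that the size of the model and of the pre-training dataset are what make this recipe work so well.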

Last updated: December 2019

(Idea borrowed from Derek Sivers.)