Back in early May, Silvia and I went to Aalto, Finland to attend the 2018 International Workshop on Machine Learning for Materials Science. Most of the talks from the conference were live streamed and are available here.
Although Materials Science is not directly related to our work, many of the tools and machine learning algorithms are the same and this conference seemed like a perfect place to get to know the state of the art of the field.
The conference had several nice posters and presentations but a few stood out.
Bjørk Hammer‘s group presented a poster on modifying the AlphaGo algorithm to optimize molecular structures or small clusters. For now the method has only been demonstrated on toy examples, but the idea is to place one atom at a time while optimizing a long-term reward. This in principle avoids the combinatorial explosion associated with e.g. nanoclusters.
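To make the sequential-placement idea concrete, here is a minimal toy sketch. It is not the group's actual method (which optimizes a long-term reward, AlphaGo-style); this version is purely greedy, placing each atom at the candidate grid site that minimizes a stand-in Lennard-Jones energy. All names and the energy model are illustrative assumptions.

```python
import itertools
import math

def pair_energy(r, sigma=1.0, eps=1.0):
    # Toy Lennard-Jones pair potential, standing in for a real energy model
    sr6 = (sigma / r) ** 6
    return 4 * eps * (sr6 ** 2 - sr6)

def total_energy(atoms):
    # Sum of pair energies over all atom pairs in the cluster
    return sum(pair_energy(math.dist(a, b))
               for a, b in itertools.combinations(atoms, 2))

def place_atoms(n, candidates):
    """Greedily place n atoms one at a time, each time choosing the
    candidate site that minimizes the cluster's total energy so far.
    (The actual method instead optimizes a long-term reward.)"""
    atoms = []
    for _ in range(n):
        best = min((c for c in candidates if c not in atoms),
                   key=lambda c: total_energy(atoms + [c]))
        atoms.append(best)
    return atoms

# Coarse 3D grid of candidate sites, spaced near the LJ minimum distance
grid = [(x * 1.1, y * 1.1, z * 1.1)
        for x in range(3) for y in range(3) for z in range(3)]
cluster = place_atoms(4, grid)
```

The greedy choice illustrates why the combinatorics matter: instead of searching all ways to arrange n atoms on the grid at once, the structure is built up one placement at a time.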
Christopher Sutton gave a very thought-provoking talk about a machine learning problem his group posed on the competition platform Kaggle. The way Kaggle works is that you put a bounty on a machine learning problem, providing the community with a data set containing example inputs and outputs. Anyone can then submit a model that tries to predict the outputs for never-before-seen inputs. The three models that perform best on a ‘secret’ test set, known only to the organizers and Kaggle, share the prize money.
In this case they offered a $5000 prize to the models that best predict both the formation energy and the band gap energy of (AlXGaYInZ)O3 compounds. The talk is available here.
This approach raises an interesting question about whether you should outsource a scientific problem to e.g. Kaggle. If you don’t have the expertise needed to solve a problem within your group or among your current collaborators, you can either actively seek out new collaborations, which might be the safe choice, or try your luck with Kaggle, which might be more likely to yield a truly novel approach.