Junction Tree Variational Autoencoder for Molecular Graph Generation. Jin, W., Barzilay, R., & Jaakkola, T. In Dy, J. & Krause, A., editors, Proceedings of the 35th International Conference on Machine Learning, volume 80 of Proceedings of Machine Learning Research, pages 2323–2332, July 2018. PMLR.
We seek to automate the design of molecules based on specific chemical properties. In computational terms, this task involves continuous embedding and generation of molecular graphs. Our primary contribution is the direct realization of molecular graphs, a task previously approached by generating linear SMILES strings instead of graphs. Our junction tree variational autoencoder generates molecular graphs in two phases, by first generating a tree-structured scaffold over chemical substructures, and then combining them into a molecule with a graph message passing network. This approach allows us to incrementally expand molecules while maintaining chemical validity at every step. We evaluate our model on multiple tasks ranging from molecular generation to optimization. Across these tasks, our model outperforms previous state-of-the-art baselines by a significant margin.
@inproceedings{jin_junction_2018,
	series = {Proceedings of {Machine} {Learning} {Research}},
	title = {Junction {Tree} {Variational} {Autoencoder} for {Molecular} {Graph} {Generation}},
	volume = {80},
	url = {http://proceedings.mlr.press/v80/jin18a.html},
	abstract = {We seek to automate the design of molecules based on specific chemical properties. In computational terms, this task involves continuous embedding and generation of molecular graphs. Our primary contribution is the direct realization of molecular graphs, a task previously approached by generating linear SMILES strings instead of graphs. Our junction tree variational autoencoder generates molecular graphs in two phases, by first generating a tree-structured scaffold over chemical substructures, and then combining them into a molecule with a graph message passing network. This approach allows us to incrementally expand molecules while maintaining chemical validity at every step. We evaluate our model on multiple tasks ranging from molecular generation to optimization. Across these tasks, our model outperforms previous state-of-the-art baselines by a significant margin.},
	booktitle = {Proceedings of the 35th {International} {Conference} on {Machine} {Learning}},
	publisher = {PMLR},
	author = {Jin, Wengong and Barzilay, Regina and Jaakkola, Tommi},
	editor = {Dy, Jennifer and Krause, Andreas},
	month = jul,
	year = {2018},
	pages = {2323--2332},
}
