kzhao.hf [at] gmail.com
I recently joined Google as a research scientist. [Curriculum Vitae]
I received my Ph.D. degree from the School of Electrical Engineering and Computer Science at Oregon State University.
My research focuses on algorithms and theory in Natural Language Processing (NLP), especially structured prediction problems such as syntactic/semantic parsing, machine translation, and textual entailment. I am also very interested in machine learning, particularly online learning and deep learning. My advisor was Prof. Liang Huang.
I have been very fortunate to intern at Google Research with Hao Zhang, Cong Yu, and Flip Korn in the summer of 2015, working on data mining from structured data; at Microsoft Research with Hany Hassan and Michael Auli in the summer of 2014; and at IBM T.J. Watson Research Center with Abe Ittycheriah and Haitao Mi in the summer of 2013, working on machine translation.
Prior to Oregon State University, I started my Ph.D. studies in the Computer Science Department at the Graduate Center, City University of New York (CUNY). I received my Bachelor of Engineering (B.Eng.) degree from the Computer Science Department of the University of Science and Technology of China (USTC) in 2010.
My publication list is also available on Google Scholar.
Liang Huang, He Zhang, Dezhong Deng, Kai Zhao, Kaibo Liu, David Hendrix, and David Mathews. LinearFold: linear-time approximate RNA folding by 5’-to-3’ dynamic programming and beam search. In Proceedings of ISMB 2019. parsing
Kai Zhao and Liang Huang. Joint Syntacto-Discourse Parsing and the Syntacto-Discourse Treebank. In Proceedings of EMNLP 2017 (arXiv, code). parsing deep learning
Liang Huang, Kai Zhao, and Mingbo Ma. When to Finish? Optimal Beam Search for Neural Text Generation (modulo beam size). In Proceedings of EMNLP 2017. machine translation deep learning
Mingbo Ma, Kai Zhao, Liang Huang, Bing Xiang, and Bowen Zhou. Jointly Trained Sequential Labeling and Classification by Sparse Attention Neural Networks. In Proceedings of Interspeech 2017 (arXiv). deep learning
Kai Zhao, Liang Huang, and Mingbo Ma. Textual Entailment with Structured Attentions and Composition. In Proceedings of COLING 2016 (arXiv, code, slides). textual entailment deep learning
Feifei Zhai, Liang Huang, and Kai Zhao. Search-aware Tuning for Hierarchical Phrase-based Decoding. In Proceedings of EMNLP 2015. machine translation
Kai Zhao, Hany Hassan, and Michael Auli. Learning Translation Models from Monolingual Continuous Representations. In Proceedings of NAACL 2015 (slides). machine translation deep learning
Kai Zhao and Liang Huang. Type-driven Incremental Semantic Parsing with Polymorphism. In Proceedings of NAACL 2015 (code, slides). semantic parsing structured prediction
Kai Zhao, Liang Huang, Haitao Mi, and Abe Ittycheriah. Hierarchical MT Training using Max-Violation Perceptron. In Proceedings of ACL 2014 (slides). machine translation structured prediction
Kai Zhao, James Cross, and Liang Huang. Optimal Incremental Parsing via Best-First Dynamic Programming. In Proceedings of EMNLP 2013. syntactic parsing
Heng Yu, Liang Huang, Haitao Mi, and Kai Zhao. Max-Violation Perceptron and Forced Decoding for Scalable MT Training. In Proceedings of EMNLP 2013. machine translation structured prediction
Hao Zhang, Liang Huang, Kai Zhao, and Ryan McDonald. Online Learning with Inexact Hypergraph Search. In Proceedings of EMNLP 2013. syntactic parsing structured prediction
Yoav Goldberg, Kai Zhao, and Liang Huang. Efficient Implementation for Beam Search Incremental Parsers. In Proceedings of ACL 2013. syntactic parsing
Kai Zhao and Liang Huang. Minibatch and Parallelization for Online Large Margin Structured Learning. In Proceedings of NAACL 2013. syntactic parsing online learning
I am cleaning up and releasing my old code for some of the papers listed above; I hope it can help people doing related research. I usually write in Python, and sometimes in C/C++ for efficiency. My experience with C# at Microsoft was fascinating (LINQ and parallelization are my favorites), and I am looking forward to seeing it become more popular. For deep learning I mostly write in Lua with Torch.
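Among the code to be released is the minibatch/parallelization trainer from the NAACL 2013 paper above. As a rough illustration of the idea only (this is not the released code; decode and features below are hypothetical placeholders), the sketch decodes a minibatch in parallel with the weights held fixed, then applies the accumulated perceptron updates sequentially:

    # Illustrative sketch of minibatch-parallel structured perceptron
    # training -- NOT the released code. `decode` and `features` are
    # hypothetical placeholders for a task-specific decoder/feature map.
    from collections import defaultdict
    from multiprocessing import Pool

    def decode(weights, x):
        # hypothetical: return the highest-scoring structure for input x
        raise NotImplementedError

    def features(x, y):
        # hypothetical: return a {feature: count} map for the pair (x, y)
        raise NotImplementedError

    def example_update(args):
        weights, (x, gold) = args
        pred = decode(weights, x)
        delta = defaultdict(float)
        if pred != gold:
            for f, v in features(x, gold).items():
                delta[f] += v
            for f, v in features(x, pred).items():
                delta[f] -= v
        return delta

    def train(data, epochs=5, batch_size=24, workers=8):
        weights = defaultdict(float)
        with Pool(workers) as pool:
            for _ in range(epochs):
                for i in range(0, len(data), batch_size):
                    batch = data[i:i + batch_size]
                    # decode the whole minibatch in parallel, weights fixed
                    deltas = pool.map(example_update,
                                      [(dict(weights), ex) for ex in batch])
                    # then apply the accumulated updates sequentially
                    for delta in deltas:
                        for f, v in delta.items():
                            weights[f] += v
        return weights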
Optimal Incremental Dependency Parser with Best-First Dynamic Programming: to be released soon.
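Until the full release, here is a minimal skeleton of the kind of best-first (lowest-cost-first) agenda loop the parser is built around; with non-negative step costs, the first goal state popped off the agenda is optimal. This is an illustrative sketch only, not the parser itself; goal_test and successors are hypothetical callbacks:

    # Illustrative best-first search skeleton -- not the parser itself.
    import heapq

    def best_first_search(start, goal_test, successors):
        # agenda holds (cost, tie_breaker, state); the tie breaker keeps
        # heapq from ever comparing states directly
        agenda = [(0.0, 0, start)]
        best = {start: 0.0}
        tie = 0
        while agenda:
            cost, _, state = heapq.heappop(agenda)
            if cost > best.get(state, float("inf")):
                continue  # stale agenda entry
            if goal_test(state):
                return cost, state  # first goal popped is optimal
            for step_cost, nxt in successors(state):
                new_cost = cost + step_cost
                if new_cost < best.get(nxt, float("inf")):
                    best[nxt] = new_cost
                    tie += 1
                    heapq.heappush(agenda, (new_cost, tie, nxt))
        return None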
EverTeX: I am not a heavy Evernote user, but its lack of LaTeX-style formula support is really annoying, so I wrote this Greasemonkey userscript for Firefox to help with LaTeX formula input and display on the Evernote website.