CoNLL 2016 - Accepted submissions

  • A Data-driven Investigation of Corrective Feedback on Subject Omission Errors in First Language Acquisition
    Sarah Hiller and Raquel Fernandez
  • Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
    Ramesh Nallapati, Bowen Zhou, Cicero dos Santos, Caglar Gulcehre and Bing Xiang
  • Analyzing Learner Understanding of Novel L2 Vocabulary
    Rebecca Knowles, Adithya Renduchintala, Philipp Koehn and Jason Eisner
  • Beyond Centrality and Structural Features: Learning Information Importance for Text Summarization
    Markus Zopf, Eneldo Loza Mencía and Johannes Fürnkranz
  • Beyond Prefix-Based Interactive Translation Prediction
    Jesús González-Rubio, Daniel Ortiz Martinez, Francisco Casacuberta and Jose Miguel Benedi Ruiz
  • Compression of Neural Machine Translation Models via Pruning
    Abigail See, Minh-Thang Luong and Christopher D. Manning
  • context2vec: Learning Generic Context Embedding with Bidirectional LSTM
    Oren Melamud, Jacob Goldberger and Ido Dagan
  • Coreference in Wikipedia: Main Concept Resolution
    Abbas Ghaddar and Phillippe Langlais
  • Cross-Lingual Named Entity Recognition via Wikification
    Chen-Tse Tsai, Stephen Mayhew and Dan Roth
  • Distributed Representation based Compositional Script Model
    Ashutosh Modi and Ivan Titov
  • Entity Disambiguation by Knowledge and Text Jointly Embedding
    Wei Fang, Jianwen Zhang, Dilin Wang, Zheng Chen and Ming Li
  • Event Linking with Sentential Features from Convolutional Neural Networks
    Sebastian Krause, Feiyu Xu, Hans Uszkoreit and Dirk Weissenborn
  • Exploring Prediction Uncertainty in Machine Translation Quality Estimation
    Daniel Beck, Lucia Specia and Trevor Cohn
  • Generating Sentences from a Continuous Space
    Samuel R. Bowman, Luke Vilnis, Oriol Vinyals, Andrew Dai, Rafal Jozefowicz and Samy Bengio
  • Greedy, Joint Syntactic-Semantic Parsing with Stack LSTMs
    Swabha Swayamdipta, Miguel Ballesteros, Chris Dyer and Noah A. Smith
  • Harnessing Sequence Labeling for Sarcasm Detection in Dialogue from TV Series 'Friends'
    Aditya Joshi, Vaibhav Tripathi, Pushpak Bhattacharyya and Mark J Carman
  • Identifying Temporal Orientation of Word Senses
    Mohammed Hasanuzzaman and Gaël Dias
  • Incremental Prediction of Sentence-final Verbs
    Alvin Grissom II, Naho Orita and Jordan Boyd-Graber
  • Joint Learning of the Embedding of Words and Entities for Named Entity Disambiguation
    Ikuya Yamada, Hiroyuki Shindo, Hideaki Takeda and Yoshiyasu Takefuji
  • Learning to Jointly Predict Ellipsis and Comparison Structures
    Alexis Cornelia Wellwood, James Allen and Omid Bakhshandeh
  • Learning when to trust distant supervision: An application to low-resource POS tagging using cross-lingual projection
    Meng Fang and Trevor Cohn
  • Leveraging Cognitive Features for Sentiment Analysis
    Abhijit Mishra, Diptesh Kanojia, Seema Nagar, Kuntal Dey and Pushpak Bhattacharyya
  • Modeling language change and evolution through context-sensitive phonetic alignment
    Javad Nouri and Roman Yangarber
  • Modelling Context with User Embeddings for Sarcasm Detection in Social Media
    Silvio Amir and Byron C. Wallace
  • Modelling the Usage of Discourse Connectives as Rational Speech Acts
    Frances Yung, Kevin Duh, Taku Komura and Yuji Matsumoto
  • Neighborhood Mixture Model for Knowledge Base Completion
    Dat Quoc Nguyen, Kairit Sirts, Lizhen Qu and Mark Johnson
  • Redefining part-of-speech classes with distributional semantic models
    Andrey Kutuzov, Erik Velldal and Lilja Øvrelid
  • Semi-supervised Clustering for Short Text via Deep Representation Learning
    Zhiguo Wang, Haitao Mi and Abraham Ittycheriah
  • Semi-supervised Convolutional Networks for Translation Adaptation with Tiny Amount of In-domain Data
    Boxing Chen and Fei Huang
  • Substring-based unsupervised transliteration with phonetic and contextual knowledge
    Anoop Kunchukuttan, Pushpak Bhattacharyya and Mitesh M. Khapra