Qiuhui Liu

Orcid: 0000-0002-6936-4107

According to our database, Qiuhui Liu authored at least 16 papers between 2017 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2024
Rewiring the Transformer with Depth-Wise LSTMs.
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), 2024

2021
Probing Word Translations in the Transformer and Trading Decoder for Encoder Layers.
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021

Learning Hard Retrieval Decoder Attention for Transformers.
Findings of the Association for Computational Linguistics: EMNLP 2021, 2021

Multi-Head Highly Parallelized LSTM Decoder for Neural Machine Translation.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

Modeling Task-Aware MIMO Cardinality for Efficient Multilingual Neural Machine Translation.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

2020
Learning Hard Retrieval Cross Attention for Transformer.
CoRR, 2020

Transformer with Depth-Wise LSTM.
CoRR, 2020

Analyzing Word Translation of Transformer Layers.
CoRR, 2020

Efficient Context-Aware Neural Machine Translation with Layer-Wise Weighting and Input-Aware Gating.
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, 2020

Lipschitz Constrained Parameter Initialization for Deep Transformers.
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020

Learning Source Phrase Representations for Neural Machine Translation.
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020

Dynamically Adjusting Transformer Batch Size by Monitoring Gradient Direction Change.
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020

2019
Why Deep Transformers are Difficult to Converge? From Computation Order to Lipschitz Restricted Parameter Initialization.
CoRR, 2019

Neutron: An Implementation of the Transformer Translation Model and its Variants.
CoRR, 2019

UdS Submission for the WMT 19 Automatic Post-Editing Task.
Proceedings of the Fourth Conference on Machine Translation, 2019

2017
Improving Chinese-English Neural Machine Translation with Detected Usages of Function Words.
Proceedings of Natural Language Processing and Chinese Computing (NLPCC 2017), 2017

