Publications

Theses

  1. Master's thesis: Multilingual model using cross-lingual word embeddings based on subword alignment and cross-task projection

Refereed International Conferences

  1. Vocabulary Adaptation for Domain Adaptation in Neural Machine Translation. Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa, EMNLP Findings 2020
  2. Wikipedia2Vec: An Efficient Toolkit for Learning and Visualizing the Embeddings of Words and Entities from Wikipedia. Ikuya Yamada, Akari Asai, Jin Sakuma, Hiroyuki Shindo, Hideaki Takeda, Yoshiyasu Takefuji, Yuji Matsumoto, EMNLP System Demonstrations 2020 [repo]
  3. Multilingual model using cross-task embedding projection. Jin Sakuma, Naoki Yoshinaga, CoNLL 2019 [slide, repo]
  4. Multilingual model using cross-task embedding projection. Jin Sakuma, Naoki Yoshinaga, ACL SRW 2019 (non-archival) [poster]
  5. Unsupervised Cross-lingual Word Embeddings Based on Subword Alignment. Jin Sakuma, Naoki Yoshinaga, CICLing 2019 [poster, repo]
  6. A Bag of Useful Tricks for Practical Neural Machine Translation: Embedding Layer Initialization and Large Batch Size. Masato Neishi*, Jin Sakuma*, Satoshi Tohda*, Shonosuke Ishiwatari, Naoki Yoshinaga, Masashi Toyoda, AUAPAF 2018 (*equal contribution)

Non-refereed International Conferences

  1. A Bag of Useful Tricks for Practical Neural Machine Translation: Embedding Layer Initialization and Large Batch Size. Masato Neishi*, Jin Sakuma*, Satoshi Tohda*, Shonosuke Ishiwatari, Naoki Yoshinaga, Masashi Toyoda, WAT 2017 (*equal contribution) [repo]

Domestic Conferences

  1. Cross-lingual transfer learning considering word order difference. Haosen Zhan, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda. NLP 2021
  2. Distant Domain Adaptation of Neural Machine Translation Based on Vocabulary Switching. Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa. NLP 2020
  3. An Accurate Multilingual Model Based on Cross-Task Projection of Word Embeddings. Jin Sakuma, Naoki Yoshinaga. NLP 2019 (Young Researcher Encouragement Award)
  4. A Multilingual Model Using Task-Specific Multilingual Word Embeddings. Jin Sakuma, Naoki Yoshinaga. YANS 2018
  5. Unsupervised Pre-training of Embedding Layers in Neural Machine Translation. Masato Neishi, Jin Sakuma, Satoshi Tohda, Shonosuke Ishiwatari, Naoki Yoshinaga, Masashi Toyoda. The 233rd IPSJ SIG-NL Meeting (2017)
  6. An Unsupervised Method for Learning Multilingual Word Embeddings Using Surface Similarity. Jin Sakuma, Naoki Yoshinaga. The 233rd IPSJ SIG-NL Meeting (2017)