Relational Knowledge and Language Models


In this talk, I walk through a few recent studies on commonsense and relational knowledge probing of pretrained language models. The following papers are covered:

  • Petroni, et al. “Language models as knowledge bases?” 2019
  • Jiang, et al. “How can we know what language models know?” 2019
  • Bouraoui, et al. “Inducing relational knowledge from BERT.” 2019
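These papers share a common setup: a knowledge-base triple is rewritten as a cloze sentence and the language model is asked to fill in the masked object. A minimal sketch of that template step, with illustrative relations and templates (not taken from any of the papers' actual data):

```python
# Sketch of LAMA-style cloze probing (Petroni et al., 2019): a triple
# (subject, relation, object) becomes a cloze query, and the model's
# prediction for [MASK] is compared against the gold object.
# These templates are hypothetical examples for illustration.

TEMPLATES = {
    "capital_of": "[X] is the capital of [Y].",
    "born_in": "[X] was born in [Y].",
}

def to_cloze(subject: str, relation: str) -> str:
    """Render the subject into the relation's template; the object slot becomes [MASK]."""
    template = TEMPLATES[relation]
    return template.replace("[X]", subject).replace("[Y]", "[MASK]")

print(to_cloze("Paris", "capital_of"))  # Paris is the capital of [MASK].
```

Jiang et al. and Bouraoui et al. then ask how sensitive the probe is to the choice of template, e.g. by mining or learning better paraphrases of each relation.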
2020-12, Cardiff NLP Reading Group: Commonsense Knowledge Probing, by asahiushio1