
Recent publications

  • Łukasz Borchmann, Łukasz Garncarek, Michał Pietruszka, Sparsifying Transformer Models with Trainable Representation Pooling, Association for Computational Linguistics [ACL] (main conference), 2022.
  • Łukasz Borchmann, Dawid Jurkiewicz, Filip Graliński, Michał Pietruszka, Tomasz Stanisławek, Karolina Szyndler, Michał Turski, DUE: End-to-End Document Understanding Benchmark, Advances in Neural Information Processing Systems [NeurIPS], 2021.
  • Łukasz Borchmann, Dawid Jurkiewicz, Tomasz Dwojak, Gabriela Pałka, Michał Pietruszka, Rafał Powalski, Going Full-TILT Boogie on Document Understanding with Text-Image-Layout Transformer, IEEE International Conference on Document Analysis and Recognition [ICDAR], 2021.
  • Łukasz Borchmann, Filip Graliński, Michał Pietruszka, Successive Halving Top-k Operator, AAAI Conference on Artificial Intelligence [AAAI], 2021.
  • Łukasz Borchmann, Jakub Chłędowski, Tomasz Dwojak, Filip Graliński, Michał Pietruszka, From Dataset Recycling to Multi-Property Extraction and Beyond, Proceedings of the 24th Conference on Computational Natural Language Learning [CoNLL], 2020.

Michał Pietruszka

Academic degree/title: Master of Science (MSc)
Status: PhD student of computer science
Contact: michal.pietruszka@doctoral.uj.edu.pl