Search results for publications relevant to "pre-training"
Limitations and Challenges of Unsupervised Cross-lingual Pre-training | no faculty affiliation
An Empirical Exploration of Local Ordering Pre-training for Structured Prediction | no faculty affiliation
The Importance of Token Granularity Matching of Pre-trained Word Vectors for Deep Learning-Based Spam Classification | no faculty affiliation
Grammatical Error Correction with Pre-trained Model and Multilingual Learner Corpus for Cross-lingual Transfer Learning | no faculty affiliation
A BERT's Eye View: Identification of Irish Multiword Expressions Using Pre-trained Language Models | no faculty affiliation
An Empirical Study of Pre-trained Transformers for Arabic Information Extraction | no faculty affiliation
MERGEDISTILL: Merging Pre-trained Language Models Using Distillation | no faculty affiliation
Multilingual Probing of Deep Pre-Trained Contextual Encoders | no faculty affiliation
Same Neurons, Different Languages: Probing Morphosyntax in Multilingual Pre-trained Models | no faculty affiliation
Variable Mini-Batch Sizing and Pre-Trained Embeddings | 2017 | Faculty of Mathematics and Physics
How Linguistically Fair Are Multilingual Pre-Trained Language Models? | no faculty affiliation
XGLUE: A New Benchmark Dataset for Cross-lingual Pre-training, Understanding and Generation | no faculty affiliation
AuGPT: Dialogue with Pre-trained Language Models and Data Augmentation | 2021 | Faculty of Mathematics and Physics
BertOdia: BERT Pre-training for Low Resource Odia Language | no faculty affiliation
On the Language Neutrality of Pre-trained Multilingual Representations | 2020 | Faculty of Mathematics and Physics
Allocating Large Vocabulary Capacity for Cross-Lingual Language Model Pre-Training | no faculty affiliation
Evaluating Pre-training Objectives for Low-Resource Translation into Morphologically Rich Languages | no faculty affiliation
IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP | no faculty affiliation
AuGPT: Auxiliary Tasks and Data Augmentation for End-To-End Dialogue with Pre-Trained Language Models | 2021 | Faculty of Mathematics and Physics
Benchmarking pre-trained language models for multilingual NER: TraSpaS at the BSNLP2021 shared task | no faculty affiliation
Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations | no faculty affiliation
MLASK: Multimodal Summarization of Video-based News Articles | 2023 | Faculty of Mathematics and Physics
Combining Static and Contextualised Multilingual Embeddings | 2022 | Faculty of Mathematics and Physics
Improving Parallel Data Identification using Iteratively Refined Sentence Alignments and Bilingual Mappings of Pre-trained Language Models | no faculty affiliation
MLASK: Multimodal Summarization of Video-based News Articles | no faculty affiliation
Introducing various semantic models for Amharic: Experimentation and evaluation with multiple tasks and datasets | no faculty affiliation
Fact Search and Analysis Tool | no faculty affiliation
Acoustic Emission and Feature Selection based on Sensitivity Analysis | 2001 | Faculty of Mathematics and Physics
A Dataset and Strong Baselines for Classification of Czech News Texts | 2023 | Faculty of Mathematics and Physics
ÚFAL at MultiLexNorm 2021: Improving Multilingual Lexical Normalization by Fine-tuning ByT5 | 2021 | Faculty of Mathematics and Physics