Mar 26, 2020
Rehbein, Ines; Ruppenhofer, Josef; Zimmermann, Victor, 2020, "A harmonised testsuite for social media POS tagging (DE)", https://doi.org/10.11588/data/KXLMHN, heiDATA, V1
A harmonised POS testsuite of web data, CMC and Twitter microtext, with word forms and STTS POS tags (plus some additional CMC-specific tags). UD POS tags have been automatically converted from the STTS POS tags. The data does not contain (manually corrected) lemma information.
Mar 26, 2020
Rehbein, Ines; Steen, Julius; Do, Bich-Ngoc; Frank, Anette, 2020, "Converter for content-to-head style syntactic dependencies", https://doi.org/10.11588/data/HE3BAZ, heiDATA, V1
A set of Python scripts that convert function-head style encodings in dependency treebanks into a content-head style encoding (as used in the UD treebanks) and vice versa (covering adpositions, copulas and coordination). For more information, see Rehbein, Steen, Do & Frank (2017).
Nov 13, 2023 - Neural Techniques for German Dependency Parsing
Do, Bich-Ngoc; Rehbein, Ines, 2023, "Datasets for Dependency Tree Reranking", https://doi.org/10.11588/data/E5NOYH, heiDATA, V1
This resource contains the datasets for dependency tree reranking in three languages: English, German and Czech. The creation, analysis and experimental results of the datasets are described in the paper: Do and Rehbein (2020), "Neural Reranking for Dependency Parsing: An Evaluation".
Mar 26, 2020
Rehbein, Ines; Ruppenhofer, Josef, 2020, "German causal language annotations and lexicon (verbs, nouns, prepositions) (DE)", https://doi.org/10.11588/data/ZHI94V, heiDATA, V1
Annotations of causal verbs, nouns and prepositions in context, plus a lexicon file for causal verbs, nouns and prepositions.
Nov 13, 2023 - Neural Techniques for German Dependency Parsing
Do, Bich-Ngoc; Rehbein, Ines; Frank, Anette, 2023, "Head Selection Parsers and LSTM Labelers", https://doi.org/10.11588/data/BPWWJL, heiDATA, V1
This resource contains code, data and pre-trained models for various types of neural dependency parsers and LSTM labelers used in the papers: Do et al. (2017), "What Do We Need to Know About an Unknown Word When Parsing German"; Do and Rehbein (2017), "Evaluating LSTM Models for G...
Mar 26, 2020
Rehbein, Ines; Ruppenhofer, Josef; Steen, Julius, 2020, "MACE-AL", https://doi.org/10.11588/data/C2OQN4, heiDATA, V1
A method for detecting noise in automatically annotated sequence-labelled data, combining MACE (Hovy et al. 2013) with Active Learning.
Mar 26, 2020
Rehbein, Ines; Ruppenhofer, Josef, 2020, "MACE-AL-TREE", https://doi.org/10.11588/data/THPEBR, heiDATA, V1
A method for detecting noise in automatically annotated dependency parse trees, combining MACE (Hovy et al. 2013) with Active Learning.
Nov 13, 2023 - Neural Techniques for German Dependency Parsing
Do, Bich-Ngoc; Rehbein, Ines, 2023, "Neural Dependency Parser with Biaffine Attention and BERT Embeddings", https://doi.org/10.11588/data/0U6IWL, heiDATA, V1
This resource contains the code of the dependency parser used in the paper: Do and Rehbein (2020). "Parsers Know Best: German PP Attachment Revisited". The parser is a re-implementation of the neural dependency parser from Dozat and Manning (2017) and is extended to use the BERT... |
Nov 13, 2023 - Neural Techniques for German Dependency Parsing
Do, Bich-Ngoc; Rehbein, Ines, 2023, "Neural PP Attachment Disambiguation Systems", https://doi.org/10.11588/data/DKWKGJ, heiDATA, V1
This resource contains code for different types of neural PP attachment disambiguation systems: a disambiguation system inspired by de Kok et al. (2017) but with a ranking loss function; a disambiguation system with biaffine attention similar to the neural dependency parser in...
Nov 13, 2023 - Neural Techniques for German Dependency Parsing
Do, Bich-Ngoc; Rehbein, Ines, 2023, "Neural Rerankers for Dependency Parsing", https://doi.org/10.11588/data/NNGPQZ, heiDATA, V1
This resource contains code for different types of neural rerankers (RCNN, RCNN-shared and GCN) from the paper: Do and Rehbein (2020). "Neural Reranking for Dependency Parsing: An Evaluation". We also include in this resource the pre-trained models of different rerankers on 3 lan... |