Please write your name in the "Presenter" and "Reviewer" columns for the papers you want to present and review. You will need to choose one paper to present and one paper to review. You may nominate your own papers, but please talk with me first.

**Papers** (an "x" in the Scheduled column means the paper has been assigned a date in the Schedule below)

| Paper ID | Paper | URL | Presenter | Reviewer | Scheduled |
|---|---|---|---|---|---|
| 1 | UL2: Unifying Language Learning Paradigms | https://arxiv.org/abs/2205.05131 | | | |
| 2 | The Geometry of Multilingual Language Model Representations | https://aclanthology.org/2022.emnlp-main.9/ | Zayd | | x |
| 3 | Interpreting Language Models with Contrastive Explanations | https://aclanthology.org/2022.emnlp-main.14/ | Hieu Man | | x |
| 4 | Structured Prompting: Scaling In-Context Learning to 1,000 Examples | https://arxiv.org/abs/2212.06713 | Viet | | x |
| 5 | Parallel Context Windows Improve In-Context Learning of Large Language Models | https://arxiv.org/abs/2212.10947 | Paul | | x |
| 6 | A Length-Extrapolatable Transformer | https://arxiv.org/abs/2212.10554 | Gabriel | | x |
| 7 | A Multitask, Multilingual, Multimodal Evaluation of ChatGPT on Reasoning, Hallucination, and Interactivity | https://arxiv.org/abs/2302.04023 | Gabriel | | x |
| 8 | RankGen: Improving Text Generation with Large Ranking Models | https://aclanthology.org/2022.emnlp-main.15/ | Timmy | | x |
| 9 | Entity Extraction in Low Resource Domains with Selective Pre-training of Large Language Models | https://aclanthology.org/2022.emnlp-main.61/ | Navya | | x |
| 10 | Gradient-based Constrained Sampling from Language Models | https://aclanthology.org/2022.emnlp-main.144/ | Timmy | | x |
| 11 | Cross-Linguistic Syntactic Difference in Multilingual BERT: How Good is It and How Does It Affect Transfer? | https://aclanthology.org/2022.emnlp-main.552/ | Hakyung | | x |
| 12 | PromptBERT: Improving BERT Sentence Embeddings with Prompts | https://aclanthology.org/2022.emnlp-main.603/ | | | |
| 13 | Active Example Selection for In-Context Learning | https://aclanthology.org/2022.emnlp-main.622/ | Amir | | |
| 14 | Prompt-based Distribution Alignment for Domain Generalization in Text Classification | https://aclanthology.org/2022.emnlp-main.690/ | Hieu Man | | x |
| 15 | Evade the Trap of Mediocrity: Promoting Diversity and Novelty in Text Generation via Concentrating Attention | https://aclanthology.org/2022.emnlp-main.745/ | Paul | | |
| 16 | Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt | https://aclanthology.org/2022.emnlp-main.790/ | Navya | | x |
| 17 | Transforming Sequence Tagging Into A Seq2Seq Task | https://aclanthology.org/2022.emnlp-main.813/ | Amir | | x |
| 18 | Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking | https://aclanthology.org/2022.emnlp-main.502/ | Hakyung | | x |
| 19 | Don’t Prompt, Search! Mining-based Zero-Shot Learning with Language Models | https://aclanthology.org/2022.emnlp-main.509/ | Viet | | x |
| 20 | Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets | https://aclanthology.org/2022.tacl-1.4/ | | | |
| 21 | Preventing Verbatim Memorization in Language Models Gives a False Sense of Privacy | https://arxiv.org/pdf/2210.17546.pdf | Zayd | | x |

**Schedule**

| Date | Paper | URL | Presenter |
|---|---|---|---|
| 04/11 | Transforming Sequence Tagging Into A Seq2Seq Task | https://aclanthology.org/2022.emnlp-main.813/ | Amir |
| 04/11 | Preventing Verbatim Memorization in Language Models Gives a False Sense of Privacy | https://arxiv.org/pdf/2210.17546.pdf | Zayd |
| 04/18 | Don’t Prompt, Search! Mining-based Zero-Shot Learning with Language Models | https://aclanthology.org/2022.emnlp-main.509/ | Viet |
| 04/18 | A Multitask, Multilingual, Multimodal Evaluation of ChatGPT on Reasoning, Hallucination, and Interactivity | https://arxiv.org/abs/2302.04023 | Gabriel |
| 04/25 | Cross-Linguistic Syntactic Difference in Multilingual BERT: How Good is It and How Does It Affect Transfer? | https://aclanthology.org/2022.emnlp-main.552/ | Hakyung |
| 04/25 | Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt | https://aclanthology.org/2022.emnlp-main.790/ | Navya |
| 05/02 | Interpreting Language Models with Contrastive Explanations | https://aclanthology.org/2022.emnlp-main.14/ | Hieu Man |
| 05/02 | RankGen: Improving Text Generation with Large Ranking Models | https://aclanthology.org/2022.emnlp-main.15/ | Timmy |
| 05/09 | Parallel Context Windows Improve In-Context Learning of Large Language Models | https://arxiv.org/abs/2212.10947 | Paul |
| 05/09 | A Length-Extrapolatable Transformer | https://arxiv.org/abs/2212.10554 | Gabriel |
| 05/16 | The Geometry of Multilingual Language Model Representations | https://aclanthology.org/2022.emnlp-main.9/ | Zayd |
| 05/16 | Prompt-based Distribution Alignment for Domain Generalization in Text Classification | https://aclanthology.org/2022.emnlp-main.690/ | Hieu Man |
| 05/23 | Structured Prompting: Scaling In-Context Learning to 1,000 Examples | https://arxiv.org/abs/2212.06713 | Viet |
| 05/23 | Gradient-based Constrained Sampling from Language Models | https://aclanthology.org/2022.emnlp-main.144/ | Timmy |
| 05/30 | Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking | https://aclanthology.org/2022.emnlp-main.502/ | Hakyung |
| 05/30 | Entity Extraction in Low Resource Domains with Selective Pre-training of Large Language Models | https://aclanthology.org/2022.emnlp-main.61/ | Navya |
| 06/06 | | | |