I-Ta Lee
Title · Cited by · Year
Multi-relational script learning for discourse relations
IT Lee, D Goldwasser
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
Cited by 29 · 2019
FEEL: Featured Event Embedding Learning
IT Lee, D Goldwasser
Association for the Advancement of Artificial Intelligence (AAAI), 2018
Cited by 21 · 2018
Ideological phrase indicators for classification of political discourse framing on Twitter
K Johnson, IT Lee, D Goldwasser
Proceedings of the Second Workshop on NLP and Computational Social Science …, 2017
Cited by 18 · 2017
A cooperative multicast routing protocol for mobile ad hoc networks
IT Lee, GL Chiou, SR Yang
Computer Networks 55 (10), 2407-2424, 2011
Cited by 8 · 2011
Weakly-supervised modeling of contextualized event embedding for discourse relations
IT Lee, ML Pacheco, D Goldwasser
Findings of the Association for Computational Linguistics: EMNLP 2020, 4962-4972, 2020
Cited by 7 · 2020
ACE–an anomaly contribution explainer for cyber-security applications
X Zhang, M Marwah, I Lee, M Arlitt, D Goldwasser
2019 IEEE International Conference on Big Data (Big Data), 1991-2000, 2019
Cited by 7 · 2019
PurdueNLP at SemEval-2017 Task 1: Predicting semantic textual similarity with paraphrase and event embeddings
IT Lee, M Goindani, C Li, D Jin, K Johnson, X Zhang, ML Pacheco, ...
Proceedings of the 11th International Workshop on Semantic Evaluation …, 2017
Cited by 5 · 2017
Adapting event embedding for implicit discourse relation recognition
ML Pacheco, IT Lee, X Zhang, AK Zehady, P Daga, D Jin, A Parolia, ...
Proceedings of the CoNLL-16 shared task, 136-142, 2016
Cited by 5 · 2016
Modeling Human Mental States with an Entity-based Narrative Graph
IT Lee, ML Pacheco, D Goldwasser
arXiv preprint arXiv:2104.07079, 2021
Cited by 1 · 2021
Commonsense Knowledge Representation and Reasoning in Statistical Script Learning
IT Lee
Purdue University Graduate School, 2020
2020
Attention-Based Self-Supervised Feature Learning for Security Data
IT Lee, M Marwah, M Arlitt
arXiv preprint arXiv:2003.10639, 2020
2020
Articles 1–11