Speaker
Description
Large Language Models (LLMs) have dominated the discussion fora on language technology for at least the past seven years. As much as LLMs have spurred progress in NLP, recent research has demonstrated that their performance seems to reach a limit which cannot be overcome with more training data. Therefore, hybrid approaches combining LLMs and Language Resources have been gaining momentum. In this talk I explore possible futures for research on semantic lexical resources in combination with LLMs and AI techniques. As examples of possible research paths, I discuss the application of the FrameNet model to the development of a tool for identifying territories prone to suffer from gender-based violence, as well as to the growing field of multimodal NLP.