
Computation in Natural Language

Code 17916
Year 1
Semester S2
ECTS Credits 6
Workload T (theory, 30h) / PL (practical laboratory, 30h)
Scientific area Informatics
Entry requirements Consolidated experience in computer programming.
Learning outcomes This curricular unit explores state-of-the-art developments in natural language processing (NLP). Its primary objective is to equip students with the skills needed to apply these technologies to complex scientific and engineering problems. Students are expected to develop the competence to design and implement systems that incorporate and leverage Large Language Models (LLMs). The learning path progresses from the classical foundations of NLP to modern architectures and large-scale models. Upon completion, students should be able to design Retrieval-Augmented Generation (RAG) systems, orchestrate resources using reference frameworks, and develop NL-based autonomous agents for practical and scientific applications.
Syllabus 1. Introduction and Fundamentals
1.1. Levels of human language analysis: lexical, syntactic, semantic, and pragmatic. Challenges and solutions. NLP and Computational Linguistics.
1.2. Classical resources for natural language (NL) manipulation (e.g., NLTK and SpaCy).
1.3. Examples and applications.
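Lexical-level analysis (1.1) can be illustrated even before turning to toolkits such as NLTK or spaCy. The sketch below is a minimal, standard-library illustration, not the course's reference code; real toolkits add POS tagging, parsing, and much more:

```python
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    # Lowercase and keep alphabetic runs only (accented letters included).
    # This is purely lexical: no morphology, syntax, or semantics.
    return re.findall(r"[a-záéíóúâêôãõç]+", text.lower())

tokens = tokenize("Natural language processing turns raw text into structure.")
freq = Counter(tokens)  # simple frequency profile of the text
```

Libraries such as spaCy wrap this step in a full pipeline (`nlp(text)` yields tokens with linguistic annotations), which is where the course picks up.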

2. Vector Semantics of NL
2.1. Sparse (TF-IDF and BoW) and dense (Word2Vec) vectors in NL.
2.2. Dense and contextualized vectors (e.g., BERT).
2.3. Document-level and lexical semantics.
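The sparse vectors of 2.1 can be sketched by hand; the TF-IDF weighting and cosine similarity below are illustrative only (in practice, libraries such as scikit-learn or gensim provide tuned implementations):

```python
import math
from collections import Counter

def tf_idf(docs: list[list[str]]) -> list[dict[str, float]]:
    # Weight each term by its frequency in the document (tf) times the
    # log-inverse of how many documents contain it (idf).
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))
    vectors = []
    for d in docs:
        tf = Counter(d)
        vectors.append({t: (tf[t] / len(d)) * math.log(n / df[t]) for t in tf})
    return vectors

def cosine(u: dict[str, float], v: dict[str, float]) -> float:
    # Cosine similarity over sparse dicts; 0.0 when either vector is empty.
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Dense vectors (Word2Vec, BERT) replace these hand-built dimensions with learned ones, but the similarity computation is the same idea.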

3. Information Manipulation with LLMs
3.1. Training, post-training, and fine-tuning of Large Language Models (LLMs).
3.2. Retrieval-Augmented Generation (RAG) systems.
3.3. Information identification and extraction in text.
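The retrieve-then-generate pattern behind RAG (3.2) can be sketched as follows. The lexical retriever and prompt layout here are illustrative stand-ins: a real system would use dense embeddings, a vector index, and an actual LLM call in place of the assembled prompt:

```python
def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    # Naive lexical retrieval: rank documents by word overlap with the query.
    q = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_rag_prompt(query: str, corpus: list[str]) -> str:
    # Augment the query with retrieved context before sending it to an LLM.
    context = "\n".join(retrieve(query, corpus))
    return (f"Context:\n{context}\n\n"
            f"Question: {query}\n"
            f"Answer using only the context above.")
```

Grounding the model in retrieved text is what lets RAG systems answer from a private corpus and reduce hallucination, which is the motivation developed in this unit.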

4. NL Resource Orchestration Environments
4.1. Prompt engineering.
4.2. LangChain and LlamaIndex frameworks.
4.3. NL-based autonomous agent systems.
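The orchestration idea in 4.2 boils down to composing a prompt template, a model, and an output parser. The framework-free sketch below mimics that pipeline; the `model` stub is a hypothetical placeholder, not a real LLM client or any specific LangChain API:

```python
from typing import Callable

def make_chain(template: str,
               model: Callable[[str], str],
               parse: Callable[[str], str]) -> Callable[[dict], str]:
    # Compose template -> model -> parser into one callable,
    # in the spirit of LangChain-style chains.
    def run(inputs: dict) -> str:
        prompt = template.format(**inputs)
        return parse(model(prompt))
    return run

chain = make_chain(
    "Translate to French: {text}",
    model=lambda prompt: f"ECHO[{prompt}]",  # stub standing in for an LLM call
    parse=str.strip,
)
```

Frameworks such as LangChain and LlamaIndex add the pieces this sketch omits: retries, streaming, tool calling, and integrations with retrieval indexes.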

5. Conversational Interaction Systems
5.1. Model alignment and safety.
5.2. Conversation modes, effectiveness, and emotions.
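Conversational systems typically represent dialogue as role-tagged turns, with a system message carrying the behavioural and safety instructions discussed in 5.1. The message schema below is an assumed, generic one (not any specific provider's API):

```python
def new_conversation(system: str) -> list[dict[str, str]]:
    # The system turn encodes alignment/safety instructions for the model.
    return [{"role": "system", "content": system}]

def add_turn(history: list[dict[str, str]], role: str, content: str) -> list[dict[str, str]]:
    history.append({"role": role, "content": content})
    return history

def truncate(history: list[dict[str, str]], max_turns: int) -> list[dict[str, str]]:
    # Keep the system message plus the most recent turns: a crude form of
    # context-window management.
    return [history[0]] + history[1:][-max_turns:]
```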
Main Bibliography 1. Jurafsky, D., & Martin, J. H. (2026). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition with Language Models (3rd ed.). Online manuscript.
2. Bouchard, L. F., & Peters, L. (2024). Building LLMs for Production: Enhancing LLM Abilities and Reliability with Prompting, Fine-Tuning, and RAG. Towards AI.
3. Oshin, M., & Campos, N. (2025). Learning LangChain. O'Reilly Media.
4. Iusztin, M. L. P. (2024). LLM Engineer's Handbook. Packt Publishing.
5. Tunstall, L., von Werra, L., & Wolf, T. (2022). Natural Language Processing with Transformers. O'Reilly Media.
Language Portuguese. Tutorial support is available in English.
Last updated on: 2026-03-16
