Time Series Forecasting: Classical to Advanced Modeling Techniques & Tools
September 2024
Task: Building understanding of time series forecasting problems, challenges, and various techniques through hands-on Python notebooks
Libraries: GluonTS, neuralforecast, Chronos, PyTorch, NumPy, Matplotlib
Series Overview: This comprehensive tutorial series covers time series forecasting from classical methods to advanced LLM-based approaches through 7 modules:
- Modules 1-2 - Cover the basics of time series forecasting, including an introduction to GluonTS
- Module 3 - Focuses on data and problem formulation for time series modeling
- Modules 4-5 - Explore classical and neural network-based forecasting approaches
- Module 6 - Covers transformer-based architectures for time series
- Module 7 - Introduces LLM-based approaches for time series forecasting
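Before the neural and transformer modules, the classical baselines from Modules 4-5 are worth internalizing. A seasonal-naive forecaster (repeat the most recent full season) is one such standard baseline; the sketch below is illustrative NumPy, not code from the notebooks:

```python
import numpy as np

def seasonal_naive_forecast(history, season_length, horizon):
    """Forecast by repeating the most recent full season of observations."""
    history = np.asarray(history, dtype=float)
    last_season = history[-season_length:]
    # Tile the last season and cut to the requested horizon
    reps = int(np.ceil(horizon / season_length))
    return np.tile(last_season, reps)[:horizon]

# Toy series with period 4: 1, 2, 3, 4, 1, 2, 3, 4, ...
series = np.array([1, 2, 3, 4, 1, 2, 3, 4], dtype=float)
forecast = seasonal_naive_forecast(series, season_length=4, horizon=6)
```

Despite its simplicity, this baseline is a common yardstick against which the neural models in later modules are judged.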
Understanding Kolmogorov-Arnold Networks: A Tutorial Series on KAN using Toy Examples
May 27, 2024
Publication: Liu et al. KAN: Kolmogorov-Arnold Networks. ArXiv, 2024
Task: Building an understanding of Kolmogorov-Arnold Networks (KAN) using B-splines as activation functions through practical coding examples and theoretical insights.
Libraries: PyTorch, NumPy, Matplotlib, Jupyter
Series Overview: This tutorial series guides you through the complexities of KANs by dissecting their core components and demonstrating their application through practical examples. Each notebook in the series focuses on a different aspect of KANs.
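The B-spline activations at the heart of KANs can be evaluated with the Cox-de Boor recursion; a minimal NumPy sketch (the helper name and uniform knot choice are mine, not from the notebooks):

```python
import numpy as np

def bspline_basis(x, knots, degree):
    """Evaluate all B-spline basis functions of `degree` at x via Cox-de Boor."""
    x = np.asarray(x, dtype=float)
    t = np.asarray(knots, dtype=float)
    n = len(t) - degree - 1  # number of basis functions
    # Degree-0 bases: indicator functions on the knot spans
    B = np.array([(t[i] <= x) & (x < t[i + 1]) for i in range(len(t) - 1)], dtype=float)
    for d in range(1, degree + 1):
        B_next = np.zeros((len(t) - d - 1, len(x)))
        for i in range(len(t) - d - 1):
            denom1 = t[i + d] - t[i]
            denom2 = t[i + d + 1] - t[i + 1]
            term1 = ((x - t[i]) / denom1) * B[i] if denom1 > 0 else 0.0
            term2 = ((t[i + d + 1] - x) / denom2) * B[i + 1] if denom2 > 0 else 0.0
            B_next[i] = term1 + term2
        B = B_next
    return B[:n]

knots = np.arange(-3.0, 8.0)          # uniform knots -3, -2, ..., 7
x = np.linspace(0.0, 3.99, 50)        # interior of the spline's support
B = bspline_basis(x, knots, degree=3)
```

A quick sanity check is the partition-of-unity property: inside the valid interval the cubic bases sum to one, which is what makes them well-behaved building blocks for learnable activations.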
Exploring LLM Transmission Chains: A Tutorial Series on Human-like Content Biases in LLMs
April 30, 2024
Publication: Acerbi et al. Large Language Models Show Human-like Content Biases in Transmission Chain Experiments. PNAS, 2023
Task: Reproducing and understanding the results regarding content biases in LLMs as observed in transmission chain experiments
Libraries: openai, rpy2, python-docx, matplotlib, pandas
Series Overview: This series includes multiple detailed notebooks, guiding you through the process of setting up, executing, and analyzing transmission chain experiments with LLMs to explore human-like content biases. Each notebook corresponds to one of the six plots in the main paper and includes three sections:
- Authors’ Results - Reproduces the paper’s findings using Python (originally conducted in R).
- Authors’ Summaries / GPT-4 Evaluation - Evaluates summaries from the original study using GPT-4.
- New Summaries / GPT-4 Evaluation - Uses new summaries from updated experiments and evaluates them using GPT-4.
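The transmission-chain procedure itself is simple to sketch: a story is passed through a chain of agents, each rewriting the previous output. In the notebooks the rewriting step is an LLM summarization call; the deterministic `truncate_half` below is a stand-in so the loop structure is clear:

```python
def run_transmission_chain(story, rewrite, generations):
    """Pass `story` through `generations` rewriting steps, keeping every intermediate."""
    chain = [story]
    for _ in range(generations):
        chain.append(rewrite(chain[-1]))
    return chain

# Stand-in for an LLM summarizer: keep only the first half of the words.
def truncate_half(text):
    words = text.split()
    return " ".join(words[: max(1, len(words) // 2)])

chain = run_transmission_chain("a b c d e f g h", truncate_half, generations=3)
```

Content biases are then measured by comparing what kinds of information survive across generations of the chain.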
Exploring Simulacra: A Tutorial Series on Simulacra’s Generative Agents
March 31, 2024
Publication: Park et al. Generative Agents: Interactive Simulacra of Human Behavior. UIST, 2023
Task: Creating and understanding generative agents proposed in Simulacra
Libraries: openai
Series Overview: This tutorial series spans four comprehensive notebooks, each designed to progressively build your understanding and skills in simulating sociological phenomena with generative agents. Check out the blog post where I break down the big ideas behind creating sociological simulations in an accessible way.
- Notebook 1.1 - Basic Simulation Structure: An introduction to the essential components of simulation, including environment interaction and the construction of a basic simulation loop.
- Notebook 1.2 - Building Generative Agents: A deep dive into the creation of generative agents, with emphasis on memory structures derived from the Simulacra paper.
- Notebook 1.3 - Agent Schedule Planning: Techniques for developing agents’ abilities to autonomously plan their schedules.
- Notebook 1.4 - Agent Cognition and Communication: Exploration of agents’ cognitive functions, such as perception and memory retrieval, and how they interact and communicate.
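The memory retrieval covered in Notebook 1.4 follows the paper's scheme of scoring each memory by a weighted sum of recency, importance, and relevance. A toy NumPy version (the weights, decay rate, and cosine-similarity relevance are illustrative choices, not the tutorial's exact code):

```python
import numpy as np

def retrieval_scores(embeddings, importances, ages, query,
                     decay=0.99, weights=(1.0, 1.0, 1.0)):
    """Score memories as w_r * recency + w_i * importance + w_v * relevance."""
    E = np.asarray(embeddings, dtype=float)
    q = np.asarray(query, dtype=float)
    recency = decay ** np.asarray(ages, dtype=float)          # exponential decay with age
    importance = np.asarray(importances, dtype=float) / 10.0  # paper rates importance 1-10
    relevance = (E @ q) / (np.linalg.norm(E, axis=1) * np.linalg.norm(q))
    w_r, w_i, w_v = weights
    return w_r * recency + w_i * importance + w_v * relevance

# Two memories: one aligned with the query, one orthogonal to it.
emb = [[1.0, 0.0], [0.0, 1.0]]
scores = retrieval_scores(emb, importances=[5, 5], ages=[0, 0], query=[1.0, 0.0])
```

The agent then feeds the top-scoring memories into its prompt, which is what makes its behavior feel coherent over time.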
Feb 5, 2024
Publication: Müller et al. Transformers can do Bayesian Inference. ICLR, 2022
Task: Using Transformers to estimate Posterior Predictive Distributions
Libraries: PyTorch
Learning objectives:
- Understand the principles of Prior-Data Fitted Networks (PFNs) and their application in integrating prior knowledge to predict the Posterior Predictive Distribution (PPD) in machine learning models.
- Acquire practical skills in defining priors, creating dataset loaders for synthetic data generation, developing transformer models for PPD approximation, formulating loss functions for specific regression tasks, and evaluating model output quality.
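The "loss functions for specific regression tasks" above refers to the PFN trick of approximating the PPD with a discretized (bar) distribution over target buckets, trained with cross-entropy. A NumPy toy of that objective (bucket edges and function name are illustrative):

```python
import numpy as np

def bar_distribution_nll(logits, y, edges):
    """Negative log-likelihood of targets y under a piecewise-constant (bar) density.

    logits: (n, n_buckets) unnormalized scores; y: (n,) targets;
    edges: (n_buckets + 1,) monotone bucket boundaries.
    """
    logits = np.asarray(logits, dtype=float)
    y = np.asarray(y, dtype=float)
    edges = np.asarray(edges, dtype=float)
    # Softmax over buckets -> probability mass per bucket
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    widths = np.diff(edges)
    idx = np.clip(np.searchsorted(edges, y, side="right") - 1, 0, len(widths) - 1)
    # Density inside a bucket = mass / width; average NLL over the targets
    dens = p[np.arange(len(y)), idx] / widths[idx]
    return -np.mean(np.log(dens))

edges = np.array([0.0, 1.0, 2.0])
loss = bar_distribution_nll([[0.0, 0.0]], y=[0.5], edges=edges)
```

Turning regression into classification over buckets is what lets a plain transformer head output a full predictive distribution rather than a point estimate.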
Data Generating Priors for Prior-Fitted Networks
Feb 5, 2024
Publication: Müller et al. Transformers can do Bayesian Inference. ICLR, 2022; Hollmann et al. TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second. ICLR, 2023
Task: Generating data according to different priors for PFNs
Libraries: PyTorch
Learning objectives: Learn how to generate synthetic data based on specified priors and utilize it for training neural networks to approximate Bayesian inference.
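A data-generating prior in this sense is simply a distribution over datasets: sample latent function parameters, then sample (x, y) pairs from the resulting function. The toy linear prior below illustrates the pattern; the actual notebooks use richer priors (GP-style for the ICLR 2022 paper, structural priors for TabPFN):

```python
import numpy as np

def sample_dataset(rng, n_points=32, noise_std=0.1):
    """Draw one dataset from a linear prior: w, b ~ N(0, 1); y = w*x + b + noise."""
    w, b = rng.normal(size=2)                 # latent function parameters
    x = rng.uniform(-1.0, 1.0, size=n_points)
    y = w * x + b + rng.normal(scale=noise_std, size=n_points)
    return x, y

rng = np.random.default_rng(0)
# A PFN is trained on a stream of such freshly sampled datasets,
# never on a single fixed dataset.
batch = [sample_dataset(rng) for _ in range(8)]
```

Because every training example is a whole dataset drawn from the prior, the network implicitly learns to do Bayesian inference under that prior at test time.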
LLMTime - Zero-shot prompting LLMs for time series forecasting
Nov 12, 2023
Publication: Gruver et al. Large Language Models are Zero Shot Time Series Forecasters. NeurIPS, 2023
Task: Weather forecasting using LLMs
Libraries: openai, tiktoken, jax
Learning objectives: Explore zero-shot prompting with Large Language Models (LLMs) for time series forecasting. In this tutorial, we aim to:
- Acquaint you with the application of machine learning techniques using Large Language Models (LLMs).
- Enhance your understanding of LLMs and the parameters influencing their behavior.
- Guide you through the essentials for successful time series prediction with LLMs.
- Translate knowledge from transformers to the realm of LLMs.
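A core idea from the LLMTime paper is serializing numbers so the tokenizer sees individual digits: values are scaled to a fixed precision, digits are separated by spaces, and values are joined with " , ". The helper names and the fixed-decimal scaling below are my simplifications of that scheme:

```python
def encode_series(values, precision=1):
    """Encode a series LLMTime-style: fixed precision, one digit per token,
    values joined by ' , '."""
    tokens = []
    for v in values:
        digits = str(round(v * 10 ** precision))  # drop the decimal point
        tokens.append(" ".join(digits))
    return " , ".join(tokens)

def decode_series(text, precision=1):
    """Invert the encoding back to floats."""
    return [int(chunk.replace(" ", "")) / 10 ** precision
            for chunk in text.split(" , ")]

encoded = encode_series([12.3, 4.5])  # -> "1 2 3 , 4 5"
```

The prompt to the LLM is then just the encoded history, and the continuation it generates is decoded back into forecast values.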
Critical Exploration of Transformer Models
Nov 7, 2023
Learning objectives: Delve into the inner workings of transformer models beyond basic applications
Key Areas:
- Adversarial Inputs: Crafting inputs to challenge language models
- Attention Visualization: Understanding focus mechanisms in transformers
- Fine-Tuning with LoRA: Implementing Low-Rank Adaptation for model refinement
- Bias Detection: Investigating biases in model responses
Note: This tutorial is an introductory exploration of transformers. It’s a starting point for more advanced study.
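Of the key areas above, LoRA is the most self-contained to sketch: instead of updating a frozen weight matrix W, one learns a low-rank update B @ A, so the effective weight is W + (alpha/r) * B @ A. The shapes and scaling follow the LoRA paper; everything else here is a toy NumPy illustration:

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=8):
    """Linear layer with a low-rank adapted weight: W + (alpha/r) * B @ A."""
    r = A.shape[0]                      # adapter rank
    delta = (alpha / r) * (B @ A)       # low-rank update, same shape as W
    return x @ (W + delta).T

rng = np.random.default_rng(0)
d_in, d_out, r = 16, 8, 2
W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in))
B = np.zeros((d_out, r))               # B starts at zero, so training begins at W
x = rng.normal(size=(4, d_in))
y = lora_forward(x, W, A, B)
```

Initializing B to zero means the adapted model is exactly the pretrained one before fine-tuning starts, which is why LoRA training is stable from the first step.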
Molecule Attention Transformer
Apr 21, 2023
Publication: Maziarka et al. Molecule Attention Transformer. ArXiv, 2020
Task: Classification task to predict Blood-brain barrier permeability (BBBP)
Dataset: BBBP
Libraries: PyTorch, DeepChem, RDKit
Learning objectives:
- Learn key concepts required to work with molecules
- Perform critical data preprocessing tasks, such as feature extraction, graph formation, and scaffold splitting
- Explore challenges of drug discovery, particularly designing drugs that can cross the blood-brain barrier and enter the central nervous system
- Implement Molecule Attention Transformer (MAT) proposed by Maziarka et al. (2020) using a deep learning pipeline
- Train and evaluate the model on molecular datasets
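MAT's key change to standard self-attention (and the reason adjacency and distance matrices appear in preprocessing) is mixing three row-stochastic matrices: the usual softmax attention, a distance-derived matrix, and the normalized adjacency, weighted by lambdas. The inputs and lambda values below are made up for illustration:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def molecule_attention(Q, K, adj, dist, lambdas=(0.5, 0.25, 0.25)):
    """MAT-style attention: lam_a * softmax(QK^T/sqrt(d)) + lam_d * g(dist) + lam_g * adj."""
    lam_a, lam_d, lam_g = lambdas
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d))
    dist_attn = softmax(-np.asarray(dist, dtype=float))   # closer atoms get more weight
    adj_norm = adj / adj.sum(axis=1, keepdims=True)       # row-normalized adjacency
    return lam_a * attn + lam_d * dist_attn + lam_g * adj_norm

rng = np.random.default_rng(0)
n, d = 3, 4                     # 3 atoms, feature dim 4
Q, K = rng.normal(size=(n, d)), rng.normal(size=(n, d))
adj = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=float)  # with self-loops
dist = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]], dtype=float)
A = molecule_attention(Q, K, adj, dist)
```

Because each component is row-stochastic and the lambdas sum to one, the mixture is still a valid attention distribution over atoms, now biased by molecular structure.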
Accessing Research Data for Social Science [Oxford Internet Institute, MT 2022]
September 2022
DeepNote (a Jupyter notebook hosting service): DIY Notebooks.
GitHub: Repository for working on your local machine.
Programming Language: Python
Libraries: Pandas, feedparser, newscatcherapi, psaw, requests, twarc (Twitter API), requests-html
Learning objectives:
- Use Python to collect research data from the social web
- Give due consideration to the ethics of data collection
- The following topics are covered:
- Accessing RSS feeds
- Accessing Reddit and Wikipedia through APIs
- Accessing Twitter using Twitter API
- Web crawling
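As a flavor of the API-access material: the MediaWiki Action API returns JSON that plain `requests` can fetch, and keeping the request-building and response-parsing as pure functions makes them testable without network access. The parameters below follow the public MediaWiki Action API; the sample response is fabricated for illustration:

```python
def wikipedia_extract_params(title):
    """Build MediaWiki Action API query params for a plain-text page extract."""
    return {
        "action": "query",
        "format": "json",
        "prop": "extracts",
        "explaintext": 1,
        "titles": title,
    }

def parse_extract(response_json):
    """Pull the extract text out of the API's nested pages structure."""
    pages = response_json["query"]["pages"]
    return next(iter(pages.values())).get("extract", "")

# In the notebooks this JSON would come from:
# requests.get("https://en.wikipedia.org/w/api.php", params=wikipedia_extract_params(t)).json()
sample = {"query": {"pages": {"123": {"title": "Python", "extract": "Python is a language."}}}}
text = parse_extract(sample)
```

The same fetch/parse separation applies to the RSS, Reddit, and Twitter material, and it keeps the ethics-relevant step (what you actually request) explicit and auditable.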
Attention is all you need
Aug 5, 2021
Publication: Vaswani et al. Attention Is All You Need. NeurIPS, 2017
Task: Neural Machine Translation (e.g., German-English)
Dataset: Multi30k
Libraries: PyTorch, NLTK, Spacy, torchtext
Learning objectives:
- Build a transformer model for neural machine translation
- Train the model using proposed label smoothing loss and learning rate scheduler
- Use the trained model to infer likely translations using:
- Greedy Decoding
- Beam Search
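Greedy decoding and beam search differ only in how many partial hypotheses are kept per step: greedy is beam search with width one. A model-agnostic sketch over a toy `next_log_probs` function (a stand-in for the trained transformer decoder, with a made-up probability table):

```python
import math

def beam_search(next_log_probs, bos, eos, beam_width=2, max_len=5):
    """Keep the `beam_width` best partial sequences by total log-probability."""
    beams = [([bos], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == eos:              # finished hypotheses carry over unchanged
                candidates.append((seq, score))
                continue
            for tok, lp in next_log_probs(seq).items():
                candidates.append((seq + [tok], score + lp))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0]

# Toy "model": token 0 is BOS, token 3 is EOS.
def toy_model(seq):
    table = {0: {1: math.log(0.6), 2: math.log(0.4)},
             1: {3: math.log(0.9), 2: math.log(0.1)},
             2: {3: math.log(0.5), 1: math.log(0.5)}}
    return table[seq[-1]]

best_seq, best_score = beam_search(toy_model, bos=0, eos=3)
```

Setting `beam_width=1` recovers greedy decoding; wider beams can find sequences whose first token is not the locally most probable one.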