Exploring Simulacra: A Tutorial Series on Simulacra’s Generative Agents (March 31, 2024)
Topics: Practical
Publication: Park et al. "Generative Agents: Interactive Simulacra of Human Behavior." UIST, 2023
Task: Creating and understanding the generative agents proposed in Simulacra
Libraries: openai
Series Overview: This tutorial series spans four notebooks, each designed to progressively build your understanding of, and skills in, simulating sociological phenomena with generative agents. See the accompanying blog post, which breaks down the key ideas behind building sociological simulations in an accessible way.
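To make the idea concrete, here is a toy sketch of the core loop behind a Simulacra-style agent: a memory stream of observations that gets condensed into a prompt for an LLM to decide the next action. The class and method names are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Toy generative agent: accumulate observations, then build an LLM prompt.

    Illustrative only; the paper adds retrieval scoring, reflection, and planning.
    """
    name: str
    memory: list = field(default_factory=list)

    def observe(self, event: str) -> None:
        # Append a new observation to the memory stream.
        self.memory.append(event)

    def build_prompt(self, k: int = 3) -> str:
        # Use only the k most recent memories as context for the LLM call.
        recent = "; ".join(self.memory[-k:])
        return f"{self.name} remembers: {recent}. What does {self.name} do next?"

alice = Agent("Alice")
alice.observe("woke up at 7am")
alice.observe("saw Bob at the cafe")
prompt = alice.build_prompt()
# prompt would then be sent to an LLM (e.g., via the openai library)
```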
Prior-Fitted Networks (Feb 5, 2024)
Topics: Practical
[ Notebook (Soln.) ]
Publication: Müller et al. "Transformers Can Do Bayesian Inference." ICLR, 2022
Task: Using Transformers to estimate posterior predictive distributions
Libraries: PyTorch
Learning objectives:
Data Generating Priors for Prior-Fitted Networks (Feb 5, 2024)
Topics: Practical
[ Notebook (Soln.) ]
Publications: Müller et al. "Transformers Can Do Bayesian Inference." ICLR, 2022; Hollmann et al. "TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second." ICLR, 2023
Task: Generating data according to different priors for PFNs
Libraries: PyTorch
Learning objectives: Learn how to generate synthetic data based on specified priors and use it to train neural networks that approximate Bayesian inference.
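The idea of a data-generating prior can be sketched in a few lines: repeatedly sample a latent "task" from a prior, then sample a dataset conditioned on it. The minimal example below uses a Bayesian linear-regression prior in NumPy; the papers use much richer priors (GPs, BNNs, structural causal models), so this is an assumption-laden illustration, not their code.

```python
import numpy as np

def sample_prior_dataset(n_points=20, n_features=3, noise=0.1, rng=None):
    """Draw one synthetic dataset from a simple linear-regression prior:
    w ~ N(0, I), x ~ N(0, I), y = x @ w + eps.

    PFN training repeats this sampling to produce millions of small
    datasets on which the Transformer learns to do Bayesian prediction.
    """
    rng = rng or np.random.default_rng()
    w = rng.normal(size=n_features)              # latent task drawn from the prior
    x = rng.normal(size=(n_points, n_features))  # input features
    y = x @ w + noise * rng.normal(size=n_points)
    return x, y

x, y = sample_prior_dataset(rng=np.random.default_rng(0))
# x has shape (20, 3); y has shape (20,)
```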
LLMTime - Zero-shot prompting LLMs for time series forecasting (Nov 12, 2023)
Topics: Practical
[ Notebook (DIY) ] [ Notebook (Soln.) ]
Publication: Gruver et al. "Large Language Models Are Zero-Shot Time Series Forecasters." NeurIPS, 2023
Task: Weather forecasting using LLMs
Libraries: openai, tiktoken, jax
Learning objectives: Explore zero-shot prompting with Large Language Models (LLMs) for time series forecasting. In this tutorial, we aim to:
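The core trick in LLMTime is serializing numbers into text so a pretrained LLM can continue the sequence, then parsing the continuation back into values. Here is a minimal sketch of that round trip; the actual paper uses a more careful digit-level encoding tuned to the model's tokenizer, so treat the fixed-precision formatting below as a simplifying assumption.

```python
def encode_series(values, precision=2):
    """Render a numeric series as a comma-separated string for LLM prompting.

    Fixed-precision formatting keeps digit patterns regular, which helps
    the model continue the sequence token-by-token.
    """
    return ", ".join(f"{v:.{precision}f}" for v in values)

def decode_series(text):
    """Parse an LLM's comma-separated continuation back into floats."""
    return [float(tok) for tok in text.split(",") if tok.strip()]

prompt = encode_series([0.12, 0.25, 0.5, 1.0])
# prompt == "0.12, 0.25, 0.50, 1.00"
# This string is sent to the LLM; its completion is fed to decode_series.
```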
Critical Exploration of Transformer Models (Nov 7, 2023)
Topics: Practical
[ Notebook (DIY) ] [ Notebook (Soln.) ]
Learning objectives: Delve into the inner workings of transformer models beyond basic applications
Key Areas:
Molecule Attention Transformer (Apr 21, 2023)
Topics: Practical
[ Notebook (Soln.) ]
Publication: Maziarka et al. "Molecule Attention Transformer"
Task: Classification task to predict blood-brain barrier permeability (BBBP)
Dataset: BBBP
Libraries: PyTorch, DeepChem, RDKit
Learning objectives:
Accessing Research Data for Social Science [Oxford Internet Institute, MT 2022] (September 2022)
Topics: Practical
[ Notebook (DIY) ]
DeepNote: DIY notebooks (hosted Jupyter notebook service)
GitHub: Repository to work on your local machine
Programming Language: Python
Libraries: Pandas, feedparser, newscatcherapi, psaw, requests, twarc (Twitter API), requests-html
Learning objectives:
Attention is all you need (Aug 5, 2021)
Topics: Practical
[ Notebook (DIY) ] [ Notebook (Soln.) ]
Publication: Vaswani et al. "Attention Is All You Need." NeurIPS, 2017
Task: Neural Machine Translation (e.g., German-English)
Dataset: Multi30k
Libraries: PyTorch, NLTK, Spacy, torchtext
Learning objectives:
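The building block at the heart of this tutorial is scaled dot-product attention, softmax(QKᵀ/√d_k)V from Vaswani et al. A minimal NumPy rendering of that formula is shown below for clarity; the tutorial itself implements it in PyTorch.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)        # (n_queries, n_keys) similarity logits
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                     # weighted sum of value vectors

# Tiny example: 2 queries attending over 3 key/value pairs, d_k = 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
# out has shape (2, 4): one attended value vector per query
```

Multi-head attention repeats this computation in parallel over learned projections of Q, K, and V, then concatenates the results.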