Economics Transformer: Model Architecture
Build Architecture for the AI Legislator
This project is a sub-project within a larger partnership with Humanity Unleashed to build an AI Legislator. The partnership aims to develop a comprehensive AI-driven framework for understanding, predicting, and proposing policies using multivariate time series and legislative data. It consists of several teams working on different aspects:
- curating large datasets from US law and time-series data,
- developing value elicitation algorithms for understanding user values and proposing policies,
- building user-friendly frontends for public policy and legislation analysis,
- pretraining large foundation models for time-series prediction,
- designing specialized model architectures, and
- establishing scaling laws for time-series models.
The project seeks to leverage AI and deep learning to bring transparency, efficiency, and effectiveness to policymaking and public policy analysis, ultimately creating an ecosystem where both citizens and legislators can interact with data-driven insights and AI-generated recommendations.
We would like to efficiently train and fine-tune an existing LLM for multivariate time-series prediction. How can we achieve the best possible forecasting accuracy while also enabling joint language + time-series (multimodal) capabilities, uncertainty quantification, and counterfactual generation? Possible directions to explore include patch encoders, LoRA, and VQ-VAE; a rough sketch of how some of these pieces could fit together is given below.
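The following is a minimal, self-contained PyTorch sketch of one possible arrangement, not a committed design: a patch encoder maps multivariate time-series patches into the embedding space of a frozen backbone (a small nn.TransformerEncoder stands in for the pretrained LLM), LoRA-style low-rank adapters supply the trainable parameters, and a quantile head provides a simple form of uncertainty quantification. All class names and hyperparameters here (PatchEncoder, LoRALinear, QuantileHead, patch_len, rank, the quantile levels) are illustrative assumptions rather than an existing implementation.

```python
# Minimal sketch, assuming plain PyTorch; PatchEncoder, LoRALinear and
# QuantileHead are hypothetical names, not part of any existing codebase.
import torch
import torch.nn as nn


class PatchEncoder(nn.Module):
    """Split a multivariate series into fixed-length patches and project each
    patch into the backbone's embedding space, so patches act like tokens."""

    def __init__(self, n_vars: int, patch_len: int, d_model: int):
        super().__init__()
        self.patch_len = patch_len
        self.proj = nn.Linear(n_vars * patch_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_vars) -> (batch, n_patches, d_model)
        b, t, v = x.shape
        n = t // self.patch_len
        x = x[:, : n * self.patch_len].reshape(b, n, self.patch_len * v)
        return self.proj(x)


class LoRALinear(nn.Module):
    """Wrap a frozen linear layer with a trainable low-rank (LoRA-style) update."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # pretrained weights stay frozen
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)   # adapter starts as a no-op
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))


class QuantileHead(nn.Module):
    """Predict several quantiles per variable for simple uncertainty estimates."""

    def __init__(self, d_model: int, n_vars: int, quantiles=(0.1, 0.5, 0.9)):
        super().__init__()
        self.n_q = len(quantiles)
        self.out = nn.Linear(d_model, n_vars * self.n_q)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, n_patches, d_model) -> (batch, n_patches, n_vars, n_quantiles)
        b, p, _ = h.shape
        return self.out(h).reshape(b, p, -1, self.n_q)


if __name__ == "__main__":
    d_model, n_vars, patch_len = 64, 4, 16

    # A small frozen Transformer stands in for the pretrained LLM backbone.
    backbone = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
        num_layers=2,
    )
    for p in backbone.parameters():
        p.requires_grad = False

    # Attach LoRA adapters to the frozen feed-forward projections.
    for layer in backbone.layers:
        layer.linear1 = LoRALinear(layer.linear1)
        layer.linear2 = LoRALinear(layer.linear2)

    encoder = PatchEncoder(n_vars, patch_len, d_model)
    head = QuantileHead(d_model, n_vars)

    series = torch.randn(8, 128, n_vars)    # (batch, time, variables)
    tokens = encoder(series)                # (8, 8, 64) patch "tokens"
    hidden = backbone(tokens)               # frozen backbone + trainable LoRA adapters
    preds = head(hidden)                    # (8, 8, 4, 3) quantile forecasts
    print(preds.shape)
```

A VQ-VAE variant (not shown) would replace the continuous projection in PatchEncoder with a learned codebook, turning each patch into a discrete token the LLM can model alongside text, which also makes autoregressive sampling of counterfactual trajectories more natural.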