Starred repositories
International Trade Network Analysis in R. This package can clean and transform international trade data from WITS (http://wits.worldbank.org/) into an international trade network and undertake desc…
This repository contains the work of a Bocconi AI & Neuroscience Association team tasked with building a knowledge graph of the English Wikipedia.
A repository for downloading UK goods export data from the ONS and creating a simple Streamlit Visualisation dashboard.
A sequential encoder-decoder implementation of neural machine translation using Keras
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Classifying and predicting fraudulent transactions using BankSim data
This repository contains code to download UN Comtrade data and prepare it for panel processing.
Notebooks using the Hugging Face libraries 🤗
Contains notebooks on topic modeling and on Spark and pandas implementations
NVIDIA Generative AI reference workflows optimized for accelerated infrastructure and microservice architecture.
An open database of international sanctions data, persons of interest and politically exposed persons
🤖 An authorial set of fundamental Python recipes on Machine Learning and Artificial Intelligence.
✨ An authorial set of fundamental Python recipes on Data Science and Analytics.
[AAAI 2022] Seq2Pat: Sequence-to-Pattern Generation Library
Jupyter book on unsupervised machine learning to detect patterns in illicit trade data
lvwerra / autogluon
Forked from autogluon/autogluon. AutoGluon: AutoML for Image, Text, Time Series, and Tabular Data
Create real-time plots in Jupyter Notebooks.
Train transformer language models with reinforcement learning.
Up-to-date version of labs for ISLP
Automate Excel with Python
Repository for useful code for data analysis and visualization
Code to accompany an EMNLP paper on nonstandard word dissemination online.
A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.