A pipeline for Extracting, Loading, and Transforming business data
A fully dockerized ELT pipeline project, using PostgreSQL, dbt, Apache Airflow, and Redash.
Explore the docs »
This project uses a docker-compose file to run a fully dockerized ELT pipeline: PostgreSQL for data storage, Airflow for automation and orchestration, dbt for data transformation, and a Redash dashboard connected to the PostgreSQL database.
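The services above are typically declared together in a single `docker-compose.yml`. A minimal sketch of such a file follows; the service names, images, and port mappings here are illustrative assumptions, not the project's actual file:

```yaml
version: "3.8"
services:
  postgres:            # data warehouse for the ELT pipeline
    image: postgres:13
    environment:
      POSTGRES_USER: root
      POSTGRES_PASSWORD: pssd
  airflow:             # orchestrates the load and dbt DAGs
    image: apache/airflow:2.3.0
    ports:
      - "8000:8080"    # Airflow UI exposed on localhost:8000
    depends_on:
      - postgres
  redash:              # dashboarding, connected to postgres
    image: redash/redash:latest
    ports:
      - "5000:5000"
  adminer:             # lightweight DB admin UI
    image: adminer
    ports:
      - "8080:8080"
```

Keeping every service in one compose file is what lets a single `docker-compose up` bring the whole pipeline online.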
Tech Stack used in this project
Make sure you have the following installed on your local machine:
- Docker
- Docker Compose
- Clone the repo
```bash
git clone https://github.com/tutorialcreation/trafficELT.git
```
- Build and run the containers
```bash
docker-compose build
docker-compose up
```
- Open the Airflow web UI
Navigate to `http://localhost:8000/` in your browser, then activate (unpause) and trigger the `dbt_load_dag` and `dbt_dbt_dag` DAGs.
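If you prefer the command line over the web UI, the same DAGs can be unpaused and triggered through the Airflow CLI inside the running container. This is a sketch against the live stack: the service name `airflow` and the Airflow 2.x CLI syntax are assumptions, so check your `docker-compose.yml` and Airflow version first.

```bash
# Unpause and trigger both DAGs via the Airflow CLI.
# "airflow" here is the assumed compose service name.
docker-compose exec airflow airflow dags unpause dbt_load_dag
docker-compose exec airflow airflow dags trigger dbt_load_dag
docker-compose exec airflow airflow dags unpause dbt_dbt_dag
docker-compose exec airflow airflow dags trigger dbt_dbt_dag
```

These commands require the stack from `docker-compose up` to already be running, so they are not verifiable outside the deployed environment.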
- Access the Redash dashboard
Navigate to `http://localhost:5000/` in your browser.
- Access your PostgreSQL database using Adminer
Navigate to `http://localhost:8080/` in your browser, choose PostgreSQL as the database system, and log in with username `root` and password `pssd`.
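You can also inspect the database directly with `psql` inside the running container, using the same credentials as Adminer. The service name `postgres` and the target database name are assumptions; adjust them to match your `docker-compose.yml`:

```bash
# List the tables loaded by the pipeline.
# "postgres" is the assumed compose service name and database name.
docker-compose exec postgres psql -U root -d postgres -c '\dt'
```

As with the Airflow commands, this only works against the running stack started by `docker-compose up`.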
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See `LICENSE` for more information.
Martin Luther - @email - tutorialcreation81@gmail.com