do Apache Spark, PySpark, Hive, Airflow, Kafka, pandas, ETL
About this Service
Big Data Engineer with 10+ years of experience in Hadoop, Spark (Scala and PySpark), Hive, Kafka, Databricks, and cloud platforms such as AWS, GCP, and Azure. I have developed and scheduled highly efficient production ETL pipelines handling terabytes of data.
I also have software engineering experience with SQL, REST APIs, and Apache Airflow.
If you're looking for someone to build a pipeline from scratch, or to enhance and tune the performance of existing pipelines, you're just a ping away from getting it done.
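To illustrate the kind of pipeline work on offer, here is a minimal extract-transform-load sketch using pandas (one of the listed tools). The column names, sample data, and aggregation are hypothetical examples chosen for this sketch, not taken from any real client project.

```python
import io

import pandas as pd


def run_etl(raw_csv: str) -> pd.DataFrame:
    """Minimal ETL: parse raw CSV, clean it, and aggregate per country."""
    # Extract: read the raw CSV text into a DataFrame
    df = pd.read_csv(io.StringIO(raw_csv))
    # Transform: drop rows with missing amounts, normalise country codes
    df = df.dropna(subset=["amount"])
    df["country"] = df["country"].str.upper()
    # Load: a production job would write to a warehouse (Hive, Databricks,
    # etc.); here we simply return the aggregated result
    return df.groupby("country", as_index=False)["amount"].sum()


# Example input with mixed-case codes and a missing value
raw = "country,amount\nno,10\nNO,5\nse,\nse,7\n"
result = run_etl(raw)
```

In a real engagement the same extract/transform/load shape scales up to PySpark DataFrames scheduled by Airflow; the pandas version just keeps the sketch self-contained.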
Service Features
About the Seller
From
Norway
Member Since
Aug 2025
Skills:
Compare packages

| Package | Basic ($20.00) | Standard ($60.00) | Premium ($140.00) |
|---|---|---|---|
| Revisions | 1 | 1 | 1 |
| Delivery Time | 1 Day | 1 Day | 1 Day |
| Project task | | | |
| Software files | | | |
| Total | $20.00 | $60.00 | $140.00 |