Data Engineer
About On The Spot
On The Spot Development is a software development company that builds R&D teams for heavily invested startups and tech companies from the UK, EU, and Israel.
This job opportunity is for a position within one of our teams. You will be directly employed by On The Spot and will collaborate closely with our partner companies, contributing to their product's journey.
About the team
Haptiq is a global technology company dedicated to delivering best-in-class software and digital solutions engineered to drive profitable and scalable growth for our clients. Operating as the nexus of portfolio management, growth, and optimization, Haptiq offers SaaS platforms fueled by cutting-edge AI, a comprehensive end-to-end data management platform, and a suite of strategic services.
Among our flagship offerings is Olympus FinTech, a dynamic solution empowering private equity, debt, and CLO managers with customizable and streamlined workflows, robust data management, and sophisticated reporting capabilities. Pantheon by Haptiq is a platform that helps create interactive, personalized virtual experiences. It uses AI to make content more engaging and customizable, making it useful for entertainment, marketing, and education. Users can easily build and manage these immersive experiences in real-time.
About the role
We are seeking a motivated and self-driven Data Engineer to join our dynamic data team. You will play a key role in designing, building, and maintaining ETL infrastructure and data pipelines, ensuring data quality and scalability.
Responsibilities
- Develop and optimize ETL pipelines for efficient data ingestion, transformation, and loading
- Design, build, and deploy Python scripts and ETL processes using Azure Data Factory (ADF)
- Work with structured, semi-structured, and unstructured data across diverse sources
- Implement dimensional data modeling and data warehousing concepts (OLTP, OLAP, Facts, Dimensions)
- Ensure best practices in data management, security, and governance within cloud environments
- Troubleshoot and optimize ADF jobs and ETL workflows for performance and scalability
- Perform code reviews, manage version control (GitHub), and deploy via CI/CD pipelines
- Collaborate on cloud-based architectures and enterprise-wide data migrations
Requirements
- 3+ years of Python development experience
- 5+ years of SQL Server development and experience working with large datasets
- Proficiency in developing and deploying ETL pipelines using Databricks and PySpark
- Expertise in cloud data warehouses like Synapse, Redshift, Snowflake, or ADF
- Knowledge of event-based/streaming data ingestion and processing
- Strong skills in SQL, Python, data modeling, and dimensional design
- Hands-on experience with cloud architectures and messaging systems
- Familiarity with CI/CD workflows and deployment processes
- Cloud certifications are a plus
- Experience with Airflow, AWS Lambda, Glue, and Step Functions is an advantage
Benefits
- Work in a highly professional team with an informal and friendly atmosphere
- Paid vacation — 20 business days per year, 100% sick leave payment
- Equipment provision
- Partially compensated educational costs (for courses, certifications, professional events, etc.)
- Legal and accounting support in Poland
- Ability to work from our comfortable office in Warsaw at Prosta 51
- 5 sick days per year
- Medical insurance (after the end of the probationary period)
- Flexible working hours – we care about you and your output
- English and Polish classes 2 times a week (online)
- Bright and memorable corporate life: corporate parties, gifts to employees on significant dates
Join us
If you'd like to work from Poland, having a PBH visa/Karta pobytu/Paszport Polski is obligatory to be considered for this position. Thank you!
Contribute to our growth
Know someone perfect for this role? Let us know about them. We have a referral program to recognize your support.