Senior Big Data Engineer
About On The Spot
On The Spot Development is a software development company that builds R&D teams for heavily invested startups and tech companies from the UK, EU, and Israel.
This job opportunity is for a position within one of our teams. You will be directly employed by On The Spot and will collaborate closely with our partner companies, contributing to their product's journey.
About the team
As part of the Orca cloud security suite, we are developing a product that processes massive amounts of real-time data. Our team drives the Cloud Detection and Response (CDR) offering, managing large-scale API log ingestion, comprehensive stream and batch log analysis, and high-performance real-time services. By connecting software engineering, big data processing, cloud infrastructure, and security, we deliver scalable and reliable systems.
About the role
We are looking for a highly skilled Senior Big Data Engineer to join our team and lead the development of scalable, efficient, and reliable data pipelines for big data processing. You will be responsible for designing, implementing, and optimizing data workflows, ensuring data quality, and enabling real-time security features. If you have a passion for large-scale data processing and a strong background in building ETL/ELT pipelines, we’d love to hear from you!
Responsibilities
- Design, build, and maintain scalable data pipelines for processing large-scale structured and unstructured datasets
- Develop and optimize ETL/ELT workflows to ensure high performance and efficiency
- Integrate data from various sources, including logs, events, databases, APIs, and streams
- Ensure data quality and integrity through validation, monitoring, and logging
- Implement both batch and real-time processing using modern big data technologies (e.g., Spark, Flink, Kafka)
- Manage and optimize cloud-based data storage, including warehouses and lakehouses
- Automate pipeline deployment with CI/CD practices and Infrastructure as Code
- Stay up-to-date with industry trends and best practices in data engineering
Required skills & experience
- 5+ years of experience in software/data engineering, with a strong focus on data pipelines, streaming, and processing over the last 3 years
- Strong hands-on experience with Python, Scala or Java
- Solid knowledge of ETL/ELT frameworks (e.g. Apache Airflow, dbt, or Luigi)
- Proven experience with stream processing frameworks such as Kafka Streams, Apache Flink, or Spark Streaming
- Proficiency with big data technologies: Apache Spark, Flink, Kafka, or Hadoop
- Experience with cloud platforms (AWS, GCP or Azure) and related services (e.g. S3, Athena, Redshift, BigQuery, Databricks)
- Strong command of SQL, NoSQL, and data modeling principles
- Experience working with data lakehouse architectures (e.g. Iceberg, Delta Lake)
Nice to have
- Experience with modern data warehouses (e.g. SingleStore, Snowflake, BigQuery)
- Familiarity with containerization and orchestration tools (Docker, Kubernetes)
- Exposure to CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation)
Benefits
- Work in a highly professional team with an informal, friendly atmosphere
- Paid vacation — 20 business days per year, 100% sick leave payment
- Equipment provision
- Partially compensated educational costs (for courses, certifications, professional events, etc.)
- Legal and Accounting support in Poland
- Ability to work from our comfortable office in Warsaw at Prosta 51
- 5 sick days per year
- Medical insurance (after the end of the probationary period)
- Flexible working hours: we care about you and your output
- English and Polish classes 2 times a week (online)
- Bright and memorable corporate life: corporate parties and gifts for employees on significant dates
Join us
To work from Poland, you must hold Polish citizenship, a karta pobytu (residence card), or a D-type Polish work visa to be considered for this position. Thank you!
Contribute to our growth
Know someone perfect for this role? Refer them to us: our referral program recognizes your support.