Data Platform Engineer (Remote)
17,000-20,400 PLN monthly (employment contract)
Infolet
What will you be doing?
You will:
- Design and implement a cloud-based data lakehouse platform ingesting engineering and security telemetry from multiple sources
- Build scalable Bronze/Silver/Gold data layers, ensuring reliable ingestion, transformation, and consumption patterns
- Develop and maintain streaming and batch pipelines using technologies such as Kafka, Flink, Spark Streaming, or Kinesis
- Model complex relationships using graph databases (Neo4j, Neptune, TigerGraph) to support advanced analytics and AI-driven use cases
- Collaborate with data scientists, platform engineers, and security teams to deliver high‑quality, production-grade data solutions
- Implement robust data transformation frameworks (dbt, Databricks SQL, Dataform, or custom SQL/Python pipelines)
- Build and maintain orchestration workflows using Airflow, Prefect, Dagster, Step Functions, or similar tools
- Define and enforce data modeling standards, including dimensional modeling, SCD2, and normalization strategies
- Use Infrastructure as Code (Terraform, CloudFormation, Pulumi) to provision and manage cloud resources
- Ensure observability and reliability of pipelines through monitoring, alerting, and data quality validation
- Work autonomously in a fast‑moving environment, proposing solutions even when requirements are incomplete
Who are we looking for?
Must have:
- 10+ years of experience in data engineering (at least 2 years building lakehouse architectures)
- Proven experience delivering production-grade data platforms in cloud environments
- Strong hands-on expertise with AWS (S3/Blob storage, RDS/SQL DB, managed Kafka, serverless compute)
- Expert-level SQL and deep understanding of data modeling (dimensional, SCD2, normalization/denormalization)
- Proficiency with Python or Scala (data processing and automation)
- Experience with stream processing (Kafka, Flink, Spark Streaming, Kinesis)
- Hands-on experience with dbt, Databricks SQL, Dataform, or similar transformation frameworks
- Strong orchestration skills using Airflow, Prefect, Dagster, or equivalent
- Solid understanding of IaC (Terraform, CloudFormation, Pulumi)
- Ability to explain technical trade-offs to non-technical stakeholders
- Strong problem-solving skills
- Fluent English (business and technical)
- Experience with search technologies (OpenSearch, Elasticsearch, Solr)
- Advanced graph skills: Cypher, SPARQL, Gremlin, or graph ETL pipelines
- Familiarity with data quality frameworks (Great Expectations, dbt tests)
- Experience with real-time event processing (Flink, Spark Streaming, Lambda-based pipelines)
- Knowledge of observability tools (Grafana, Datadog, CloudWatch)
Languages:
- Polish
- English
What terms and benefits will you receive?
- Employment contract: 17,000-20,400 PLN monthly, flexible working hours (100%)
- B2B contract: 140-170 PLN hourly, flexible working hours (100%)
- Remote work: fully remote possible
- Medical package, insurance, sports package
- Relocation package
Where will you work?
Miłkowskiego street, Kraków, or fully remote
Who are we?
For 20 years we have supported IT leaders by providing technology, experts, and full operational support, including residence and work permit assistance for international IT specialists.
For our projects we are looking for Java, JavaScript, embedded C, C++, and PHP specialists, mobile developers, software testers, network and systems administrators, and many others.