Data Platform Engineer

23,500-28,600 PLN monthly (B2B)

Infolet

What will you be doing?

You will:
  • Design and implement a cloud-based data lakehouse platform ingesting engineering and security telemetry from multiple sources.
  • Build scalable Bronze/Silver/Gold data layers, ensuring reliable ingestion, transformation, and consumption patterns.
  • Develop and maintain streaming and batch pipelines using technologies such as Kafka, Flink, Spark Streaming, or Kinesis.
  • Model complex relationships using graph databases (Neo4j, Neptune, TigerGraph) to support advanced analytics and AI-driven use cases.
  • Collaborate with data scientists, platform engineers, and security teams to deliver high‑quality, production-grade data solutions.
  • Implement robust data transformation frameworks (dbt, Databricks SQL, Dataform, or custom SQL/Python pipelines).
  • Build and maintain orchestration workflows using Airflow, Prefect, Dagster, Step Functions, or similar tools.
  • Define and enforce data modeling standards, including dimensional modeling, SCD2, and normalization strategies (the SCD2 pattern is illustrated in the sketch after this list).
  • Use Infrastructure as Code (Terraform, CloudFormation, Pulumi) to provision and manage cloud resources.
  • Ensure observability and reliability of pipelines through monitoring, alerting, and data quality validation.
  • Work autonomously in a fast‑moving environment, proposing solutions even when requirements are incomplete.
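
As a purely illustrative aside (not part of the role description): the SCD2 pattern referenced above keeps history by closing out the current dimension row and appending a new versioned row, rather than overwriting attributes in place. A minimal sketch in plain Python, with all names (DimRow, apply_scd2, the city attribute) invented for illustration:

from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical sketch of a type-2 slowly changing dimension (SCD2):
# a changed attribute closes the current row and appends a new version.
@dataclass
class DimRow:
    customer_id: int
    city: str                        # the tracked attribute
    valid_from: date
    valid_to: Optional[date] = None  # None = open-ended validity
    is_current: bool = True

def apply_scd2(dim: list, customer_id: int, new_city: str, as_of: date) -> None:
    """Close the current row for the key (if changed) and append a new version."""
    current = next((r for r in dim if r.customer_id == customer_id and r.is_current), None)
    if current is not None and current.city == new_city:
        return  # no change, nothing to version
    if current is not None:
        current.valid_to = as_of
        current.is_current = False
    dim.append(DimRow(customer_id, new_city, valid_from=as_of))

dim = [DimRow(1, "Warszawa", valid_from=date(2023, 1, 1))]
apply_scd2(dim, 1, "Kraków", as_of=date(2024, 6, 1))
for row in dim:
    print(row)
# The first row is closed (valid_to set, is_current=False); the second
# carries the new value with an open-ended validity interval.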

Who are we looking for?

Must have:
  • 8+ years of experience in data engineering, including at least 2 years building lakehouse architectures.
  • Proven experience delivering production-grade data platforms in cloud environments.
  • Strong hands-on expertise with AWS or equivalent cloud services (object storage such as S3/Blob Storage, managed relational databases such as RDS/SQL DB, managed Kafka, serverless compute).
  • Expert-level SQL and deep understanding of data modeling (dimensional, SCD2, normalization/denormalization).
  • Proficiency with Python or Scala for data processing and automation.
  • Experience with stream processing (Kafka, Flink, Spark Streaming, Kinesis).
  • Hands-on experience with dbt, Databricks SQL, Dataform, or similar transformation frameworks.
  • Strong orchestration skills using Airflow, Prefect, Dagster, or equivalent.
  • Solid understanding of IaC (Terraform, CloudFormation, Pulumi).
  • Ability to explain technical trade-offs (cost, performance, complexity) to non-technical stakeholders.
  • Strong problem-solving skills: debugging data quality issues, optimizing queries, resolving schema conflicts.
  • Fluent English (business and technical communication).
Nice to have:
  • Experience with search technologies (OpenSearch, Elasticsearch, Solr).
  • Advanced graph skills: Cypher, SPARQL, Gremlin, or graph ETL pipelines.
  • Familiarity with data quality frameworks (Great Expectations, dbt tests); a sketch of this kind of rule-based check follows this list.
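
For illustration only, a minimal sketch of the rule-based validation such frameworks automate; all names here (check_not_null, check_unique, records) are invented and do not describe any actual codebase for this role:

# Hypothetical sketch of rule-based data quality checks, the kind of
# validation that Great Expectations or dbt tests automate at scale.
def check_not_null(rows, column):
    """Return rows where the column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return rows whose column value occurs more than once."""
    seen, dupes = set(), []
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.append(r)
        seen.add(value)
    return dupes

records = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 1, "amount": 250.0},  # duplicate key
    {"order_id": 2, "amount": None},   # missing amount
]

failures = {
    "order_id not null": check_not_null(records, "order_id"),
    "order_id unique": check_unique(records, "order_id"),
    "amount not null": check_not_null(records, "amount"),
}
for rule, bad in failures.items():
    print(rule, "->", "OK" if not bad else f"{len(bad)} failing row(s)")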

What do we require?

Languages:
  • Polish
  • English

What terms and benefits will you receive?

  • 140-170 PLN hourly (B2B)
  • B2B: flexible working hours (100%)
  • 16,700-18,600 PLN monthly (employment contract)
  • Employment contract: flexible working hours (100%)
  • Remote work: hybrid
  • Internal training
  • Medical package, insurance, sports package
  • Coffee / tea
  • Bicycle parking
  • Relocation package

Where will you work?

Wołoska, Warsaw, or hybrid

Who are we?

For 20 years we have been supporting IT leaders by providing technology, experts, and full operational support, including legalization of residence and work for international IT specialists. For our projects we are looking for Java, JavaScript, embedded C, C++, and PHP specialists, mobile developers, software testers, network and system administrators, and many others.