Data Engineer (Remote)

PTT Consulting

20200-25200 PLN monthly (B2B)

What will you be doing?

Working hours: 12:00 PM – 8:00 PM (CEST)

Responsibilities

  • Design, build, and maintain production-grade data pipelines (ETL/ELT).
  • Develop scalable solutions for processing large datasets.
  • Model and structure data for analytics and reporting (fact/dimension models).
  • Optimize SQL queries and data storage for performance and cost efficiency.
  • Design and implement end-to-end data architectures (batch and near real-time).
  • Build and manage data workflows and orchestration pipelines (e.g., scheduling, dependencies, retries).
  • Ensure data quality, validation, and reliability across pipelines.
  • Implement monitoring, logging, and failure recovery mechanisms.
  • Work with cloud platforms to build secure and scalable data solutions.
  • Collaborate with analysts, data scientists, and other engineers to support data needs.
  • Apply software engineering best practices (version control, CI/CD, testing).
  • (Optional) Develop and maintain streaming / real-time data pipelines.

Who are we looking for?

Requirements

  • Strong proficiency in Python for data processing and pipeline development.
  • Advanced knowledge of SQL (CTEs, window functions, query optimization).
  • Experience with data modeling:
    • relational schemas (3NF).
    • dimensional modeling (star/snowflake).
  • Hands-on experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server, Oracle).
  • Experience with data warehousing platforms (e.g., Snowflake, BigQuery, Redshift, Synapse).
  • Understanding of ETL vs ELT, batch vs near real-time processing.
  • Experience building and managing data pipelines and orchestration tools (e.g., Airflow, Prefect, Dagster, Azure Data Factory).
  • Experience with at least one cloud platform (AWS, Azure, or GCP).
  • Knowledge of data quality, validation, and governance practices.
  • Familiarity with DevOps practices:
    • Git / version control.
    • CI/CD pipelines.
    • Infrastructure as Code (e.g., Terraform, CloudFormation, ARM).
  • Understanding of security best practices (e.g., PII handling, RBAC, encryption).

Nice to have:
  • Spark (PySpark/Spark SQL).
  • Hadoop basics.
  • Streaming (Kafka, Kinesis, Pub/Sub, Event Hubs).
  • Java/Scala.
  • Bash.
  • dbt.
  • BI (Power BI, Tableau, Looker).
  • APIs/SaaS integration.
  • FinOps (cost optimization).
  • Data governance/GRC.

What do we require?

Skills:

Languages:

  • English

What terms and benefits will you receive?

  • 20160-25200 PLN monthly (B2B)
  • B2B – fixed working hours (100%)
  • Remote work: fully remote possible
  • Medical package

Where will you work?

Pl. Bankowy 2, Warszawa, or remotely

Who are we? – PTT Consulting

The Data Engineer will work for a global leader in gaming, which delivers entertaining and responsible gaming experiences for players across all channels and regulated segments, from Gaming Machines and Lotteries to Sports Betting and Digital. Leveraging a wealth of compelling content, substantial investment in innovation, player insights, operational expertise, and leading-edge technology, the company's solutions deliver unrivaled gaming experiences that engage players and drive growth. The company has a well-established local presence and relationships with governments and regulators in more than 100 countries around the world, creating value by adhering to the highest standards of service, integrity, and responsibility.

Company website: PTT Consulting