20 200 – 23 500 PLN net/month (B2B, full-time)
14 400 – 16 800 PLN gross/month (employment contract, full-time)

Dataflow Engineer

Infolet
Kraków
Data Science
GCP, ETL, SQL, Databases
Senior
Polish, English
min. 5 years of experience

Who are we looking for?

Must have

  • 5–10 years of experience as a Data Engineer, Dataflow Engineer, or in a similar role working with large-scale data systems
  • Strong programming skills in Python and Java
  • Hands-on experience with data pipeline orchestration tools (e.g., Google Dataflow)
  • Solid experience with Google Cloud Platform, including BigQuery and Dataflow
  • Strong background in ETL frameworks, real-time data streaming, and data processing
  • Experience with SQL and NoSQL databases
  • Knowledge of data governance, data quality, and data security best practices
  • Strong problem-solving and troubleshooting skills
  • Fluent English and Polish

Nice to have

  • Experience with additional cloud services or multi-cloud environments
  • Familiarity with data formats such as JSON, Parquet, or similar
  • Experience building highly available, low-latency data pipelines
  • Exposure to on-call or out-of-hours support models
  • Strong communication skills and experience working with both technical and non-technical stakeholders

What will you be doing?

Project

The project is part of a Data Analytics and Engineering program focused on building, optimizing, and operating scalable data pipelines to support analytics, reporting, and data-driven decision-making across the organization.

You will

  • Design, build, and maintain efficient data pipelines for collecting, transforming, and storing data
  • Integrate data from various sources, including cloud platforms (Google Cloud), SQL/NoSQL databases, APIs, and external services
  • Optimize and troubleshoot existing pipelines to ensure high performance and reliability
  • Implement ETL processes to transform raw data into analytics-ready datasets
  • Collaborate with cross-functional teams (data engineering, operations) to understand data requirements and deliver solutions
  • Support data applications when required during weekends or non-office hours
  • Build scalable, automated workflows capable of processing large data volumes with low latency
  • Set up monitoring and alerting for data pipelines to minimize downtime
  • Create and maintain technical documentation for data flows and pipeline configurations
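The ETL work described above, turning raw source records into analytics-ready datasets, ultimately reduces to per-record transform steps. The snippet below is a toy illustration in pure Python with a hypothetical event schema (the posting specifies no concrete format), not part of the role description:

```python
import json
from datetime import datetime, timezone

def to_analytics_row(raw: str) -> dict:
    """Parse one raw JSON event and emit a cleaned, analytics-ready row.

    Hypothetical schema: {"user": str, "amount_cents": str, "ts": epoch seconds}.
    """
    event = json.loads(raw)
    return {
        # Normalize the identifier for consistent grouping downstream
        "user": event["user"].strip().lower(),
        # Convert minor units to PLN so reports share one unit
        "amount_pln": int(event["amount_cents"]) / 100,
        # Truncate the timestamp to a UTC calendar day for daily aggregates
        "day": datetime.fromtimestamp(int(event["ts"]), tz=timezone.utc)
                       .date().isoformat(),
    }

raw_events = ['{"user": " Alice ", "amount_cents": "2350", "ts": 1700000000}']
rows = [to_analytics_row(r) for r in raw_events]
print(rows[0])
```

In a real streaming pipeline such a function would typically run per element (for example as a `beam.Map` step in a Dataflow/Beam job), with malformed records routed to a dead-letter output rather than raising.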

What benefits will you receive?

Development

Internal training

Health

Medical package, insurance, sports package

Kitchen

Coffee / tea

Commute

Bicycle parking

Other

Relocation package

Where and how will you work?

Miłkowskiego, Kraków
Work mode: flexible working hours
Office hours: 7:00–20:00
Work model:
On-site
Hybrid
100% remote

Who are we?

Company size: 60+
For 20 years we have been supporting IT leaders by delivering technology, experts, and full operational support, including legalizing the residence and work of international IT specialists. For our projects we are looking for Java, JavaScript, embedded C, C++, and PHP specialists, mobile developers, software testers, network and system administrators, and many more.