Senior Data Engineer
Python Google Cloud Platform (GCP) Terraform Apache Airflow
Assignment description
We are looking for a senior data engineer with experience in platform and infrastructure development, including hands-on work with GCP, who can get up to speed quickly. Our team currently focuses on ingestion and loading of data using tools like Airflow. Other responsibilities include data storage, managing permissions and security inside the data platform, and compliance. Work from the office in Stockholm (Söder) three days a week.
Must haves
Experience:
- Building data platforms
- Automation
- Ingestion frameworks and orchestration

Languages:
- Python

Tooling:
- Airflow
- GCP
- Terraform
- Fivetran
- Snowplow
- Department: IT
- Role: Data Engineer
- Locations: Stockholm
- Remote status: Hybrid Remote
About Dynamx AB
Dynamx is a reinvented and reimagined consultancy, a new breed of firm: a digital transformation consultancy that operates at the intersection of People, Data, and Technology.
We at Dynamx AB specialize in Artificial Intelligence (AI) and Data Platforms. With a strong focus on providing comprehensive insights into complex data sets, our AI-driven data analysis service is designed to cater to the needs of business executives in Sweden. Our team of highly skilled AI and data experts is committed to delivering customized solutions that empower our clients to make informed decisions and drive their business forward.