Job details
Data Engineer
Start: 01.04.2025
Duration: Until the end of the year
Degree of employment: 100%
Zurich
We have an urgent requirement for a software engineer with excellent data skills who is comfortable working in Microsoft Azure and making changes across multiple areas of the solution, i.e. application code, data storage, and the Azure platform. The assignment is to implement a solution for data multi-tenancy and for seeding data from different source-system environments onto our data landscape, which runs on Azure Databricks.
Key Responsibilities
A typical day involves analysis, design, and implementation work using Python, PySpark, SQL, and the Azure platform. All work is tracked in Azure DevOps.
The project is part of our Finance Transformation. You will work closely on this with two team members in Zurich, with support from the larger team, whose members are based in various locations.
Overview of the department / team (team size, backgrounds, personalities …):
You would join a team working in the Finance Data & Analytics area of an international financial services firm. The job requires an in-person presence at least four days a week in our offices in Zurich. The team has an on-site presence in Zurich and is supported with team members in multiple other locations. The team has a mix of Python, PySpark, data engineering and Azure platform skills.
Current challenges:
Tenantization is an urgent topic, which is why we are posting this role. The current challenge is an urgent need combined with a short timescale for implementation.
The candidate must be pro-active and willing to get involved in the whole solution, not a specific part of it. The person must be able to make meaningful progress in an iterative manner.
Top 3 essential skills/experience-based requirements:
- At least 7 years of experience working as a software engineer in financial services firms, writing and operating production code that interacts with relational databases. This must include considerable experience of writing code, testing code via automation, refactoring, and measuring and optimizing performance
- Good knowledge of Python and the ability to be productive straight away – must have
- Very good database knowledge, including data modelling and SQL (DML and DDL) – must have
- Strong understanding of public cloud, preferably Azure
- High levels of pro-activity, drive, and practicality – must have
- Excellent communication skills in English, written and verbal
Desired Skills and Qualifications:
Azure Databricks, PySpark and Unity Catalog experience.
Candidate Value Proposition:
Attractive compensation for an external contractor
Opportunity to deliver an impactful end-to-end solution
Working with skilled individuals and the Azure platform incl. Azure Databricks