Mission details
Location

Rabat Technopolis (Hybrid)

Duration

6 months, renewable

Start date

ASAP, within one month at the latest

Company:

WeLink is an IT consulting and advisory firm founded in 2011, specializing in the placement and headhunting of IT freelancers.

WeLink is currently the No. 1 provider of freelance IT assignments in Morocco, with solid local references: CAPGEMINI, ATOS, IBM, SOFRECOM, CGI, GFI, Omnidata, SOPRA BANKING, HPS, UMANIS, SQLI, BDSI, WAFA ASSURANCE, DXC, ATTIJARI BANK, CFG BANK, MAJOREL, WEBHELP…
It also has various international references.

Since its founding, WeLink has worked on more than 600 long-term assignments (over 6 months), and its current team exceeds 150 freelance IT consultants on assignment.

Position

The data engineer is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. They play a crucial role in building and managing the data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise.
Responsibilities
1. Design, develop, and maintain robust data pipelines using Databricks and Apache Spark that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems (a minimal PySpark sketch follows this list).
2. Implement and manage advanced data products in our medallion architecture.
3. Implement and manage advanced data models using dimensional, relational, and Data Vault modeling techniques.
4. Integrate data from different sources, including databases, data warehouses, APIs, and external systems.
5. Ensure data consistency and integrity during the integration process, performing data validation and cleansing as needed.
6. Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques.
7. Optimize data pipelines and data processing workflows for performance, scalability, and efficiency.
8. Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance. Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data.
9. Manage Databricks Unity Catalog, including administrative activities (policies, security, access control, cost control); see the access-control sketch after this list.
10. Optimize and administer Databricks environments to ensure high performance and reliability.
11. Configure clusters and manage cluster policies in Databricks.
12. Build and maintain CI/CD pipelines (DevOps) in GitHub or Azure DevOps.
13. Collaborate with cross-functional teams to deliver comprehensive data solutions.
14. Ensure data quality, security, and governance across all data processes.
15. Communicate effectively with stakeholders to understand requirements and deliver actionable insights.
16. Operate within agile teams, contributing to continuous improvement in data engineering practices and processes.
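
As an illustration of responsibilities 1, 2, 6, and 8, here is a minimal PySpark sketch of a bronze-to-silver step in a medallion architecture on Databricks: it reads a bronze table, applies cleansing and a simple data quality check, and writes the curated result to the silver layer. All table and column names (bronze.raw_orders, silver.orders, order_id, amount, order_date) are hypothetical placeholders, not details from this posting.

```python
# Minimal bronze -> silver step in a medallion architecture (illustrative sketch).
# Table and column names are hypothetical; adapt them to the actual catalog/schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # predefined in Databricks notebooks

# Extract: read raw data from the bronze layer.
bronze = spark.read.table("bronze.raw_orders")

# Transform: deduplicate, normalize types, and drop rows missing the key.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("order_id").isNotNull())
)

# Data quality check: fail fast if any amount is negative.
bad_rows = silver.filter(F.col("amount") < 0).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed the non-negative amount check")

# Load: write the curated result to the silver layer as a Delta table.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```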
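For the Unity Catalog administration duties (items 9 to 11), access control is typically expressed as SQL GRANT statements run against the workspace; the sketch below issues a few from Python. The catalog, schema, and group names (main, silver, data-analysts) are hypothetical, and the statements assume a Databricks workspace with Unity Catalog enabled.

```python
# Illustrative Unity Catalog access-control sketch; all names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # predefined in Databricks notebooks

grants = [
    # Let the analyst group discover and use the catalog and schema...
    "GRANT USE CATALOG ON CATALOG main TO `data-analysts`",
    "GRANT USE SCHEMA ON SCHEMA main.silver TO `data-analysts`",
    # ...and read (but not modify) every table in the curated schema.
    "GRANT SELECT ON SCHEMA main.silver TO `data-analysts`",
]
for stmt in grants:
    spark.sql(stmt)
```

Granting at the schema level rather than per table keeps the policy surface small; tighter per-table grants follow the same pattern.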

Profile

1. Minimum of 5 years of experience in data engineering, with at least 2 years of hands-on experience with Databricks.
2. At least 5 years of work experience in data management disciplines, including data integration, modeling, optimization, and data quality, or other areas directly relevant to data engineering responsibilities and tasks.

Required technical skills:

1. Databricks Certifications (Associate, Professional, Administration).
2. Proficiency in modern data modeling techniques and data administration.
3. Strong knowledge of SQL, Python, and PySpark.
4. Experience with cloud platforms such as AWS and Azure, including strong familiarity with AWS IAM and Azure AD security mechanisms.
5. Expert problem-solving and debugging skills: able to trace issues to their source in unfamiliar code or systems, and to recognize and resolve recurring problems.
6. Strong communication skills and a proactive, “getting things done” mindset.
7. Experience working in agile teams and familiarity with agile methodologies.
8. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support AI, ML, and BI.
9. Proficiency in designing and implementing modern data architectures, including cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake, Databricks).
10. Experience with database technologies such as SQL and NoSQL databases, Oracle, Hadoop, or Teradata.

Required aptitudes / personal qualities:

1. Ability to collaborate within and across teams with varying levels of technical knowledge to support delivery and educate end users on data products.
2. Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals.
3. Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options.
4. Ability to translate among the languages used by executive, business, IT, and quantitative stakeholders.