Responsibilities
Data analytics and reporting: Analyze and understand business processes, data models, business logic, and modelling logic, and identify missing data.
Liaise with business contacts or business analysts to understand requirement specifications.
Identify data requirements and data elements, and ingest any missing ones into the data platform.
Design and implement dashboards and reports in Power BI that provide actionable information.
Follow through on issues identified during UAT until they are resolved.
Publish data and reports to production.
Microsoft Fabric/Azure Synapse: Develop database schemas, define relationships, and optimize performance based on the specific requirements of the data solution.
Develop data products using SQL and PySpark.
Implement data quality checks and processes to ensure data accuracy, consistency, and completeness.
Implement security measures to protect sensitive data and ensure compliance with relevant regulations and standards.
Optimize solutions for performance and scalability.
Identify and resolve performance bottlenecks, optimize SQL queries, and fine-tune data processing workflows.
Document data engineering processes, system architecture, and data flow diagrams for knowledge sharing and future reference.
Cooperate with cross-functional teams, including architects, data scientists, data analysts, and business stakeholders.
Act as a mentor and guide for colleagues to encourage the adoption of self-service analytics.
Provide technical guidance and training to team members on data management best practices and Microsoft data technologies.
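The data quality checks named in the responsibilities above can be illustrated with a minimal sketch. In this role the logic would typically be written in PySpark or SQL on Microsoft Fabric or Azure Synapse; for readability, the same rule logic is shown here in plain Python, and all field names and thresholds are illustrative assumptions, not a prescribed implementation:

```python
# Sketch of row-level data quality checks covering the three aspects
# mentioned above: completeness, accuracy, and consistency.
# Field names ("customer_id", "amount", "currency") are hypothetical.

def check_quality(rows):
    """Return a list of (row_index, issue) tuples for failing rows."""
    issues = []
    for i, row in enumerate(rows):
        # Completeness: required fields must be present and non-null.
        for field in ("customer_id", "amount", "currency"):
            if row.get(field) is None:
                issues.append((i, f"missing {field}"))
        # Accuracy: amounts must be non-negative numbers.
        amount = row.get("amount")
        if amount is not None and amount < 0:
            issues.append((i, "negative amount"))
        # Consistency: currency codes restricted to an allowed set.
        if row.get("currency") not in (None, "EUR", "USD", "CHF"):
            issues.append((i, f"unknown currency {row['currency']}"))
    return issues

rows = [
    {"customer_id": 1, "amount": 100.0, "currency": "EUR"},
    {"customer_id": None, "amount": -5.0, "currency": "XXX"},
]
print(check_quality(rows))
# → [(1, 'missing customer_id'), (1, 'negative amount'), (1, 'unknown currency XXX')]
```

In a Fabric/Synapse pipeline the same rules would usually run as PySpark DataFrame filters or SQL constraints, with failing rows routed to a quarantine table rather than printed.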
Experience:
5+ years of progressively responsible experience in data engineering with a focus on data modelling, and 3+ years in data engineering (Microsoft Fabric or Azure Synapse) and report development (Power BI).
Proven track record as a Data Engineer or in a similar role, and in Power BI development, including paginated (SSRS) reports.
Core expertise in the Microsoft data stack, in particular proficiency with Microsoft Fabric, Azure Data Factory, and Azure Synapse Analytics.
Experience with data lake and data lakehouse implementations (e.g., Microsoft Fabric, Azure Synapse, Databricks, Snowflake, Microsoft SQL Server, Apache Spark/Hadoop, or other similar big data or SQL platforms).
Experience with Data Vault and dimensional data modelling techniques.
Knowledge of Finance/Accounting logic in ERP, ideally with D365 F&O.
Familiarity with CI/CD pipelines for data workflows (e.g., using Azure DevOps).
Strong knowledge of Azure Cloud architecture and networking principles.
Strong strategic and conceptual thinking: sets a meaningful long-term vision and strategy, considers long-term potential, and proposes challenging strategic goals.
Propensity for embracing change and ambiguity: anticipates emerging conditions and demands, embraces widespread organisational change, navigates complex dynamics, and views uncertainty and disruption as opportunities.
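The dimensional data modelling experience listed above can be sketched with a toy star schema. This is a simplified illustration using the Python standard library's sqlite3 module; a real implementation would live in Fabric or Synapse, and all table and column names here are hypothetical:

```python
# Toy star schema: one fact table joined to dimension tables,
# queried with a typical grouped aggregation.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Two dimension tables and one fact table referencing them.
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240101, 100.0), (1, 20240101, 50.0),
                 (2, 20240101, 25.0)])

# Typical dimensional query: aggregate facts grouped by a dimension.
cur.execute("""
SELECT c.name, SUM(f.amount)
FROM fact_sales f
JOIN dim_customer c ON c.customer_key = f.customer_key
GROUP BY c.name
ORDER BY c.name
""")
print(cur.fetchall())  # → [('Acme', 150.0), ('Globex', 25.0)]
```

The same star-schema pattern (facts surrounded by conformed dimensions) is what a Power BI semantic model over Fabric or Synapse would typically consume.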
