A Member of Bank of China Group

Data Engineer, Digital Transformation & Data (DT&D)


Reporting to the Data and Analytics Lead, the successful candidate will be responsible for the following:

Data Management & Transformation

  • Designing and developing new data pipelines and managing existing data pipelines that extract data from various business applications, databases, and external systems.
  • Implementing data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data.
  • Transforming data into the desired format by applying data cleansing, aggregation, filtering, and enrichment techniques.
  • Establishing the governance of data and algorithms used for analysis, analytical applications, and automated decision-making.
  • Managing the logical and physical data models to capture the structure, relationships, and constraints of relevant datasets.
  • Ensuring compliance with security and governance best practices.

Optimisation & Automation

  • Implementing and maintaining continuous integration and continuous delivery (CI/CD) pipelines for deployments and cloud resource provisioning.
  • Optimising data pipelines and data processing workflows for performance, scalability, and efficiency.
  • Optimising models and algorithms to meet data quality, security, governance, performance, and scalability needs.
  • Routinely assessing processor and storage capacity across the data warehouse and extract, transform, and load (ETL) platforms, including capacity planning and forecasting.
  • Monitoring and tuning data systems, identifying and resolving performance bottlenecks and issues, and implementing caching and indexing strategies to enhance query performance.
  • Monitoring platform credit consumption and performing housekeeping.
  • Supporting the deployment and maintenance of AI solutions in the data platform.

Collaboration

  • Working with the data lead and business users to manage data as a business asset.
  • Guiding business users in creating and maintaining reports and dashboards.

Job Requirements:

  • Bachelor’s degree in computer science, data science, software engineering, information systems, or a related quantitative field; a Master’s degree is preferred.
  • At least ten years of work experience in data management disciplines, including data integration, modelling, optimisation, and data quality, or other areas directly relevant to data engineering responsibilities and tasks.
  • At least four years of work experience in designing and implementing data architectures in Azure cloud services and Databricks.
  • Strong proficiency in Python (PySpark) and SQL programming; experience with Java or Scala is an advantage.
  • Experience with relational (SQL) and non-relational (NoSQL) databases is a must; familiarity with legacy databases such as Oracle is preferred.
  • Experience using Azure DevOps and Databricks LakeFlow Jobs for DevOps practices, including version control, CI/CD, and pipeline deployment.
  • Experience with data catalogue tools including Unity Catalog and Microsoft Purview.
  • Familiarity with BI & visualisation tools such as Power BI and Python visual packages for data analytics is preferred.
  • Experience supporting AI/ML model deployment, feature engineering and production inference is preferred.
  • Alignment with and demonstration of BOC Aviation Core Values: Integrity, Teamwork, Accountability, Agility, and Ambition.

If you would like to work in an international environment, please email your CV, with details of expected remuneration and a recent photograph, to:

Chief People Officer
BOC Aviation Limited
79 Robinson Road, #15-01
Singapore 068897
Email: [email protected]
Visit our website for more information: www.bocaviation.com