Data Engineer for Ciklum Digital

Islamabad, Pakistan


Ciklum is a Software Engineering and Solutions Company. Our 3,000+ IT professionals are located in offices and delivery centres in Ukraine, Belarus, Poland and Spain.

As a Ciklum employee, you'll have the unique opportunity to communicate directly with the client when working in Extended Teams. Ciklum is also the place to make your tech ideas tangible: the Vital Signs Monitor for the Children’s Cardiac Center and the Smart Defibrillator, winner of the US IoT World Hackathon, are among the cool things Ciklumers have developed.

Ciklum is a technology partner for Google, Intel, Micron, and hundreds of world-known companies. We are looking forward to seeing you as a part of our team!


Description

On behalf of Ciklum Digital, we are looking for a Data Engineer to join the Islamabad team on a full-time basis.

 

Our customer is one of Denmark’s most well-known and reputable brands, selling goods in the beauty, health and wellness categories for just over DKK 3.5 billion per year.

Responsibilities

  • Build, deploy, and maintain mission-critical analytics solutions that process data quickly at big data scale
  • Contribute design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading across multiple data stores

Requirements

  • 3+ years of experience coding in SQL, Python, and, desirably, Scala, with solid CS fundamentals, including data structures and algorithm design
  • 2+ years of contribution to production deployments of large backend data processing and analysis systems as a team lead
  • 1+ years of hands-on implementation experience with SQL and NoSQL data warehouses and distributed processing engines such as Spark

Desirable

  • 1+ years of experience with Azure cloud data platforms
  • Good understanding of Azure Data Lake and Azure DevOps platform, including CI/CD Pipelines
  • Practical experience automating deployments using CI/CD pipelines
  • Good understanding of DevOps principles
  • Databricks hands-on experience
  • Knowledge of Apache Spark and PySpark
  • Knowledge of professional software engineering best practices across the full software development lifecycle
  • Knowledge of data warehouse design, implementation, and optimization
  • Knowledge of data quality testing, automation, and results visualization
  • Knowledge of BI report and dashboard design and implementation

Personal skills

  • A curious mind and willingness to work with the client in a consultative manner to find areas to improve
  • Upper-Intermediate English or higher
  • Good analytical skills
  • Good team player, motivated to grow and to solve complex tasks
  • Self-motivated, self-disciplined and results-oriented
  • Strong attention to detail and accuracy

What's in it for you

  • Unique working environment where you communicate and work directly with the client
  • A variety of knowledge-sharing, training and self-development opportunities
  • Competitive salary
  • State-of-the-art, centrally located offices with a warm atmosphere that creates really good working conditions