
Expert Data Engineer for Ciklum Digital

India Remote

Big Data & Analytics


On behalf of Ciklum Digital, Ciklum is looking for an Expert Data Engineer to join the UA team on a full-time basis. 

You will join a highly motivated team working on a modern solution for an existing client. We are looking for technology experts who want to make an impact on new business by applying best practices and taking ownership.

About the Client:

The largest fast-fashion group in the world, with revenue of over €20 billion, it operates more than 7,200 stores in 93 markets worldwide. The company owns a large number of global brands.

Project description:

Together with the client, we are building a large data-as-a-service platform that gives the client’s business real-time access to data, so it can manage a group of brands and their analytics effectively, improve customer satisfaction, and predict consumer behaviour to deliver the right goods at the right time and place. As part of the team, you will work with top industry experts to build advanced analytics on terabytes of data.

Technology Stack: Python, Kafka, Azure, Databricks, Snowflake


Responsibilities:
  • Responsible for the building, deployment, and maintenance of mission-critical analytics solutions that process data quickly at big data scales
  • Contributes to design, code, configuration, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading across multiple data stores
  • Owns one or more key components of the infrastructure and works to continually improve it, identifying gaps and improving the platform’s quality, robustness, maintainability, and speed
  • Cross-trains other team members on technologies being developed, while also continuously learning new technologies from other team members
  • Interacts with engineering teams and ensures that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability
  • Performs development, QA, and DevOps roles as needed to ensure end-to-end responsibility for solutions
  • Works directly with business analysts and data scientists to understand and support their use cases
  • Contributes to CoE activities and community building, participates in conferences, and shares best practices
  • Helps with sales activities, customer meetings, and digital services


Requirements:
  • 5+ years of experience coding in Python, SQL, or Scala, with solid CS fundamentals including data structure and algorithm design
  • 3+ years contributing to production deployments of large backend data processing and analysis systems as a team lead
  • 2+ years of hands-on implementation experience working with a combination of the following technologies: Kafka, as well as Hadoop (Pig, Hive, Impala), Spark, Storm, SQL and NoSQL data platforms such as Databricks and/or Snowflake
  • 2+ years of hands-on experience with cloud data platforms – Azure is a must; AWS and GCP are a plus
  • Knowledge of SQL and MPP databases (e.g. Redshift, Azure Synapse)
  • Prior experience managing large-scale data projects
  • Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Knowledge of Data Warehousing design, implementation, and optimization
  • Knowledge of Data Quality testing, automation, and results visualization
  • Knowledge of BI report and dashboard design and implementation
  • Experience participating in an Agile software development team, e.g. SCRUM
  • Experience designing, documenting, and defending designs for key components in large distributed computing systems
  • A consistent track record of delivering exceptionally high-quality software on large, complex, cross-functional projects
  • Demonstrated ability to learn new technologies quickly and independently
  • Ability to handle multiple competing priorities in a fast-paced environment
  • Undergraduate degree in Computer Science or Engineering from a top CS program required; Master’s preferred
  • Experience supporting data scientists and complex statistical use cases highly desirable


Desirable:
  • Understanding of cloud infrastructure design and implementation
  • Experience in data science and machine learning
  • Experience in backend development and deployment
  • Experience in CI/CD configuration
  • Good knowledge of data analysis in enterprises

Personal skills

  • Ability to work independently without supervision
  • Curious mind and willingness to work with the client in a consultative manner to find areas for improvement
  • Upper-Intermediate English
  • Good analytical skills
  • Good team player, motivated to develop and solve complex tasks
  • Self-motivated, self-disciplined and result-oriented
  • Strong attention to detail and accuracy

What's in it for you

  • A Centre of Excellence is ultimately a community that allows you to improve yourself and have fun. Our Centres of Excellence (CoEs) bring together Ciklumers from across the organization to share best practices, support, advice, and industry knowledge, and to build a strong community
  • Close cooperation with the client
  • Dynamic and challenging tasks
  • Ability to influence project technologies
  • Projects from scratch
  • Team of professionals: learn from colleagues and gain recognition of your skills
  • European management style
  • Continuous self-improvement


