The Chief Technical Office is the most innovative unit of Ciklum, providing unique expertise and consultancy in Big Data & Analytics, Research and Development, and DevOps services. We create advanced solutions for the whole of Ciklum and its clients and accumulate world-class expertise, solving real-world problems that impact millions of people in areas such as Artificial Intelligence, Machine Learning, Blockchain, IoT, VR/AR and many others.
Our main principles are:
People over processes and hierarchy.
Flat and open collaboration/communication.
Exploration increases creativity and brings more value to the business.
Investing in people and innovation secures our future.
Reuse and share your experience: develop best practices, publicize them and follow them.
Ciklum CTO Office is looking for a talented Middle Data Engineer in Kyiv to join the big data team.
The team works closely with the R&D Department on innovative solutions for the whole of Ciklum and its clients. The position involves working as an engineer initially responsible for implementing various big data solutions (not limited to the Hadoop stack), on both small and large projects lasting from one to three months. You must possess strong written and verbal English communication skills, have strong customer-facing skills and be able to travel.
Responsibilities:
- Back-end development in Python, or in Scala with Java
- Implementing DWH architecture
- Implementing ETL processes and streaming workloads based on Lambda/Kappa architectures
- Monitoring performance and advising on any necessary infrastructure changes
Requirements:
- 3+ years of Python experience, or 2+ years of Scala with 3+ years of Java
- Understanding of distributed algorithms and data structures
- Understanding of multiprocessing, concurrency and parallelism
- Experienced/advanced Linux user
- Experience with one of AWS, GCP or Azure
- ETL or streaming experience on one of the clouds listed above
- Experience with DWH
- Experience with Data Lakes
- Proficient understanding of distributed computing principles
- Experience building stream-processing systems with technologies such as Kafka, Apache Spark Streaming, Akka Streams, Kafka Streams, Storm or RabbitMQ
- ETL implementation based on the Hadoop/Spark stack and/or Presto, Hive, etc.
- Experience with NoSQL databases, such as HBase
- Experience with RDBMSs such as Oracle, PostgreSQL and MySQL
Desirable:
- Degree in a quantitative scientific, engineering or mathematical discipline (Mathematics, Statistics/Probability, Physics, Electrical Engineering, Experimental Psychology, Chemistry, etc.), with demonstrable deep knowledge of quantitative principles
- Significant experience working closely with business subject matter experts (SMEs) to achieve business outcomes
- Comfortable with travel when needed
- Entrepreneurial by nature, with experience of startup culture
- Readiness for intensive self-education
What's in it for you:
- Realization of your innovative ideas in building new Ciklum Solutions and Services
- Friendly collaborative teams and enjoyable working environment
- Professional skills development and training programmes
- A variety of knowledge-sharing and self-development opportunities
- State-of-the-art, centrally located offices with a warm atmosphere and really good working conditions