Big Data Platform Developer

Job description

Thank You for reading this post! You rock! :)

We are SmartWays – we help people grow and make good career decisions. We specialize in IT recruitment and encourage you to join our IT Talent Community!

We work with international companies across EMEA to drive their business outcomes with best-in-class IT people. People who use their intelligence and passion to inspire their teammates and make an impact on our customers are highly valued by us. Please read this carefully to find out if this offer is for You.


An American investment bank with only 5,000 employees and assets of over $5 trillion is working on its digital transformation journey. They are now looking for a Big Data System Engineer to help with the configuration, architecture design and development of a brand-new Big Data platform based on Cloudera's distribution.

You will:

  • Ensure a comprehensive support process to secure the data lake on the Big Data infrastructure
  • Provide technical leadership on best practices for chosen architecture and technology (Cloudera)
  • Deliver solutions that fulfill business needs and align with the information vision and strategy of enterprise data lake
  • Build, install, upgrade and migrate large-scale big data systems
  • Deliver high-quality solutions by designing and developing ETL logic and data flows
  • Use components of the Cloudera distribution to achieve project objectives: HDFS, MapReduce, Sqoop, Hive, Impala, Solr, Oozie, Spark and more
  • Identify and propose automated approaches for system administration tasks
  • Work as a subject matter expert



You have:

  • Bachelor's degree in Computer Science, a related technical field, or equivalent experience
  • Hands-on experience processing large volumes of structured and unstructured data
  • Minimum 6 years of experience as an IT Developer (designing, developing, testing and implementing)
  • Strong experience supporting Linux servers (all aspects of Linux systems, including hardware, software and applications)
  • A technology stack consisting of: Spark or Hadoop; Java, Scala and/or Python; ETL; NoSQL databases (HBase or MongoDB)
  • Experience with Apache Pig scripting and Apache HiveQL
  • End-to-end development life-cycle support and SDLC processes
  • Agile mindset
  • Ability to work independently combined with openness to cooperating with the team

It will be a great asset if you also have:

  • Knowledge of Python, MapReduce
  • Good understanding of OS concepts, networking, CPU, memory and storage, process management and resource scheduling
  • Cloudera Certified Professional (CCP) or Cloudera Certified Administrator (CCA)
  • Knowledge of Machine Learning libraries and exposure to Data Mining

Once on board, you will get:

  • Opportunities to grow your expertise in a multinational environment where a collaborative model of work is fundamental
  • High-quality benefits such as Multikafeteria vouchers, attractive bonuses, and a budget for conferences, training and certifications
  • Rewarding work with the flexibility to enjoy personal and family experiences at every career stage

Do not hesitate to apply if you like to do things that matter. If you do not see yourself in such a position, spread the word among other professionals who might be a fit!