
Big Data DevOps Consultancy Training Programme in Wrocław at Atos

Date Posted: 11/9/2018


Job Description

Atos SE (Societas Europaea) is a leader in digital services with pro forma annual revenue of circa € 12 billion and circa 100,000 employees in 72 countries. Serving a global client base, the Group provides Consulting & Systems Integration services, Managed Services & BPO, Cloud operations, Big Data & Cyber-security solutions, as well as transactional services through Worldline, the European leader in the payments and transactional services industry. With its deep technology expertise and industry knowledge, the Group works with clients across different business sectors: Defense, Financial Services, Health, Manufacturing, Media, Utilities, Public sector, Retail, Telecommunications, and Transportation.

Atos is focused on business technology that powers progress and helps organizations to create their firm of the future. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and is listed on the Euronext Paris market. Atos operates under the brands Atos, Atos Consulting, Atos Worldgrid, Bull, Canopy, Unify and Worldline.



The Training Programme starts on 27th October and takes place over 4 consecutive weekends at our office in Łódź. Sessions will cover:

  • Internet of Things
  • Big Data, Java
  • Google Cloud
  • Consulting core skills

After the training, the best participants will be offered a job at Atos!

Apply by 22nd October

Key tasks:

  • Management, administration, development and enhancement of systems based on Big Data
  • Building and maintaining Big Data environments, including Cloud environments
  • Supporting automation of deployment, customization, upgrades and monitoring via DevOps tools
  • Creating automation across the Big Data environments
  • Being an active member of an engineering DevOps team collaborating with customers all over the world
  • Creating processes and procedures in the environments
  • Participating in R&D projects related to Cloud computing and Big Data
  • Suggesting new standards and solutions to ensure high quality of delivered services
  • Tracking trends and the latest issues in the domain of conducted projects
  • Creating technical documentation

Key Requirements:

  • More than 1 year of experience with Linux/Unix systems, including installation, configuration, networking, backups, updates and patching; system security is a plus
  • Some experience with open-source Big Data platforms, including Hadoop, HDFS, HBase and Spark
  • Some experience with cloud solutions (e.g. Amazon, Google, Microsoft, Oracle)
  • Knowledge of monitoring systems (e.g. Nagios)
  • Ability to communicate effectively both verbally and in writing
  • Good teamwork and interpersonal skills
  • Openness to learning new technologies
  • Openness to working with Big Data (reliably processing terabytes of data on a daily basis) and Fast Data (processing tens or hundreds of thousands of events per second in a cluster/cloud environment)
  • Good spoken English (at least B2 level)

Nice to have:

  • Experience with Spark Streaming, Kafka, NiFi, Flume, ZooKeeper, Hive, HAWQ, Cassandra, Impala, Scala, Java
  • Knowledge of VMware/Microsoft system administration
  • Knowledge of different types of integrations
  • Knowledge of RedHat, CentOS or Ubuntu
  • Knowledge of application architecture patterns and development patterns
  • Knowledge of Ansible and PostgreSQL
  • Some experience with programming languages (especially Scala, Python, R)
  • Experience with serverless development for Cloud purposes
  • Basic knowledge of Machine Learning
  • Basic knowledge of DWH
  • Participation in open source projects
  • Knowledge of German (at least B1 level)

We take care of your personal data privacy. More information about the processing of your personal data within the recruitment process can be found on our website: