Join us on our exciting journey! IQVIA™ is The Human Data Science Company™, focused on using data and science to help healthcare clients find better solutions for their patients. Formed through the merger of IMS Health and Quintiles, IQVIA offers a broad range of solutions that harness advances in healthcare information, technology, analytics and human ingenuity to drive healthcare forward.
This requirement is for Hadoop experts with administrator skills and a minimum of 8 years of experience.
Excellent understanding of Hadoop architecture (1.x and 2.x) and its components, including HDFS, YARN, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce paradigm.
Provide solutions for cluster capacity planning and create roadmaps for Hadoop cluster deployment.
Responsibilities also include coding within Hadoop ecosystem technologies and assisting other developers.
Experience installing, configuring, supporting, and managing Hadoop clusters using Apache Hadoop, Cloudera Distribution of Hadoop (CDH 5.x, preferred), and Hortonworks Data Platform (HDP 2.x) on Amazon Web Services (AWS).
Good working knowledge of Hadoop security, including Kerberos and Apache Sentry.
Experienced in Cloudera installation, configuration, and deployment on Linux distributions.
Commissioning and decommissioning nodes as required.
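Decommissioning a DataNode is usually done through the NameNode's exclude list rather than by stopping the daemon directly. A minimal sketch, assuming `dfs.hosts.exclude` already points at the exclude file; the file path and hostname below are placeholders:

```shell
# Add the node to the HDFS exclude file (path is illustrative)
echo "worker-node-07.example.com" >> /etc/hadoop/conf/dfs.exclude

# Ask the NameNode to re-read its include/exclude lists;
# HDFS re-replicates the node's blocks before retiring it
hdfs dfsadmin -refreshNodes

# Check decommissioning progress in the datanode report
hdfs dfsadmin -report
```

The node can be safely shut down only once the report shows its state as "Decommissioned".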
Managing and monitoring Hadoop services such as the NameNode, DataNodes, and YARN.
Experienced in loading data into the cluster from dynamically generated files using Flume, from RDBMS sources using Sqoop, and from the local file system.
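A typical RDBMS-to-HDFS load of the kind described above can be sketched with a single Sqoop import; the JDBC URL, credentials, table, and target directory below are all illustrative:

```shell
# Illustrative Sqoop import from a relational table into HDFS
# (connection string, user, table, and paths are placeholders)
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4
```

The `--num-mappers` flag controls how many parallel map tasks split the table, which is the main knob for import throughput.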
Performance tuning and resolving Hadoop issues via the CLI or the web UI.
Troubleshooting Hadoop cluster runtime errors and putting fixes in place so they do not recur.
Accountable for storage and volume management of Hadoop clusters.
Ensuring the Hadoop cluster is up and running at all times (e.g., through high-availability configurations).
Evaluating Hadoop infrastructure requirements and designing/deploying solutions.
Backup and recovery tasks, including creating snapshot policies and backup schedules and recovering from node failures.
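HDFS snapshots are the usual building block for the backup tasks above. A minimal sketch, with illustrative directory and snapshot names:

```shell
# Enable snapshots on a directory (one-time, admin operation)
hdfs dfsadmin -allowSnapshot /data/warehouse

# Take a dated snapshot; the naming scheme here is just an example
hdfs dfs -createSnapshot /data/warehouse daily-$(date +%Y%m%d)

# Recover a deleted file by copying it back out of the
# read-only .snapshot directory (snapshot name is a placeholder)
hdfs dfs -cp /data/warehouse/.snapshot/daily-20240101/part-00000 /data/warehouse/
```

Snapshots are cheap (copy-on-write metadata only), so a scheduled policy can keep several days of them without duplicating block data.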
Working experience installing the various components and daemons of the Hadoop ecosystem.
Responsible for configuring alerts for the different services running in the Hadoop ecosystem.
Moving data from one cluster to another.
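Inter-cluster copies like this are typically done with DistCp, which runs the copy as a MapReduce job; the NameNode hostnames and paths below are placeholders:

```shell
# Illustrative cluster-to-cluster copy with DistCp
# (-update copies only files that changed; -skipcrccheck is
# sometimes needed when block sizes differ between clusters)
hadoop distcp \
  -update -skipcrccheck \
  hdfs://source-nn.example.com:8020/data/warehouse \
  hdfs://dest-nn.example.com:8020/data/warehouse
```

Because DistCp parallelizes across mappers, it scales to large datasets far better than a manual get/put through a single client.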
Working knowledge of, and experience with, a wide range of big data components such as HDFS, MapReduce, Sqoop, Flume, Pig, Hive, HBase, etc.
We know that meaningful results require not only the right approach but also the right people. Regardless of your role, we invite you to reimagine healthcare with us. You will have the opportunity to play an important part in helping our clients drive healthcare forward and ultimately improve human health outcomes. Whatever your career goals, we are here to ensure you get there! We invite you to join IQVIA™
Job ID: R1058791