
Job ID : 14366

Job Title : Hadoop Admin

Location : Maryland Heights, MO

Company Name : Global Force USA

Industry : Information Technology

Salary : $76,000 - $119,000 per year

Job Type : Contractor

Posted on: 2020-06-18

Required Skills : Amazon S3, Amazon Web Services, Ansible, Apache HBase, Apache Hadoop, Apache Hive

Benefits : No benefits are available

Job Description: Design, develop, automate, implement, and support big data clusters using components including Hadoop (HDFS), YARN, Spark, Kafka, HBase, NoSQL, authorization, and Kerberos authentication to enhance data analytics capabilities.

Duties & Responsibilities:

  • Design, develop, automate, and implement big data clusters using Hadoop, YARN, Hive, ZooKeeper, Kafka, and NoSQL components.
  • Perform platform administration and automation of Hadoop and Kafka, including installation, maintenance, and configuration.
  • Perform troubleshooting and resolution management, and provide support to the customer, users, and technical teams.
  • Resolve issues related to development, operations, implementations, and system status.
  • Research and recommend options for department direction on Big Data management. Manage and maintain all production and non-production Hadoop and Kafka clusters and their infrastructure.
  • Develop runbooks and automation using Ansible, shell scripts, and Python.
  • Review, develop, and conduct walkthroughs of Java and Scala code to implement best practices and tuning.
  • Support multiple clusters of medium to large complexity with multiple concurrent users, ensuring control, integrity, and accessibility of data.
  • Create and maintain standard operating procedures and templates for cluster user access.
  • Design and implement a toolset that simplifies provisioning and support of a large cluster environment.
  • Enable and configure Kerberos for Hadoop components and implement enterprise security for Hadoop and Kafka.
  • Enable data encryption at rest and in transit with TLS/SSL to meet security standards.
  • Perform system backups and coordinate with the infrastructure team to ensure storage and rotation of backups.
  • Onboard Big Data tenants and enable Sentry for role-based access control (RBAC) to provide privilege-level access to data in HDFS/Hive/Kafka per security policies.
  • Perform cluster maintenance, including the creation and removal of nodes.
  • Design and implement a backup and disaster recovery strategy.
  • Participate in new tool discovery, technical deep-dive sessions, and proof-of-concept (POC) development with prospects.
  • Monitor Hadoop cluster job performance; manage, troubleshoot, and review Hadoop log files.
  • Utilize expertise in technologies and tools, such as Kafka, Hadoop, Spark, and other storage systems as well as other cutting-edge tools and applications in Big Data space.
  • Performance-tune Big Data components, including Hive queries, and address performance issues related to schedulers and YARN.
  • Participate in continuous performance improvement sessions to discuss opportunities to improve processes and standards.
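The monitoring duties above typically start from the output of `hdfs dfsadmin -report`. As a minimal illustrative sketch (the sample report text, thresholds, and function names here are assumptions for illustration, not part of this posting), a runbook-style Python check might parse that report and flag dead DataNodes:

```python
import re

# Illustrative excerpt of `hdfs dfsadmin -report` output; a real runbook
# would capture this via subprocess from a live cluster instead.
SAMPLE_REPORT = """\
Configured Capacity: 5278190486824 (4.80 TB)
Present Capacity: 5021810993881 (4.57 TB)
DFS Remaining: 4060926107648 (3.69 TB)
DFS Used: 960884886233 (894.87 GB)
Live datanodes (3):
Dead datanodes (1):
"""

def parse_node_counts(report: str) -> dict:
    """Extract live/dead DataNode counts from a dfsadmin-style report."""
    counts = {}
    for state in ("Live", "Dead"):
        m = re.search(rf"{state} datanodes \((\d+)\):", report)
        counts[state.lower()] = int(m.group(1)) if m else 0
    return counts

def cluster_healthy(report: str, max_dead: int = 0) -> bool:
    """The check passes when dead nodes do not exceed the threshold."""
    return parse_node_counts(report)["dead"] <= max_dead

print(parse_node_counts(SAMPLE_REPORT))   # {'live': 3, 'dead': 1}
print(cluster_healthy(SAMPLE_REPORT))     # False: one dead DataNode
```

In practice a check like this would be wired into an Ansible playbook or cron job and alert on a failing threshold rather than printing.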


Required Skills/Qualifications:

  • Education: Master’s or Bachelor’s degree (or foreign equivalent) in Computer Science, IT, MIS, or a closely related field.
  • 4+ years of experience in handling large-scale distributed platforms or integration projects or enterprise applications.
  • Must have 2 years of experience with Big Data technologies and components such as Hadoop, Kafka, MapReduce, YARN, Sqoop, Hive, ZooKeeper, NoSQL, HBase, NiFi, etc.
  • Experience performance tuning Big Data components, including Hadoop (HDFS), YARN, Spark, Kafka, HBase, and NoSQL
  • Experience enabling and configuring Kerberos for Hadoop components and implementing enterprise security for Hadoop and Kafka
  • Experience developing automation scripts using Unix shell scripting, Python, or Ansible
  • Experience with storage systems such as AWS S3 or Isilon OneFS is a big plus
  • Ability to work seamlessly within a team as well as manage individual tasks
  • Creative and abstract thinking skills to envision and design innovative solutions to business opportunities and challenges
  • Ability to listen and evaluate all opinions without bias, and contribute to a common culture of excellence
  • Extensive technical knowledge of Information Technology field and computer systems
  • Strong communication skills (written, interpersonal, presentation), with the ability to easily and effectively interact and negotiate with business stakeholders
  • Strong ability to pick up complex concepts and processes quickly
  • Proven leadership abilities including the ability to share knowledge, resolve conflict and create consensus
  • Demonstrated ability to take the lead on the most complex projects



We’re an equal opportunity provider.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or disability status. Global Force USA is not a consulting company, training company, or H-1B sponsor.
