Copyright OPTnation. All rights reserved.

Hadoop Developer

Job ID : 38628

Job Title : Hadoop Developer

Location : Hoboken, NJ

Company Name : Twelve Labs

Job Type : Full-Time, Part-Time, Contract, Training

Industry : Information Technology

Salary : $25,000 - $35,000 per year

Work Authorization : OPT, CPT, Entry Level, F1, H4, L1, H1 Visa, TN Permit Holder, Green Card Holder, Canadian Citizen, All EAD, US Citizen

No. of Positions : Ongoing need to fill this role

Posted on : 01-27-2025

Required Skills : Programming languages, Problem-solving, Communication skills.

Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance

Job Description :

Responsibilities:

  • Design, implement, and optimize distributed data pipelines using Hadoop components like HDFS, MapReduce, Hive, Pig, and Spark.
  • Work with data ingestion tools such as Flume, Sqoop, and Kafka to ensure efficient data transfer from various sources into Hadoop clusters.
  • Write and optimize complex queries in HiveQL, Spark SQL, and other Big Data frameworks to extract, manipulate, and analyze data at scale.
  • Build and maintain ETL pipelines to process structured and unstructured data from internal and external sources.
  • Monitor and tune the performance of the Hadoop ecosystem, ensuring scalability, stability, and high availability of data platforms.
  • Troubleshoot issues in the Hadoop environment and optimize jobs for maximum efficiency.
  • Utilize Spark for distributed data processing and integrate it with Hadoop to achieve faster data analysis.
  • Leverage other Big Data tools such as HBase, Cassandra, and Kafka to handle real-time data processing.
  • Work closely with data scientists, data analysts, and other stakeholders to understand data needs and develop solutions accordingly.
  • Maintain clear documentation of data pipelines, algorithms, and system architecture.
  • Ensure that data management and storage processes comply with company policies, security protocols, and regulatory requirements.
  • Implement data security and access controls within the Hadoop ecosystem.
  • Keep up to date with new technologies, tools, and techniques in the Big Data space and propose improvements to the existing architecture.
  • Contribute to the implementation of best practices and standards for data processing and management.
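To illustrate the MapReduce model referenced in the responsibilities above, here is a minimal, hypothetical sketch in plain Python. It simulates the map and reduce phases of a word-count job locally; it is not an actual Hadoop job, and the function names (`map_phase`, `reduce_phase`, `word_count`) are illustrative only:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Mapper: emit a (word, 1) pair for each word in the input line,
    # mirroring what a Hadoop mapper would write to intermediate output.
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    # Reducer: group intermediate pairs by key and sum their counts,
    # mirroring the shuffle-and-reduce step of a MapReduce job.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def word_count(lines):
    # Run the map phase over every line, then reduce all pairs at once.
    return reduce_phase(chain.from_iterable(map_phase(l) for l in lines))

# Example: word_count(["big data", "big clusters"])
# yields {"big": 2, "data": 1, "clusters": 1}
```

In a real Hadoop deployment, the same map and reduce logic would be distributed across the cluster by the framework, with HDFS holding the input and output data.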

Key Skills:

  • Programming languages.
  • Problem-solving.
  • Communication skills.
  • Strong analytical skills.
