

Job ID : 38628

Location : Hoboken, NJ

Company Name : Twelve Labs

Job Type : Full-Time, Part-Time, Contract, Training

Industry : Information Technology

Salary : $25,000 - $35,000 per year

No. of Positions : Ongoing need; multiple openings


Required Skills : Programming languages, Problem-solving, Communication skills

Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance


Job Description :

Responsibilities:

  • Design, implement, and optimize distributed data pipelines using Hadoop components like HDFS, MapReduce, Hive, Pig, and Spark.
  • Work with data ingestion tools such as Flume, Sqoop, and Kafka to ensure efficient data transfer from various sources into Hadoop clusters.
  • Write and optimize complex queries in HiveQL, Spark SQL, and other Big Data frameworks to extract, manipulate, and analyze data at scale.
  • Build and maintain ETL pipelines to process structured and unstructured data from internal and external sources.
  • Monitor and tune the performance of the Hadoop ecosystem, ensuring the scalability, stability, and high availability of data platforms.
  • Troubleshoot issues in the Hadoop environment and optimize jobs for maximum efficiency.
  • Utilize Spark for distributed data processing and integrate it with Hadoop to achieve faster data analysis.
  • Leverage other Big Data tools such as HBase, Cassandra, and Kafka to handle real-time data processing.
  • Work closely with data scientists, data analysts, and other stakeholders to understand data needs and develop solutions accordingly.
  • Maintain clear documentation of data pipelines, algorithms, and system architecture.
  • Ensure that data management and storage processes comply with company policies, security protocols, and regulatory requirements.
  • Implement data security and access controls within the Hadoop ecosystem.
  • Keep up to date with new technologies, tools, and techniques in the Big Data space and propose improvements to the existing architecture.
  • Contribute to the implementation of best practices and standards for data processing and management.
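For candidates new to the MapReduce model named in the responsibilities above, here is a minimal, self-contained Python sketch of the pattern. It only illustrates the map → shuffle → reduce flow on a word count; the real Hadoop API (mappers, reducers, and HDFS I/O) differs, and the function names here are illustrative, not part of any Hadoop library:

```python
from collections import defaultdict

def map_phase(records):
    # Map step: emit (word, 1) pairs, as a Hadoop mapper would.
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle step: group all values by key across mapper output.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce step: aggregate each key's values (here, a word count).
    return {key: sum(values) for key, values in groups.items()}

lines = ["Hadoop and Spark", "Spark SQL and HiveQL"]
counts = reduce_phase(shuffle(map_phase(lines)))
```

On a cluster, the shuffle is performed by the framework across nodes rather than in a single in-memory dictionary, but the key-grouping contract is the same.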

Key Skills:

  • Programming languages.
  • Problem-solving.
  • Communication skills.
  • Strong analytical skills.



We’re an equal opportunity provider.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.
OPTnation.com is not a Consulting Company/Training Company/H1B Sponsor.
