Copyright OPTnation. All rights reserved.

Big Data Engineer

Job ID : 34736

Job Title : Big Data Engineer

Location : Los Angeles, CA

Company Name : Dart Point

Job Type : Full-Time, Part-Time, Contract, Training

Industry : Information Technology

Salary :  $41,000 - $55,000 per year

Work Authorization : OPT, CPT, Entry Level, F1, H4, L1, H1 Visa, TN Permit Holder, Green Card Holder, Canadian Citizen, All EAD, US Citizen

No. of Positions : Ongoing need to fill this role

Posted on : 09-18-2023

Required Skills : Apache Spark, Data mining, Modeling

Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance

Job Description :

Responsibilities:

  • Develop end-to-end solutions related to Big Data and Data Science.
  • Develop and maintain scalable data pipelines that ingest, transform, and distribute data streams and/or batches within the AWS, Snowflake, and Microsoft platforms.
  • Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
  • Work in a self-motivated, self-directed manner, with strong problem-solving skills.
  • Support and decommission legacy platforms.
  • Enable business strategy, lean processes, increased data velocity, and insights.
  • Embody a culture of continuous innovation and learning.
  • Adhere to programming/development standards and governance framework.
  • Collaborate with business teams, analytical teams, and data scientists to improve efficiency, increase the applicability of predictive models, and help translate ad-hoc analyses into scalable data delivery solutions.
  • Consult on data ingestion, data modeling, security, and capabilities.
  • Manage the innovation cycle of conducting analyses and generating insights.
  • Assist with the selection and management of consultants and vendors.
  • Assist with the recruitment and development of talent.
  • Collaborate with DevOps team to integrate innovations and algorithms into a production system.
  • Support business decisions with ad hoc analysis as needed.
  • Work with the DevOps team to create and manage deployment workflows for all scripts and code using Microsoft Azure.
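The ingest → transform → distribute pattern named in the pipeline responsibilities above can be sketched in plain Python (a Spark job would follow the same shape; every function and field name here is an illustrative assumption, not something specified in the posting):

```python
# Minimal batch-pipeline sketch: ingest -> transform -> distribute.
# All names (user_id, amount, etc.) are illustrative assumptions.

def ingest(raw_rows):
    """Parse raw CSV-like strings into records (the 'E' of ETL)."""
    for line in raw_rows:
        user_id, amount = line.split(",")
        yield {"user_id": user_id, "amount": float(amount)}

def transform(records):
    """Aggregate amounts per user (the 'T' of ETL)."""
    totals = {}
    for rec in records:
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0.0) + rec["amount"]
    return totals

def distribute(totals):
    """Emit rows in the shape a downstream sink (e.g. a warehouse load) expects."""
    return sorted(totals.items())

batch = ["u1,10.0", "u2,5.5", "u1,2.5"]
print(distribute(transform(ingest(batch))))  # [('u1', 12.5), ('u2', 5.5)]
```

In a real Spark deployment each stage would operate on a distributed DataFrame rather than an in-memory dict, but the staged structure is the same.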

Key Skills:

  • Databases and SQL.
  • ETL and data warehousing.
  • Talend, IBM DataStage, Pentaho, and Informatica.
  • Operating system knowledge for Unix, Linux, Windows, and Solaris.
  • Hadoop.
  • Apache Spark.
  • Data mining and modeling.
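As a small, self-contained flavor of the database/SQL and warehousing skills listed above, a sketch using Python's built-in sqlite3 module (the table and column names are assumptions made for the example, not part of the role):

```python
import sqlite3

# Toy warehouse-style aggregation: load rows, then summarize with SQL.
# Table and column names are assumptions for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("west", 100.0), ("east", 50.0), ("west", 25.0)],
)
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 50.0), ('west', 125.0)]
conn.close()
```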
