Copyright OPTnation. All rights reserved.

Data Engineer

Job ID : 39002

Job Title : Data Engineer

Location : Los Angeles, CA

Company Name : Atechstar

Job Type : Full-Time, Part-Time, Contract, Training

Industry : Information Technology

Salary : $500,000 - $900,000 per year

Work Authorization : ["OPT","CPT","Entry Level","F1","H4","L1","H1 Visa","Green Card Holder","Canadian Citizen","All EAD","US Citizen"]

No. of Positions : Ongoing need to fill this role

Posted on : 03-06-2025

Required Skills : Azure cloud services, AutoSys, Scala

Benefits : Medical Insurance, Dental Insurance, Vision Insurance, Life Insurance

Job Description :

Responsibilities:

  • Write clean, maintainable, and efficient Python code for backend services or applications. 
  • Develop RESTful APIs or work on web applications using frameworks like Django or Flask. 
  • Implement data extraction, transformation, and loading (ETL) processes using Python. 
  • Collaborate with front-end developers and other team members to ensure seamless integration. 
  • Test and debug applications to ensure they meet quality and performance standards.
  • Participate in code reviews and contribute to the development of coding standards.
  • Keep up-to-date with Python libraries and tools relevant to the project.
  • Develop new data pipelines using Databricks notebooks and Azure Data Factory to ingest and process data efficiently, ensuring reliability and scalability.
  • Utilize Databricks and Delta tables to optimize the performance of both new and existing data processing jobs, aiming to reduce operational costs and improve efficiency.
  • Maintain the data platform focusing on process monitoring, troubleshooting, and data readiness, ensuring high-quality data for regular reporting and system optimization.
  • Work with other data engineers to design and implement enhancements to the overall data platform, improving functionality and performance.
  • Effectively collaborate with operations, product management, and other departments to gather requirements, troubleshoot issues, and design system enhancements within an Agile Scrum framework.
  • Participate in on-call support, addressing and resolving production issues as they arise, and coordinate with stakeholders to ensure continuous system operation.
  • Ensure a smooth transition of developed data pipelines to the L2 Support team for post-production maintenance, reducing the need for escalations.
  • Work independently on end-to-end implementation of data projects, from development through to deployment, within the Agile Scrum framework, demonstrating self-reliance and initiative.
  • Use DataFrame or PySpark operations to extract data from Azure Delta Lake, creating reports that support business decisions and meet client needs.
  • Actively engage in release activities, coordinating with cloud engineering teams for necessary infrastructure requirements.
  • Efficiently onboard new team members to the data cloud platform, organizing and granting access to ensure they can fully utilize the data platform for their work.
  • Strategically manage and integrate third-party data sources to complement and enhance our proprietary POS data, maximizing data value and insights.
  • Actively explore and evaluate new technologies or features through proof of concept (POC) and proof of value (POV) projects, driving innovation and technological advancement.
  • Build or improve data pipelines focusing on compliance, ensuring adherence to GDPR, CCPA, and other relevant regulations, and safeguarding data privacy and security.
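The ETL responsibility above (extract, transform, and load data using Python) can be sketched minimally with the standard library. The field names, file formats, and cleaning rules here are illustrative assumptions, not details from the posting:

```python
import csv
import io
import json

def extract(csv_text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types and drop incomplete records."""
    cleaned = []
    for row in rows:
        if not row.get("amount"):
            continue  # skip records missing a required field
        cleaned.append({
            "store": row["store"].strip().upper(),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows):
    """Load: serialize to JSON lines for downstream ingestion."""
    return "\n".join(json.dumps(r) for r in rows)

raw = "store,amount\nlax-01,19.5\nlax-02,\nsfo-01,7.25\n"
result = load(transform(extract(raw)))
```

In a production pipeline the extract and load steps would read from and write to real storage (e.g. blob storage or a warehouse table), but the three-stage shape stays the same.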
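The monitoring and data-readiness duties above can be illustrated with a simple pre-reporting check: verify row counts and null rates before releasing a batch. The thresholds and field names below are hypothetical:

```python
def readiness_report(rows, required_fields, min_rows=1):
    """Check a batch of records before releasing it for reporting.

    Returns a dict of issues; an empty dict means the batch is ready.
    Thresholds and field names are illustrative, not from the posting.
    """
    issues = {}
    if len(rows) < min_rows:
        issues["row_count"] = f"expected >= {min_rows}, got {len(rows)}"
    for field in required_fields:
        missing = sum(1 for r in rows if r.get(field) in (None, ""))
        if missing:
            issues[f"nulls:{field}"] = f"{missing} of {len(rows)} rows missing"
    return issues

batch = [
    {"order_id": 1, "total": 9.99},
    {"order_id": 2, "total": None},
]
report = readiness_report(batch, required_fields=["order_id", "total"])
```

A check like this would typically run as the last step of a pipeline and fail the job (or page on-call) when the returned dict is non-empty.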
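Compliance-focused pipelines (GDPR, CCPA) commonly pseudonymize personal data before it lands downstream. A minimal sketch, assuming salted hashing is the chosen technique and with hypothetical field names:

```python
import hashlib

PII_FIELDS = {"email", "phone"}  # illustrative; real lists come from a data catalog

def pseudonymize(record, salt="rotate-me"):
    """Replace PII values with salted SHA-256 digests so records can
    still be joined on the pseudonym but not read back as plain text."""
    out = {}
    for key, value in record.items():
        if key in PII_FIELDS and value is not None:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            out[key] = digest[:16]  # truncated for readability
        else:
            out[key] = value
    return out

row = {"order_id": 42, "email": "jane@example.com", "total": 19.5}
masked = pseudonymize(row)
```

Because the hash is deterministic for a given salt, the pseudonym still supports joins and deduplication; rotating the salt severs linkage when a retention window expires.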

Key skills:

  • Azure cloud services
  • AutoSys
  • Scala
  • Delta Lake
  • PySpark
