AWS Data Engineer
Job ID : 36294
Job Title : AWS Data Engineer
Location : Dallas, TX
Company Name : Sabioinfotech
Job Type : Full-Time, Contract
Industry : Information Technology
Salary : $65 - $70 per hour
Work Authorization : OPT, CPT, H4, L1, H1 Visa, Green Card Holder, All EAD, US Citizen
No. of Positions : Ongoing need to fill this role
Posted on : 02-28-2024
Required Skills : AWS, Snowflake, Lambda, Glue, etc.
Benefits : None of These
Job Description :
Title: Sr Data Engineer
Location: Dallas, TX
JD:
• Extensive experience in data migration is a must (Teradata to Redshift preferred)
• Extensive testing experience with SQL/Unix/Linux scripting is a must
• Extensive experience testing cloud/on-prem ETL tools (e.g. Ab Initio, Informatica, SSIS, DataStage, Alteryx, Glue)
• Extensive experience with DBMSs such as Oracle, Teradata, SQL Server, DB2, Redshift, Postgres, and Sybase
• Extensive experience with Python scripting and AWS cloud technologies
• Extensive experience using Athena, EMR, Redshift, and other AWS cloud services
• Experienced in large-scale application development testing – cloud/on-prem data warehouse, data lake, data science
• Experience with multi-year, large-scale projects
• Expert technical skills with hands-on testing experience using SQL queries.
• Extensive experience with both data migration and data transformation testing
• API/RestAssured automation, building reusable frameworks, and good technical expertise/acumen
• Java/JavaScript – implement core Java, integration, and API work
• Functional/UI/Selenium – BDD/Cucumber, SpecFlow, data validation/Kafka, Big Data; also automation experience using Cypress
• AWS/Cloud – Jenkins/GitLab/EC2, S3, and building Jenkins CI/CD pipelines; SauceLabs
• API/REST API – REST APIs and microservices using JSON, SoapUI
• Extensive experience in the DevOps/DataOps space
• Strong experience in working with DevOps and build pipelines.
• Strong experience with AWS data services including Redshift, Glue, Kinesis, Kafka (MSK), EMR/Spark, SageMaker, etc.
• Experience with technologies like Kubeflow, EKS, Docker
• Extensive experience with NoSQL and unstructured data stores such as MongoDB, Cassandra, Redis, and ZooKeeper
• Extensive experience with MapReduce using tools like Hadoop, Hive, Pig, Kafka, S4, MapR
• Experience using Jenkins and Gitlab
• Experience using both Waterfall and Agile methodologies.
• Experience in testing storage tools like S3, HDFS
• Experience with one or more industry-standard defect or Test Case management Tools
• Great communication skills (regularly interacts with cross-functional team members)
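The data migration and transformation testing the role calls for typically boils down to reconciling a source table against its migrated target. A minimal sketch of that kind of check, using sqlite3 as a stand-in for the actual source/target databases (the `orders` table, its columns, and the function name are hypothetical):

```python
import sqlite3

def migration_checks(src_conn, tgt_conn, table):
    """Basic migration validation: compare row counts, then compare
    row contents keyed on the first column (assumed unique key)."""
    count_sql = f"SELECT COUNT(*) FROM {table}"
    src_count = src_conn.execute(count_sql).fetchone()[0]
    tgt_count = tgt_conn.execute(count_sql).fetchone()[0]
    counts_match = src_count == tgt_count

    # Order-independent content comparison: key -> remaining columns.
    select_sql = f"SELECT * FROM {table}"
    src_rows = {r[0]: r[1:] for r in src_conn.execute(select_sql)}
    tgt_rows = {r[0]: r[1:] for r in tgt_conn.execute(select_sql)}
    mismatched = [k for k in src_rows if tgt_rows.get(k) != src_rows[k]]
    return counts_match, mismatched

# Demo: two in-memory databases playing source and target.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 3.0)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 4.0)])

counts_match, bad_keys = migration_checks(src, tgt, "orders")
# counts_match is True (both tables have 2 rows);
# bad_keys is [2] because the amount for id 2 differs.
```

In a real Teradata-to-Redshift migration the same idea is usually applied with per-column checksums or hash aggregates rather than a full row pull, but the count-then-diff structure is the same.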