Job ID : 36294
Location : Dallas, TX
Company Name : Sabioinfotech
Job Type : Full-Time, Contract
Industry : Information Technology
Salary : $65 - $70 per hour
No. of Positions : Ongoing need to fill this role
Required Skills : AWS, Snowflake, Lambda, Glue, etc.
Benefits : None of These
Title: Sr Data Engineer
Location: Dallas, TX
JD:
• Extensive experience in data migration is a must (Teradata to Redshift preferred)
• Extensive testing experience with SQL and Unix/Linux scripting is a must
• Extensive experience testing cloud/on-prem ETL tools (e.g., Ab Initio, Informatica, SSIS, DataStage, Alteryx, Glue)
• Extensive experience with DBMSs such as Oracle, Teradata, SQL Server, DB2, Redshift, Postgres, and Sybase
• Extensive experience with Python scripting and AWS/cloud technologies
• Extensive experience using Athena, EMR, and Redshift
• Experienced in large-scale application development testing – cloud/on-prem data warehouse, data lake, and data science
• Experience with multi-year, large-scale projects
• Expert technical skills with hands-on testing experience using SQL queries.
• Extensive experience with both data migration and data transformation testing
• API/RestAssured automation, building reusable frameworks, and good technical expertise/acumen
• Java/JavaScript – implement core Java, integration, and APIs
• Functional/UI/Selenium – BDD/Cucumber, SpecFlow, data validation/Kafka, Big Data; automation experience using Cypress
• AWS/Cloud – Jenkins, GitLab, EC2, S3; building Jenkins CI/CD pipelines; Sauce Labs
• API/REST API – REST APIs and microservices using JSON, SoapUI
• Extensive experience in the DevOps/DataOps space
• Strong experience in working with DevOps and build pipelines.
• Strong experience with AWS data services including Redshift, Glue, Kinesis, Kafka (MSK), EMR/Spark, SageMaker, etc.
• Experience with technologies like Kubeflow, EKS, Docker
• Extensive experience with NoSQL and unstructured data stores such as MongoDB, Cassandra, Redis, and ZooKeeper
• Extensive experience in MapReduce using tools like Hadoop, Hive, Pig, Kafka, S4, and MapR
• Experience using Jenkins and Gitlab
• Experience using both Waterfall and Agile methodologies.
• Experience in testing storage tools like S3, HDFS
• Experience with one or more industry-standard defect or Test Case management Tools
• Great communication skills (regularly interacts with cross-functional team members)
We’re an equal opportunity provider.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.
OPTnation.com is not a Consulting Company/Training Company/H1B Sponsor.