

Job ID : 34881

Location : Chicago, IL

Company Name : codesbright

Job Type : Full-Time, Part-Time, Contract, Training

Industry : Information Technology

Salary : $63,000 - $70,000 per year

No. of Positions : Multiple (ongoing need to fill this role)


Required Skills : SQL, data storage techniques, Big Data technologies

Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance


Job Description :

Responsibilities:

  • Analyze data from multiple applications (Salesforce Sales & Marketing Cloud, Oracle HR, CANSPAM, RCS, DNR, FFS, FAS, iRACDB2, GDW, SFMC, Data Platform, RECVUE, CARISMA) using Informatica Analyst.
  • Perform data profiling and scorecarding, identify data gaps, interact with Data Stewards and business SMEs, and apply business and data transformation rules (see the profiling sketch after this list).
  • Perform data cleansing, validation, and standardization, including customer address validation with Address Doctor, and migrate the cleansed data to Salesforce (Sales & Marketing Cloud), AWS S3 buckets, the Data Platform Hive DB, MuleSoft, and downstream applications using Informatica PowerCenter on AWS and Informatica Intelligent Cloud Services.
  • Prepare ETL technical specifications, identify gaps, have them reviewed by Data Architects and business SMEs, and obtain proper sign-off before developer handover.
  • Use the Informatica PowerExchange connector to read DB2 mainframe sales system data and load it into Salesforce (Sales & Marketing Cloud) via the Salesforce connector.
  • Perform bulk data loads and deletions on Salesforce objects using automated Informatica ETL jobs.
  • Understand GDW Teradata BTEQ scripts, enhance them for new requirements, provide the required Salesforce object names and data types in the Salesforce Agile Accelerator change list, and coordinate with the Salesforce team.
  • Handle all third-party source connectivity via the MFT mount point: automate file pickup and drop from/to the MFT server and write data to various AWS S3 buckets and the Data Platform Hive DB.
  • Develop ETL code, unit test it, have it peer-reviewed by Data Architects, hand it over to the QA team, and deploy it across environments; prepare the technical design document and provide post-production and warranty support.
  • Analyze defects, identify root causes, apply code fixes, test, redeploy, and properly close defects in HP ALM.
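
As a rough illustration of the profiling work this role involves (table and column names here are hypothetical, not from the posting), a plain-SQL equivalent of a basic Informatica Analyst profile might look like:

    -- Hypothetical profiling query: null rate, distinct count, and a
    -- rough format check on a customer address column before cleansing.
    SELECT
        COUNT(*)                        AS total_rows,
        COUNT(mailing_address)          AS non_null_rows,
        COUNT(DISTINCT mailing_address) AS distinct_values,
        SUM(CASE WHEN postal_code NOT LIKE '_____'  -- not exactly 5 characters
                 THEN 1 ELSE 0 END)     AS suspect_zip_rows
    FROM customer_staging;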

Requirements:

  • Expert in designing, coding, and testing data warehouse programs using Informatica PowerCenter, Informatica Developer Tool, and Informatica PowerExchange.
  • Experience in data profiling, data cleansing and standardization, and building reusable logic.
  • Able to read and write data from/to various databases and applications such as Teradata, SQL Server, DB2, Oracle, Salesforce, AWS S3, and DP Hive.
  • Expert in writing complex SQL and PL/SQL blocks for Sybase, Oracle, MSSQL, and UDB/DB2 (a short PL/SQL sketch follows this list).
  • Experience in design reviews and extensive documentation of standards, best practices, and ETL procedures.
  • Evaluate all functional requirements and mapping documents, and troubleshoot all development processes.
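
As a minimal sketch of the PL/SQL work mentioned above (the table and column are hypothetical), an anonymous block that standardizes a free-text column before a downstream load might look like:

    -- Hypothetical PL/SQL block: trim and upper-case state codes in a
    -- staging table, then report how many rows were touched.
    DECLARE
        v_updated PLS_INTEGER := 0;
    BEGIN
        UPDATE customer_staging
           SET state_code = UPPER(TRIM(state_code))
         WHERE state_code IS NOT NULL
           AND state_code <> UPPER(TRIM(state_code));
        v_updated := SQL%ROWCOUNT;
        DBMS_OUTPUT.PUT_LINE('Rows standardized: ' || v_updated);
        COMMIT;
    END;
    /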

Key Skills:

  • SQL competence.
  • Understanding of data modeling concepts.
  • Knowledge of at least one ETL tool.
  • Knowledge of different SQL/NoSQL data storage techniques and Big Data technologies (see the Hive sketch below).
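
On the Big Data side, a minimal HiveQL sketch (database, table, and partition values are hypothetical) of loading cleansed records into a partitioned Data Platform table:

    -- Hypothetical HiveQL: write cleansed rows into a date-partitioned table.
    INSERT OVERWRITE TABLE dp.customer_clean
    PARTITION (load_dt = '2024-01-01')
    SELECT customer_id,
           UPPER(TRIM(state_code)) AS state_code,
           mailing_address
    FROM   staging.customer_raw
    WHERE  mailing_address IS NOT NULL;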


We’re an equal opportunity provider.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.
OPTnation.com is not a Consulting Company/Training Company/H1B Sponsor.
