Job ID : 28802
Location : New York, NY
Company Name : Atechstar
Job Type : Full-Time, Part-Time, Contract, Training
Industry : Information Technology
Salary : $358,900 - $499,000 per year
No. of Positions : Ongoing need to fill this role
Required Skills : Snowflake / Redshift, SQL Server, PostgreSQL, AWS
Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance
Job Description :
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure big data technologies.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Test databases and perform bug fixes.
- Develop best practices for database design and development activities.
- Quickly analyze existing T-SQL code and improve it: enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
- Take on technical leadership responsibilities for database projects across various scrum teams.
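As a minimal sketch of the extract-transform-load responsibilities described above (the table and column names here are hypothetical, and Python's stdlib sqlite3 stands in for a production warehouse such as Snowflake or Azure Synapse):

```python
# Minimal ETL sketch: clean raw records, then load them idempotently
# via an upsert. Table/column names are illustrative assumptions.
import sqlite3

def load_orders(conn, raw_rows):
    """Clean raw rows and load them idempotently via an upsert."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders ("
        " order_id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )
    cleaned = [
        (int(r["order_id"]), r["customer"].strip().lower(), float(r["amount"]))
        for r in raw_rows
        if r.get("order_id") and r.get("amount")  # drop incomplete records
    ]
    conn.executemany(
        "INSERT INTO orders (order_id, customer, amount) VALUES (?, ?, ?) "
        "ON CONFLICT(order_id) DO UPDATE SET customer=excluded.customer, "
        "amount=excluded.amount",
        cleaned,
    )
    return len(cleaned)

conn = sqlite3.connect(":memory:")
rows = [
    {"order_id": "1", "customer": " Acme ", "amount": "19.99"},
    {"order_id": "2", "customer": "Beta", "amount": "5.00"},
    {"order_id": "", "customer": "Bad", "amount": "1.00"},  # dropped
]
n = load_orders(conn, rows)
```

The upsert (`ON CONFLICT ... DO UPDATE`) makes re-runs of the pipeline safe, which is one common way to keep loads idempotent.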
Required Knowledge & Skills:
- Expert knowledge of databases such as PostgreSQL (preferably cloud hosted) and cloud-based data warehouses (e.g., Snowflake, Azure Synapse), with strong programming experience in SQL.
- Competence in data preparation and/or ETL tools such as SnapLogic, Matillion, AWS Glue, and SSIS (preferably strong working experience in one or more) to build and maintain data pipelines and flows.
- Programming experience in Golang, Python, and shell scripting (bash/zsh, grep/sed/awk, etc.).
- Deep knowledge of databases, stored procedures, and optimization of large data sets.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
- Experience with building the infrastructure required for data ingestion and analytics
- Ability to fine-tune report-generating queries.
- Solid understanding of normalization and denormalization of data, database exception handling, transactions, profiling queries, performance counters, debugging, database & query optimization techniques
- Understanding of index design and performance-tuning techniques
- Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions
- Experience understanding source data from various platforms and mapping it into Entity-Relationship (ER) models for data integration and reporting.
- Adherence to database standards, e.g., data models, data architecture, and naming conventions.
- Exposure to source control tools such as Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban)
- Preferably, experience with NoSQL databases and migrating data into other database types with real-time replication.
- Understanding of data modeling techniques and working knowledge of OLTP and OLAP systems.
- Experience with automated testing and coverage tools
- Experience with CI/CD automation tools (desirable)
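The de-duplication skill called out above is often implemented with a SQL window function. A hedged sketch (using stdlib sqlite3, version 3.25+ for window functions; the `events` table and its columns are illustrative assumptions):

```python
# De-duplication sketch: keep only the most recently loaded row per id,
# using ROW_NUMBER() partitioned by the business key.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, payload TEXT, loaded_at TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        (1, "a", "2024-01-01"),
        (1, "a-late", "2024-01-02"),  # duplicate id: keep the latest load
        (2, "b", "2024-01-01"),
    ],
)

# Delete every row that is not the newest within its id partition.
conn.execute(
    """
    DELETE FROM events
    WHERE rowid NOT IN (
        SELECT rowid FROM (
            SELECT rowid,
                   ROW_NUMBER() OVER (
                       PARTITION BY id ORDER BY loaded_at DESC
                   ) AS rn
            FROM events
        )
        WHERE rn = 1
    )
    """
)
remaining = conn.execute(
    "SELECT id, payload FROM events ORDER BY id"
).fetchall()
```

The same `ROW_NUMBER() OVER (PARTITION BY ...)` pattern translates directly to Snowflake, Redshift, SQL Server, and PostgreSQL.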
Education (UG) : Any Graduate
Key Skills : Snowflake / Redshift, SQL Server, PostgreSQL, AWS