Job ID : 28127
Location : Remote, OR
Company Name : Advaana Staffing
Job Type : contract
Industry : Information Technology
Salary : $50000 - $120000 per year
No. of Positions : 10+
Required Skills : Python and Shell Scripting.
Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance
Job Title : Sr. Cloud/DevOps Engineer Role - Automation (Remote)
Location : USA
Job Summary: Sr. Cloud/DevOps Engineer Role
Candidates should have 10+ years of experience, including heavy scripting experience (Python and Shell Scripting).
Responsibilities:
• Drive the infrastructure required to support the application roadmap, in close collaboration with development, product, and other teams, based on the future product vision and to increase organizational efficiency.
• Evaluate and provide feedback on team infrastructure architecture artifacts in the context of the architectural roadmap.
• Identify risks and propose mitigation strategies.
• Mentor and coach engineers to facilitate their DevOps development and provide technical leadership.
• Develop infrastructure standards and support their adoption.
Basic Qualifications: Core skills (AWS, Terraform, and Python or Java)
• 10+ years of relevant DevOps or infrastructure engineering experience
• BS or MS in Computer Science/Engineering, or equivalent technical experience
• Phenomenal communication and collaboration skills, and a strong teamwork ethic
• Enthusiasm for driving consensus and ideas across multiple teams, roles, and levels in the organization
• Extensive experience with AWS managed services, including but not limited to EC2, S3, RDS, DynamoDB, SNS, and SQS
• Quality-first mentality with a focus on test automation and CI/CD
• Track record of successful implementations on AWS or other public cloud platforms using infrastructure-as-code tools such as Terraform
• Proficiency with at least one of the following languages: Go, Java, Python
• Experience with observability platforms such as Datadog, New Relic, etc.
• Prior experience building internet-scale platforms: handling petabyte-scale data and operationalizing clusters with hundreds of compute nodes in a cloud environment.
Preferred Requirements (Optional):
• Experience operationalizing Machine Learning workflows at scale is a huge plus.
• Experience supporting REST APIs and large distributed databases is a plus.
• Experience supporting Java/Scala/Python-based applications in a Kubernetes environment is a plus.
• Proficiency with agile development methodologies, shipping features every two weeks.
It would be awesome if you have a robust portfolio on GitHub and/or open-source contributions you are proud to share.
We're an equal opportunity employer.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.
OPTnation.com is not a Consulting Company/Training Company/H1B Sponsor.