Hadoop Developer
Job ID : 34694
Job Title : Hadoop Developer
Location : Chicago, IL
Company Name : codesbright
Job Type : Full-Time, Part-Time, Contract, Training
Industry : Information Technology
Salary : $25,000 - $35,000 per year
Work Authorization : OPT, CPT, Entry Level, F1, H4, L1, H1 Visa, TN Permit Holder, Green Card Holder, Canadian Citizen, All EAD, US Citizen
No. of Positions : Ongoing need to fill this role
Posted on : 09-14-2023
Required Skills : Proficiency in Programming Languages, Knowledge of SQL, Experience in ETL
Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance
Job Description :
Responsibilities:
- Design and implement a backup and disaster recovery strategy based on the Cloudera BDR utility for batch applications and Kafka MirrorMaker for real-time streaming applications.
- Implement Hadoop security such as Kerberos, Cloudera Key Trustee Server, and Key Trustee KMS.
- Use Hive join queries to join multiple tables of a source system and load the results into Elasticsearch.
- Work on Cassandra and QueryGrid. Implement shell scripts to move data from relational databases to HDFS (Hadoop Distributed File System) and vice versa.
- Optimize and tune cluster performance by adjusting parameters based on benchmarking results.
- Run complex queries and work with bucketing, partitioning, joins, and sub-queries.
- Apply different HDFS formats and structures such as Parquet and Avro to speed up analytics.
- Transform, load, and present disparate data sets from various formats and sources such as JSON, text files, Kafka queues, and log data.
- Write complex workflow jobs using Oozie and set up a multi-program scheduler system to manage multiple Hadoop, Hive, Sqoop, and Spark jobs.
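As a rough illustration of the Hive duties above (joining source-system tables, partitioning, and Parquet storage), a minimal HiveQL sketch might look like the following; all database, table, and column names here are hypothetical, not part of this role's actual environment:

```sql
-- Allow dynamic partition inserts (required for PARTITION (order_date) below).
SET hive.exec.dynamic.partition.mode=nonstrict;

-- Hypothetical target: a partitioned, Parquet-backed analytics table.
CREATE TABLE IF NOT EXISTS analytics.customer_orders (
  customer_id   BIGINT,
  customer_name STRING,
  order_total   DECIMAL(10,2)
)
PARTITIONED BY (order_date STRING)
STORED AS PARQUET;

-- Join two hypothetical source-system tables and load the result,
-- partitioned by order date.
INSERT OVERWRITE TABLE analytics.customer_orders PARTITION (order_date)
SELECT c.customer_id,
       c.customer_name,
       o.order_total,
       o.order_date
FROM src.customers c
JOIN src.orders o
  ON c.customer_id = o.customer_id;
```

Partitioning by `order_date` and storing as Parquet are the kinds of choices the bullets above describe for speeding up analytical scans.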
Requirement:
- A minimum of a bachelor's degree in computer science or equivalent.
- Experience with: Cloudera Hadoop, Cloudera Manager, Informatica Big Data Edition, HDFS, YARN, MapReduce, Hive, Impala, Kudu, Sqoop, Spark, Kafka, HBase, Teradata Studio Express, Teradata, Tableau, Kerberos, Active Directory, Sentry, TLS/SSL, Linux/RHEL, Unix, Windows, SBT, Maven, Jenkins, Oracle, MS SQL Server, Shell Scripting, Eclipse IDE, Git, SVN.
- Must have strong problem-solving and analytical skills.
- Must have the ability to identify complex problems and review related information to develop and evaluate options and implement solutions.
Key Skills:
- Proficiency in Programming Languages.
- Knowledge of SQL.
- Experience in ETL and Data Warehousing.
- Experience with Big Data Technologies.
- Familiarity with Linux/Unix OS.
- Understanding of Distributed Computing.
- Familiarity with Cloud Platforms.
Company Details :
Company information hidden. Please log in to view details.