
Job ID : 34694

Location : Chicago, IL

Company Name : codesbright

Job Type : Full-Time, Part-Time, Contract, Training

Industry : Information Technology

Salary : $25,000 - $35,000 per year

No. of Positions : Multiple (ongoing need to fill this role)

Required Skills : Proficiency in Programming Languages, Knowledge of SQL, Experience in ETL.

Benefits : Medical Insurance, Dental Insurance, Vision Insurance, 401K, Life Insurance

Job Description :


  • Design and implement a backup and disaster recovery strategy based on the Cloudera BDR utility for batch applications and Kafka MirrorMaker for real-time streaming applications.
  • Implement Hadoop security such as Kerberos, Cloudera Key Trustee Server, and Key Trustee Management Systems.
  • Use Hive join queries to join multiple tables of a source system and load the results into Elasticsearch indexes.
  • Work on Cassandra and QueryGrid. Implement shell scripts to move data from relational databases to HDFS (Hadoop Distributed File System) and vice versa.
  • Optimize and performance-tune the cluster by changing parameters based on benchmarking results.
  • Run complex queries and work with bucketing, partitioning, joins, and sub-queries.
  • Apply different HDFS file formats and structures, such as Parquet and Avro, to speed up analytics.
  • Translate, load, and exhibit unrelated data sets in various formats and from various sources, such as JSON, text files, Kafka queues, and log data.
  • Write complex workflow jobs using Oozie and set up a multi-program scheduler system that helps manage multiple Hadoop, Hive, Sqoop, and Spark jobs.
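Several of the duties above come down to one pattern: normalizing unrelated records (JSON, delimited text, log lines) into a single shape before they are loaded into HDFS, Hive, or Elasticsearch. A minimal Python sketch of that normalization step is shown below; the field names (`user_id`, `event`, `ts`) are hypothetical placeholders, not part of this role's actual schema.

```python
import csv
import io
import json

def from_json(line):
    """Parse one JSON record into the common (user_id, event, ts) tuple."""
    rec = json.loads(line)
    return (str(rec["user_id"]), rec["event"], rec["ts"])

def from_csv(text):
    """Parse delimited-text records into the same common tuple shape."""
    for row in csv.reader(io.StringIO(text)):
        yield (row[0], row[1], row[2])

# Two sources with different formats but overlapping content.
json_lines = ['{"user_id": 1, "event": "login", "ts": "2023-01-01T00:00:00"}']
csv_text = "2,logout,2023-01-01T00:05:00\n"

records = [from_json(line) for line in json_lines]
records.extend(from_csv(csv_text))
# `records` now holds uniform tuples, ready to be written out
# (for example as delimited text for an HDFS put or a Hive LOAD).
```

In a real pipeline this step would typically run inside a Spark or Hive job rather than plain Python, but the shape of the work is the same: one parser per source format, one shared output schema.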


  • A minimum of a bachelor's degree in computer science or equivalent.
  • Cloudera Hadoop, Cloudera Manager, Informatica Big Data Edition, HDFS, YARN, MapReduce, Hive, Impala, Kudu, Sqoop, Spark, Kafka, HBase, Teradata Studio Express, Teradata, Tableau, Kerberos, Active Directory, Sentry, TLS/SSL, Linux/RHEL, Unix, Windows, SBT, Maven, Jenkins, Oracle, MS SQL Server, Shell Scripting, Eclipse IDE, Git, SVN.
  • Must have strong problem-solving and analytical skills.
  • Must have the ability to identify complex problems and review related information to develop and evaluate options and implement solutions.

Key Skills:

  • Proficiency in Programming Languages. 
  • Knowledge of SQL. 
  • Experience in ETL and Data Warehousing. 
  • Experience with Big Data Technologies. 
  • Familiarity with Linux/Unix OS. 
  • Understanding of Distributed Computing. 
  • Familiarity with Cloud Platforms.



We’re an equal opportunity provider.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status. codesbright is not a consulting company, training company, or H1B sponsor.
