Title: Hadoop Administrator (Locals preferred)
Location: Baltimore, MD
Duration: 12 months (high chances of extension)
Interview Mode: F2F
Experience: 10 years
Responsible for building and maintaining the Cloudera distribution of Hadoop. Perform in-depth analysis of the hardware and fine-tune the environment to optimize ETL and analytics workloads to achieve project goals. Participate in knowledge transfer sessions with the NDW team to learn the environment and integrate a Hadoop platform that can consume disparate data sources and is accessible to the development team. Work closely with the development team and the project leads to implement Kerberos security on the cluster, following industry standards to secure the environment. Use Cloudera maintenance, monitoring, and configuration tools to accomplish task goals and build reports for management review.
Required Experience & Skills
- Must possess 10 years of experience in a variety of IT and business areas and 7 years of experience in a Big Data Hadoop environment. Ability to understand and construct SQL queries.
- Specialization in functional or technical areas such as, but not limited to, business process reengineering, physical and data security, Earned Value Management (EVM), Quality Assurance, Project Management, training, and Business Intelligence.
Required Skills and Experience:
- Hands-on experience installing and maintaining a Cloudera Hadoop environment.
- Experience implementing security standards such as Kerberos.
- Hadoop experience with: ZooKeeper, HDFS, YARN, Hue, Hive, Impala, Kafka, BDR.
- Cloudera Manager, Sentry, KTS, KMS, CM API, Cloudera Navigator, Beeline.
- Required Linux/systems administration experience with: Kerberos, LDAP, Active Directory, SSL, MySQL.
- Linux administration (RHEL/CentOS/Fedora), shell scripting, load balancers.
- Highly effective communication and collaboration skills.
- Solr, HBase, Talend, Tableau, Alteryx, DataRobot, Docker, Mesos, Elasticsearch, Kibana, database design.