Job Details

Primary Location:

Bangalore, KA IN

Schedule:

Full-time

Job Level:

Individual Contributor

Education Level:

Bachelor's Degree (±16 years)

Job ID:

1804034

Job Description

Associate Database Administrator - Big Data, Hadoop

 
Business Unit Overview
 
EI (Enterprise Infrastructure) India started operations in 2004 to help our partners take advantage of global capability by providing operational support using the follow-the-sun model. Today the team covers multiple technologies in remote infrastructure support, spanning production, engineering, application, and database support. EI India plays a key role in global workforce augmentation and supports all the EI verticals, including 24x7 operations in some areas. The team's endeavor has been to provide increased value to our partners through operational excellence and innovation, with a focus on stability.
 
Role Description
 
EI Web and Data Services (WDS) is a team of architects, product engineers, database administrators, and support specialists serving IT groups Fidelity-wide. We support the database, BI, and ETL technologies that provide the infrastructure environment for applications. The team's charter is to provide efficient, scalable, and high-performing data lifecycle management across enterprise data platforms. Our comprehensive global services enable customers to make information-based, cost-effective decisions through our data engineering, business intelligence, and architecture services.
 
Roles and Responsibilities
  • Support the Big Data infrastructure
  • Provide on-call support
  • Deliver infrastructure changes
  • Work on service requests raised by customers
Candidate Description
  • Knowledge of HDFS and of Hadoop installation and initial configuration
  • Knowledge of Hadoop security, encryption, and data masking
  • Knowledge of Hadoop ecosystem components such as Hive, Impala, Pig, HBase, and Oozie
  • Knowledge of Spark, Hue, and Solr
  • Experience managing, scheduling, and troubleshooting jobs
  • Experience with Hadoop clients and Cloudera Manager
  • Unix shell scripting; knowledge of Python and Java is desirable
  • Experience with automation/configuration management using Chef, Ansible, or an equivalent
  • Knowledge of Git and Jenkins
  • Strong experience with any Linux distribution
  • Basic understanding of network technologies, CPU, memory, and storage
  • Database administration and core Java experience a plus
  • Agile Scrum or Kanban experience
     
Behavioral Attributes
  • Excellent communication and interpersonal skills
Education and Experience
  • B.S. in Computer Science or equivalent
  • 2 to 4 years of experience with, and detailed knowledge of, core Hadoop components and Kafka
  • Experience building solutions and dashboards running on Big Data technologies such as Hadoop and Kafka
 