Big Data Hadoop Developer
Location: Plano, TX
Interview: Phone and Skype
Client – TCS
Passport number is required
6-8 years of experience; Python scripting is a plus
• Big data background with experience designing and implementing large-scale systems
• Experience in Hadoop-related technologies is a must (HDFS, MapReduce, and HBase)
• Working experience with Hadoop, enterprise Java development, NoSQL data platforms (Cassandra), pub/sub messaging (Kafka, ActiveMQ, JMS, etc.), stream processing (Storm, HBase, NiFi, Spark Streaming, etc.), ETL processing with tools such as Ab Initio, Talend, Informatica, and Hive/SQL, and visualization with Tableau/Cognos
• Experience with NoSQL databases (MongoDB, Cassandra) and data modeling tools such as Erwin
• Familiarity with data consumption tools/databases (Hive, Impala, EsGyn, Trifacta, AtScale, etc.) that work with HDFS files
• Working experience with cloud development stacks such as OpenStack, CloudStack, vCloud, and AWS EC2 for hybrid configurations and managed cloud considerations
• Extensive experience designing and implementing horizontally scalable, highly available systems, with a focus on performance and resiliency
• Assisting with capacity planning to scale the environment
• Extensive experience profiling, debugging, and performance tuning complex distributed systems
• Willingness to commit extra effort to meet deadlines as required on a high-profile, business-critical project
• A minimum of 10 years of work experience within IT
• Big data engineering development/operational experience in a multi-platform, multi-location, 24x7x365 global environment
• Experience working with cross-geographic, cross-functional, and cross-LOB (line of business) teams
• Knowledge of best practices in change, problem, incident, configuration, and system health management (ITIL)
• 2+ years of work experience in a big data environment (Hortonworks and Cloudera)
Add me on Hangouts: snehachoudhari97
To apply for this job email your details to firstname.lastname@example.org