We are seeking talented engineers and visionaries who are passionate about software and have the potential to build cutting-edge SaaS solutions on emerging technology platforms. Candidates should have strong programming experience in Java, including 4+ years building analytics applications on Big Data infrastructure (especially Hadoop, NoSQL and Spark). You are a hands-on expert with MapReduce jobs in Java and Spark streams from HDFS, Flume and Kafka. You love scaling distributed applications, making architectural trade-offs between synchronous and asynchronous design patterns, writing code, and delivering with top-notch quality.
Job Responsibilities:-
• Creating complex, enterprise-transforming data analytics on the big data platform using the latest tools and techniques of the trade.
• Hands-on coding in highly collaborative teams, building quality code and deliverables.
• Understanding the business domain deeply and working closely with business stakeholders to develop different analytics data tiers on the platform.
Desired Characteristics:-
• Proven expertise in designing and programming large-scale, data-driven solutions in a high-level programming language such as Java or Scala (at least 4 years).
• Strong inclination towards mathematical modelling of real-world problems and their solutions.
• Hands-on experience and expert-level knowledge of the advanced data management technologies offered by HDP (e.g. HDFS, Hive, Tez, YARN, Pig) and related optimization techniques are a must.
• Experience with distributed stream-processing engines and data stores (Storm, Spark, Cassandra, MongoDB).
• Familiarity with data mining and reporting tools (R, Weka, Tableau, TIBCO Spotfire).
• Strong problem-solving, communication and teamwork skills.
• Experience building SOAP and RESTful web services.
• Exposure to version control systems such as Git and code-hosting platforms such as Bitbucket.
• Hands-on experience with NoSQL and distributed databases for unstructured data, such as MongoDB, Cassandra, HBase and CouchDB.
• Working experience with either of the major Big Data platforms, Cloudera or Hortonworks (the latter preferred).
• Experience with, or an interest in, Agile methodologies such as Extreme Programming (XP) and Scrum.
• Knowledge of software engineering best practices such as Test-Driven Development (TDD) and Continuous Integration (CI).
• Self-driven, results-oriented and highly motivated to work with cross-cultural teams.
Other Important Requisites:-
• Minimum Big Data Ecosystem experience of 3 years.
• Minimum Java experience of 4 years, with at least 3 years in the BFSI domain.
• A technology degree is preferred. Experience in the Banking, Cards, Payments or Credit Risk domain is a nice-to-have.
Shatkone Human Capital welcomes candidates from any graduate background.