Big Data Developer
What makes Cognizant a unique place to work? The combination of rapid growth and an international and innovative environment! This is creating a lot of opportunities for people like YOU - people with an entrepreneurial spirit who want to make a difference in this world.
At Cognizant, together with your colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies and help them become more flexible, more innovative and more successful. And this is your chance to be part of the success story: we are looking for a Big Data Developer to join our team.

Role & Responsibilities
Understand requirements, build code, support testing and fix defects, and guide developers in the course of development activities in order to deliver stable, high-quality code within Cognizant's and clients' processes, standards and guidelines.
Project Planning and Set-up

- Understand the project scope; identify activities/tasks, task-level estimates, schedule, dependencies and risks, and provide inputs to the Module Lead for review.
- Provide inputs on testing strategy, configuration, deployment, hardware/software requirements, etc.
- Review the plan and provide feedback on gaps, timelines and execution feasibility as required by the project.
- Participate in KT sessions conducted by the customer or other business teams and provide feedback on requirements.

Requirement Understanding and Analysis
- Analyse functional and non-functional requirements and seek clarifications for a better understanding of the requirements.
- Based on an understanding of the system, upstream and downstream, provide feedback and inputs on gaps in the requirements and on their technical feasibility.

Design
- Prepare the LLD/detailed design documents based on the HLD and the briefing from the Module Lead.
- Seek inputs from the developers on specific modules as applicable.
- Consolidate all modules and provide them to the Module Lead/Architects/Designers for review.
- Suggest design changes on technical grounds.
- Develop a component inventory for the code to be developed, tying it to the non-functional requirements.
- Perform sampling of data to understand its character and quality (project dependent, in the absence of a data analyst or designer).
- Identify the tools and technologies to be used in the project, as well as reusable objects that could be customized for it.

Experience and qualifications
- 7+ years of experience with Big Data components and a strong understanding of the core components: Hadoop, Hive, Kafka and the Spark framework. Good experience working with the Hadoop ecosystem, data lakes and Unix.
- Good hands-on experience building real-time streaming data pipelines using Confluent Kafka, including experience with the latest version of Confluent Kafka.
- Strong experience working with Kafka Connect, KStreams and KSQL.
- Senior Developer: bachelor's degree in science, engineering or equivalent.

What you can expect
- Become part of a 'flagship' success story - we are going through enormous growth!
- Organization driven by technology - We have a tremendous technology backbone
- Open, 'can do' team spirit
- An environment where you can make your own ideas a reality
- Drive your own career
- Market-conform benefits