Synechron is one of the fastest-growing digital, business consulting & technology firms in the world. Specializing in financial services, the business's focus on embracing the most cutting-edge innovations, combined with expert knowledge and technical expertise, has allowed Synechron to reach $500+ million in annual revenue, 8,000 employees, and 18 offices worldwide. Synechron is agile enough to invest R&D in the latest technologies to help financial services firms stand at the cutting edge of innovation, yet large enough to scale any global project. Learn more at:
Synechron draws on over 15 years of financial services IT consulting experience to provide systems integration expertise and technical development work in highly complex areas within financial services. This includes: Enterprise Architecture & Strategy, Application Development & Maintenance, Quality Assurance, Infrastructure Management, Data & Analytics, and Cloud Computing. Synechron is one of the world's leading systems integrators for specialist technology solutions, including Murex, Calypso, Pega, and others. It also provides traditional offshoring capabilities through offshore development centers in Pune, Bangalore, Hyderabad, and Chennai, as well as near-shoring capabilities for European banks through development centers in Serbia. Synechron's technology team works with traditional technologies and platforms such as Java, C++, and Python, as well as the most cutting-edge technologies, from blockchain to artificial intelligence. Learn more at:
Synechron Inc. is seeking a Senior Java/Spark Developer within financial services.
- The primary purpose of this role is to develop features using Java and Spark for a Product Control application.
- The requirement is for a Senior Java Developer with a proven track record of producing complex software solutions.
- This role will require extensive development using Java, Spark, and Big Data technologies.
- The candidate will be heavily involved in implementing end-to-end solutions for various regulatory projects related to operational risk, P&L, and PAA across various financial products.
- The candidate will work on critical regulatory projects with aggressive timelines, and will work extensively on redesigning the data model of the existing platform and maintaining it going forward.
- Coach offshore development teams effectively for optimum productivity.
- Understand business and functional requirements provided by Business Analysts, convert them into technical design documents, and develop solutions to meet those requirements.
- Design and build scalable infrastructure and platforms to ingest, store, and process very large amounts of data.
- Leverage industry-standard data orchestration and automation tools to create efficient and reliable ETL jobs.
- Build best in class ETL based solutions for effective data ingestion and transformation.
- Gather requirements for new functionality from business users; offer timely solutions; develop, test, and implement the proposed solutions.
- Collaborate with various data source teams on effective strategies for data ingestion.
- Work closely with the IT and platform teams to deliver tech solutions.
- Be an expert in all things data wrangling, aggregation, summarization and analysis.
- Explore existing application systems and determine areas of complexity and potential risks to successful implementation.
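The ETL and aggregation duties above can be illustrated with a minimal, hypothetical sketch. Plain `java.util.stream` is used here as a simplified stand-in for Spark's Dataset API, and the record layout (`product,amount`) and method names are assumptions for illustration only:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class EtlSketch {

    // Aggregate P&L by product from raw "product,amount" records:
    // a simplified stand-in for a Spark groupBy/sum over ingested data.
    public static Map<String, Double> aggregatePnl(List<String> rawRecords) {
        return rawRecords.stream()
                .map(line -> line.split(","))   // transform: extract fields
                .filter(f -> f.length == 2)     // cleanse: drop malformed rows
                .collect(Collectors.groupingBy(
                        f -> f[0],              // key: product
                        Collectors.summingDouble(f -> Double.parseDouble(f[1]))));
    }

    public static void main(String[] args) {
        List<String> raw = List.of("SWAP,100.5", "BOND,-20.0", "SWAP,9.5");
        System.out.println(aggregatePnl(raw)); // e.g. {BOND=-20.0, SWAP=110.0}
    }
}
```

In a real Spark job the same ingest-cleanse-aggregate shape would run distributed over partitioned data rather than in-memory collections.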
- Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline; Master's degree preferred
- 5+ years' experience in software development using Core Java
- 3+ years' experience with Hadoop, Spark and Big Data platforms, RDBMS, and NoSQL technologies
- Advanced SQL capabilities are required. Knowledge of database design techniques and experience working with extremely large data volumes is a plus
- Advanced experience in ETL and data wrangling using Java
- Experience in other technologies such as HIVE, Kafka, TIBCO EMS, GemFire, Spring, etc.
- Experience in programming in a Linux/Unix environment, including shell scripting
- Experience in Agile SDLC, JIRA, BitBucket/Git, AWS ECS, RedShift is a plus
- Ability to work in a fast-paced environment both as an individual contributor and a tech lead
- Experience in implementing successful projects
- Ability to adjust priorities quickly as circumstances dictate
- Consistently demonstrates clear and concise written and verbal communication
We'd love to see:
- Experience with large database and DW implementation using Java.
- Experience working on Big Data platforms, especially with Hadoop, Spark, Kafka, and HIVE.
- Understanding of VLDB performance aspects, such as table partitioning, sharding, table distribution, and optimization techniques.
- Programming experience, including Java, Spring, and ksh.
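The table distribution and sharding aspects mentioned above typically come down to routing each row to a shard by hashing its distribution key. A minimal sketch, with the shard count and key format assumed for illustration:

```java
public class ShardRouter {
    private final int shardCount;

    public ShardRouter(int shardCount) {
        this.shardCount = shardCount;
    }

    // Map a distribution key (e.g. a trade ID) to a shard index,
    // as a VLDB does when distributing rows across nodes.
    public int shardFor(String distributionKey) {
        // floorMod keeps the index non-negative even for negative hash codes
        return Math.floorMod(distributionKey.hashCode(), shardCount);
    }

    public static void main(String[] args) {
        ShardRouter router = new ShardRouter(8);
        System.out.println(router.shardFor("TRADE-12345")); // index in [0, 8)
    }
}
```

A well-chosen distribution key spreads rows evenly across shards, which is what makes per-shard joins and aggregations fast; a skewed key concentrates load on one node.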
- provided by Dice