Hadoop Developer

Company: Synechron Inc.
Location: Charlotte, North Carolina, United States
Type: Full-time
Posted: 12.JAN.2019

Summary

Synechron is one of the fastest-growing digital, business consulting and technology firms in the world, specializing in financial services.

Description

Synechron is one of the fastest-growing digital, business consulting and technology firms in the world. Specializing in financial services, the business' focus on embracing the most cutting-edge innovations, combined with expert knowledge and technical expertise, has allowed Synechron to reach $500+ million in annual revenue, 8,000 employees and 18 offices worldwide. Synechron is agile enough to invest R&D into the latest technologies to help financial services firms stand at the cutting edge of innovation, yet also large enough to scale any global project. Learn more at http://www.synechron.com.

Synechron draws on over 15 years of financial services IT consulting experience to provide expert systems integration expertise and technical development work in highly complex areas within financial services. This includes Enterprise Architecture & Strategy, Application Development & Maintenance, Quality Assurance, Infrastructure Management, Data Analytics and Cloud Computing. Synechron is one of the world's leading systems integrators for specialist technology solutions including Murex, Calypso, Pega and others, and also provides traditional offshoring capabilities with offshore development centres located in Pune, Bangalore, Hyderabad and Chennai, as well as near-shoring capabilities for European banks with development centres in Serbia. Synechron's technology team works with traditional technologies and platforms like Java, C++, Python and others, as well as the most cutting-edge technologies, from blockchain to artificial intelligence. Learn more at http://synechron.com/technology.

Synechron, on behalf of our client, a leading financial corporation, is seeking a Big Data Hadoop Developer in the financial services space in Charlotte, NC.

Job Responsibilities
- Develop a new reference architecture and service model.
- Lead designers and other developers on the team, guiding them and helping provide the right technical solutions to the business.
- Set up the Cloudera platform and various tools (Sqoop, Hive, Impala, Spark).
- Create Hive tables and Impala metadata to access the data from HDFS (see the example sketch at the end of this posting).
- Lead Big Data projects for the client.
- Be flexible enough to coordinate between multiple onsite and offshore teams.

Skills Required
- Hadoop HDFS.
- ETL staging load for legacy systems.
- S3 as the object data store where possible (evaluate Scality or Hitachi).
- Dedicated Spark clusters for each Data Mart; the capacity of each cluster is sized based on the usage of that Data Mart.
- Oracle or SQL Server for legacy Data Marts.

If the above role interests you, please share your updated resume at nitesh.ingale@synechron.com or nikhil.karmude@synechron.com.
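
As a rough illustration of the Hive/Impala responsibility above, here is a minimal sketch of exposing files already landed in HDFS as an external Hive table and then making it visible to Impala by refreshing its metadata cache. The database, table, columns and HDFS path (staging, trades, /data/staging/trades) are hypothetical placeholders, not details taken from this posting.

-- Hypothetical staging database and external table over delimited files in HDFS (HiveQL)
CREATE DATABASE IF NOT EXISTS staging;

CREATE EXTERNAL TABLE IF NOT EXISTS staging.trades (
    trade_id    BIGINT,
    account_id  STRING,
    trade_date  DATE,
    notional    DECIMAL(18,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/staging/trades';

-- Run from impala-shell: pick up the newly created table in Impala's metadata cache
INVALIDATE METADATA staging.trades;

Because the table is declared EXTERNAL, dropping it later removes only the metadata and leaves the underlying HDFS files in place, which keeps staged data available to other consumers.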

 