Solution Architect - BigData

Company: Synechron Inc.
Location: Charlotte, North Carolina, United States
Type: Full-time
Posted: 11.OCT.2018




Synechron is one of the fastest-growing digital, business consulting & technology firms in the world. Specializing in financial services, its focus on embracing the most cutting-edge innovations, combined with expert knowledge and technical expertise, has allowed Synechron to reach $500+ million in annual revenue, 8,000 employees and 18 offices worldwide. Synechron is agile enough to invest R&D into the latest technologies to help financial services firms stand at the cutting edge of innovation, yet large enough to scale any global project.

Synechron draws on over 15 years of financial services IT consulting experience to provide systems integration expertise and technical development work in highly complex areas within financial services. This includes: Enterprise Architecture & Strategy, Application Development & Maintenance, Quality Assurance, Infrastructure Management, Data & Analytics and Cloud Computing. Synechron is one of the world's leading systems integrators for specialist technology solutions, including Murex, Calypso and Pega, and also provides traditional offshoring capabilities with offshore development centers located in Pune, Bangalore, Hyderabad and Chennai, as well as near-shoring capabilities for European banks with development centers in Serbia. Synechron's technology team works with traditional technologies and platforms like Java, C++ and Python, as well as the most cutting-edge technologies from blockchain to artificial intelligence.

Synechron is seeking a Solution Architect - Big Data within the Data Management and Strategy space.

We are looking for a Solutions Architect who has a passion for data and the new technology patterns that support business insight and analytics. We need someone with experience implementing data solutions across a wide variety of tools and technologies. As a Solutions Architect you will be responsible for partnering with leadership to understand and interpret solution requirements. This role requires someone excited to advance a team through the transition from traditional data solutions to emerging data patterns.

You will ensure that our delivery of Big Data solutions is first class, satisfying the demanding client base whose engagements we win. This will include leading engagements, team building and overseeing deliveries.

What you'll do:

  • Work with the business to understand requirements and use cases
  • Create technical and business solutions architectures (logical and physical)
  • Resolve questions during design and implementation of architecture
  • Evaluate tools for use case fit, perform vendor/tool comparisons and present recommendations
  • Contribute to the capability roadmaps for data platforms
  • Review schemas, data models and data architecture for Hadoop environments
  • Prototype solutions for specific use cases
  • Advise peers and business partners on fit-for-use and technical complexities
  • Partner with other technical leaders for solution alignment with strategy and standards.

What you'll bring:

  • Proven leadership, including leadership within the Financial Services domain
  • Gravitas, with the comfort and capability to present and pitch to senior technology managers at major financial institutions
  • Appreciation of the business domains in Financial Services that use Big Data, including Risk, Regulation, Finance, Compliance and Fraud
  • Team building: hiring the best of the available talent pool and growing talent from within
  • Ability to pitch and manage multiple projects
  • A consulting background is highly beneficial

Required Skills:

  • Hands-on experience (at least 2 years) with Hadoop, Hive, Sqoop, Splunk, Storm, Spark, Kafka and HBase
  • Experience with end-to-end solution architecture for data capabilities, including ETL staging and load for legacy systems
  • Experience with Test Driven Code Development and SCM tools
  • Fluent understanding of best practices for building Data Lake and analytical architectures on Hadoop
  • Strong scripting / programming background (Unix, Python preferred)
  • Strong SQL experience with the ability to develop, tune and debug complex SQL applications
  • Expertise in schema design, developing data models and proven ability to work with complex data is required
  • Experience in real time and batch data ingestion
  • Proven experience working in large environments such as RDBMS, EDW and NoSQL
  • Understanding of security, encryption and masking using technologies such as Kerberos, MapR tickets, Vormetric and Voltage
  • Shared HDFS storage for all Data Marts
  • S3 for object data stores where applicable, especially for supporting OLTP or CRUD application capabilities
  • Apache Parquet or ORC for columnar data stores
  • Apache Spark for data processing - Scala, Python or Java
  • Apache Spark for data access through Spark SQL, Data Frames
  • Dedicated Spark clusters for each Data Mart. The capacity of the cluster is sized based on the usage of the Data Mart.
  • Oracle or SQL Server for legacy Data Marts
  • Metadata management and data governance within the Big Data / NoSQL domain
  • HTML5 for UI, OBIEE for standard reporting, Tableau for self-service reporting
  • Education: Bachelor's degree in Computer Science or a related field
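To illustrate the kind of analytical SQL the role calls for, here is a minimal, self-contained sketch. It uses Python's built-in sqlite3 purely as a stand-in for Hive or Spark SQL, and the table and column names (trades, desk, notional) are hypothetical examples, not anything from this posting:

```python
import sqlite3

# Illustrative only: production work would run on Hive or Spark SQL;
# sqlite3 is used here so the sketch is self-contained and runnable.
# The schema below is a hypothetical data-mart-style trades table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (desk TEXT, trade_date TEXT, notional REAL);
    INSERT INTO trades VALUES
        ('rates',  '2018-10-01', 120.0),
        ('rates',  '2018-10-02',  80.0),
        ('credit', '2018-10-01', 200.0),
        ('credit', '2018-10-02',  50.0);
""")

# Running total of notional per desk -- the kind of aggregate common
# in risk and finance reporting. A correlated subquery is used here
# for maximum portability.
rows = conn.execute("""
    SELECT t.desk, t.trade_date, t.notional,
           (SELECT SUM(s.notional) FROM trades s
            WHERE s.desk = t.desk AND s.trade_date <= t.trade_date
           ) AS running_notional
    FROM trades t
    ORDER BY t.desk, t.trade_date
""").fetchall()

for row in rows:
    print(row)
```

On a Hadoop stack the same running total would typically be expressed as a window function, e.g. `SUM(notional) OVER (PARTITION BY desk ORDER BY trade_date)`, in Hive or Spark SQL.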


Apply Now

