The individual will join a small team of experts responsible for implementing innovative features in products and proofs of concept. You will develop novel solutions and viable products using data science concepts and modern analytics technologies.
Responsibilities:
(1) Develop subject-matter expertise in the domains we work on.
(2) Be a trusted partner to the engineering team, enabling them to use data to run analytics at scale and to integrate analytics with both product and experimental platforms.
(3) Engineer efficient, adaptable, and scalable data pipelines to process structured and unstructured data.
(4) Enable analytics by building robust data sets that can power ML techniques such as regression, classification, and clustering.
(5) Produce high-quality work and continuously improve the ways we use data.
(6) Create data models, distributed data processing patterns, and well-engineered data pipelines using big data technologies on large-scale unstructured data sets.
(7) Take vague requirements and crystallize them into scalable data solutions.
(8) Communicate results to stakeholders.
Qualifications:
(1) MS or substantial experience in a highly quantitative field.
(2) Expertise in some of the following technical domains: AI/ML, Deep Learning, Cognitive Computing, Simulation, Optimization, Security, and Blockchain.
(3) Working with research and product teams on data science initiatives.
(4) Contributing to every stage of POC and product development.
(5) Representing model output in the context of the data that drove it, and clearly articulating the business problem at hand.
(6) Performing data wrangling on unstructured/messy datasets and developing innovative methods for cleaning, transforming, and processing data.
(7) Working with and leveraging machine learning and visualization tools effectively.
(8) Developing analytical data pipelines that can be used in production.
(9) Integrating data, models, and software modules into solutions.
(10) Developing intellectual property on a regular basis.
(11) Conducting quality assurance and testing of solutions before delivery to stakeholders.
(12) Developing and managing data science experimental platforms.
SDL2017