SAS Data Integration Consultant:
Position: Azure PySpark
PySpark with Data Warehousing; Azure experience will be an add-on.
· Must have low-level design and development skills; should be able to design a solution for given use cases.
· Agile delivery: must be able to demonstrate design and code on a daily basis.
· Must be an experienced PySpark developer with Scala coding skills; the primary skill is PySpark.
· Must have experience in designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling.
· Good experience with unit testing, integration testing, and UAT support.
· Able to design and code reusable components and functions.
· Should be able to review design and code, and provide review comments with justification.
· Zeal to learn and adopt new tools/technologies.
· Good to have experience with DevOps and CI/CD.
No. of Resources required: 1 to 2
Work location: Bangalore
Experience: 8 yrs – 9 yrs
Mobilization Period in weeks: 2 weeks
Data Scientist:
• Resources with 3–5 years of experience in Data Science, building predictive models
• Knowledge of Optimization and Simulation
• Good proficiency in R and Python
• Knowledge of Trade Promotion Optimization would be a plus
Experience:
Job location:
Duration:
Joining:
Position: Azure Snowflake
Azure: Hands-on experience in ADF (Azure Data Factory)
No. of Resources required: 1 to 2
Work location: Remote as of now (Bangalore)
Experience: 5 yrs – 6 yrs
Mobilization Period in weeks: 2 weeks
Duration: 6 to 12 months