Our client, a leading finance company, is looking to augment their team with candidates who have the following skills.
Job Description
Location: Delhi NCR region. (Onsite)
Contract period: Min 6 months
We are available over chat to address any queries.
Position: PySpark Developer
Must Have Technical Skills: PySpark
Good to have Technical Skills: Spark, DWH
· Hands-on experience with PySpark, NumPy, lists, pandas, and graph plotting.
· Excellent knowledge of Python frameworks and Airflow/Luigi/Dagster/Apache Beam.
· Python skills across data libraries and visualization libraries.
· Good SQL knowledge.
· Ability to perform multiple tasks in a continually changing environment.
· Exposure to Hive/HBase/Redis.
· Good knowledge of Data warehousing.
· Reading from Parquet and other formats.
· Experience working with Hive and HBase.
No. of resources required: 3 to 4
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
We need to predict the demand for a particular category of merchandise; at present we do not have a reliable estimate of how much to produce.
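As a rough, hedged illustration of the forecasting ask above, the simplest baseline is a trailing moving average over past sales. Everything here is hypothetical: the function name, the window size, and the monthly figures are illustrative, not client data.

```python
# Minimal sketch: forecast next period's demand as the mean of the
# last `window` periods. Sample data is made up, not real sales figures.
from statistics import mean

def moving_average_forecast(history, window=3):
    """Return the trailing moving average of the last `window` values."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return mean(history[-window:])

# Hypothetical monthly unit sales for one merchandise category.
monthly_units = [120, 135, 128, 150, 142, 160]
print(moving_average_forecast(monthly_units))  # mean of the last 3 months
```

In practice a candidate would replace this baseline with a model that accounts for trend and seasonality, but a moving average is the usual first comparison point.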
Power BI and Reporting:
1) Proficient in building Power BI dashboards, SSAS cubes, and data warehouse solutions using Microsoft BI tools and industry best practices.
2) Understand and deliver business scenarios for improvement on the BI platform from the client's perspective.
3) Manage client interactions and expectations, considering all aspects of building a long-term sustainable solution.
4) Able to create dashboard designs, SSAS models, and data models on the database, suggest ETL data flows, and support the team in development.
5) Able to manage onsite and offshore assignments.
6) Good knowledge of data warehouse modeling
7) Report to and support management on day-to-day activities and deliveries.
8) Follow a maker-checker approach during development.
Experience:
Job location:
Duration:
Joining:
Position: Azure PySpark
PySpark is required; data warehousing and Azure experience will be an add-on.
· Must have low-level design and development skills; should be able to design a solution for given use cases.
· Agile delivery: must be able to demonstrate design and code on a daily basis.
· Must be an experienced PySpark developer with Scala coding skills; PySpark is the primary skill.
· Must have experience in designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling.
· Good experience with unit testing, integration testing, and UAT support.
· Able to design and code reusable components and functions
· Should be able to review designs and code and provide review comments with justification.
· Zeal to learn and adopt new tools/technologies.
· Good to have: experience with DevOps and CI/CD.
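The orchestration skills listed above (dynamic parameter passing, an audit trail, error/exception handling) can be sketched in plain Python. This is a hypothetical illustration only: `run_job`, `load_table`, and the parameter names are invented for this example, not part of any client codebase.

```python
# Hypothetical job-runner sketch: dynamic parameters are passed as
# keyword arguments, log lines serve as a simple audit trail, and
# failures are logged and re-raised for upstream handling.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("job_runner")

def run_job(job_fn, **params):
    """Run a job with dynamically supplied parameters, logging start/finish/failure."""
    log.info("starting %s with params=%s", job_fn.__name__, params)
    try:
        result = job_fn(**params)
        log.info("finished %s", job_fn.__name__)
        return result
    except Exception:
        log.exception("job %s failed", job_fn.__name__)
        raise

def load_table(source, limit=10):
    # Placeholder job body; a real job would read from Hive/Parquet here.
    return [f"{source}-row-{i}" for i in range(limit)]

rows = run_job(load_table, source="sales", limit=3)
print(rows)
```

In a real PySpark deployment these responsibilities would typically sit in an orchestrator such as Airflow, with job metadata and parameters stored externally rather than hard-coded.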
No. of resources required: 1 to 2
Work location: Bangalore
Experience: 8 yrs – 9 yrs
Mobilization Period in weeks: 2 weeks
SAS Data Quality Consultant: