Our client, a leading finance company, is looking to augment its team with candidates who have the following skills.
Job Description
Location: Delhi NCR region (onsite)
Contract period: minimum 6 months
We are available over chat to address any queries.
Visualization Expert:
· 3–5 years of experience creating user interfaces with Power BI and R Shiny
· Familiarity with basic analytics concepts such as predictive modelling, optimization, and simulation
· Knowledge of Python would be desirable
· Knowledge of Trade Promotion Optimization would be a plus
· Excellent written and verbal communication.
· Good organizational skills.
· Ability to work as part of a team.
Experience:
Job location:
Duration:
Joining:
Position: Hadoop Admin
Must Have Technical Skills: Hadoop Admin
Good to have Technical Skills: Linux Admin, ETL
· Extensive experience with Red Hat Linux and Cloudera is mandatory.
· Experience installing, configuring, upgrading, and managing Hadoop environments.
· Responsible for deployments and for monitoring capacity and performance and troubleshooting issues.
· Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
· End-to-end performance tuning of the clusters.
· Maintains and administers computing environments including computer hardware, systems software, applications software, and all configurations.
· Defines monitoring procedures; evaluates and diagnoses system issues and establishes work plans to resolve them.
· Working knowledge of the full Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
· Knowledge of private and public cloud computing and virtualization platforms.
No. of resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4–5 years
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
Position: PySpark Developer
Must Have Technical Skills: PySpark
Good to have Technical Skills: Spark, DWH
· Hands-on experience with PySpark, NumPy, Pandas, Python lists, and graph plotting.
· Excellent knowledge of Python frameworks and of Airflow, Luigi, Dagster, or Apache Beam.
· Python skills across data-processing and visualization libraries.
· Good SQL knowledge.
· Ability to handle multiple tasks in a continually changing environment.
· Exposure to Hive/HBase/Redis.
· Good knowledge of data warehousing.
· Experience reading from Parquet and other file formats.
· Hands-on work with Hive and HBase.
No. of resources required: 3 to 4
Work location: Remote
Qualification: BTech
Experience: 4–5 years
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
Web Scraping Programmers:
· As a Python developer, your role is to fetch data from multiple online sources, cleanse it, and build APIs on top of it
· Develop a deep understanding of our vast data sources on the web and know exactly how, when, and which data to scrape, parse, and store
· Work closely with Database Administrators to store data in SQL and NoSQL databases
· Develop frameworks for automating and maintaining the constant flow of data from multiple sources
· Work independently with little supervision to research and test innovative solutions
Skills and Qualifications:
· Strong coding experience in Python (knowledge of Java or JavaScript is a plus)
· Experience with SQL databases
· Experience with multi-processing, multi-threading and AWS/Azure
· Strong knowledge of scraping tools such as Python's Requests and BeautifulSoup libraries, Web-Harvest, and others
· Previous experience with web crawling is a must
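The scrape–parse–store flow described above can be sketched in a few lines. This is a minimal, self-contained illustration using only the Python standard library (`html.parser` and `sqlite3` stand in for BeautifulSoup and a production SQL database, and the HTML is inline rather than fetched with Requests); the page structure and table schema are hypothetical.

```python
import sqlite3
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects (href, text) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None   # href of the anchor currently open, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:   # only collect text inside an anchor
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

# Hypothetical page snippet; in practice this would come from an HTTP fetch.
html = '<ul><li><a href="/a">Alpha</a></li><li><a href="/b">Beta</a></li></ul>'
parser = LinkParser()
parser.feed(html)

# Store the parsed rows in SQL, as the role description calls for.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (href TEXT, text TEXT)")
conn.executemany("INSERT INTO links VALUES (?, ?)", parser.links)
rows = conn.execute("SELECT href, text FROM links ORDER BY href").fetchall()
print(rows)  # [('/a', 'Alpha'), ('/b', 'Beta')]
```

A production version would add request throttling, retries, and a framework such as Scrapy or the Requests/BeautifulSoup pairing named in the requirements.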
Experience:
Job location:
Duration:
Joining:
Data Scientist:
• 3–5 years of experience in data science building predictive models
• Knowledge of optimization and simulation
• Good proficiency in R and Python
• Knowledge of Trade Promotion Optimization would be a plus
Experience:
Job location:
Duration:
Joining: