Visualization expert:
· 3-5 years of experience creating user interfaces using Power BI and R Shiny
· Basic understanding of analytics concepts such as predictive modelling, optimization, and simulation
· Knowledge of Python is desirable
· Knowledge of Trade Promotion Optimization would be a plus
· Excellent written and verbal communication skills.
· Good organizational skills.
· Ability to work as part of a team.
Experience:
Job location:
Duration:
Joining:
We need to predict the demand for a particular category of merchandise. At present we have no reliable estimate of how much to produce.
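As a minimal illustration of the kind of demand prediction described above (not the actual solution, which would depend on the real data), a simple moving-average baseline over historical sales can give a first production estimate. The sales figures and window size below are made up for the example:

```python
def moving_average_forecast(history, window=3):
    """Forecast next-period demand as the mean of the last `window` periods.

    Assumes `history` is a non-empty list of per-period unit sales.
    """
    if len(history) < window:
        window = len(history)
    recent = history[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly unit sales for one merchandise category
monthly_sales = [120, 135, 128, 142, 150, 147]
forecast = moving_average_forecast(monthly_sales)
print(round(forecast))  # mean of the last 3 months: (142 + 150 + 147) / 3
```

In practice this baseline would be replaced by a proper time-series or regression model once seasonality and promotions are accounted for.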
SAS Data Integration Consultant:
Position: Azure Snowflake
Azure: Hands-on experience in ADF (Azure Data Factory)
No. of resources required: 1 to 2
Work location: As of now Remote (Bangalore)
Experience: 5 yrs – 6 yrs
Mobilization Period in weeks: 2 weeks
Duration: 6 to 12 months
Power BI and Reporting:
1) Proficient in building Power BI dashboards, SSAS cubes, and data warehouse solutions using Microsoft BI tools and industry best practices.
2) Understand and deliver business scenarios for improvement on the BI platform from the client's perspective.
3) Manage client interactions and expectations, considering all aspects of building a long-term, sustainable solution.
4) Able to create compelling dashboard designs, SSAS models, and data models on the database; suggest ETL data flows; and support the team in development work.
5) Manage onsite and offshore assignments.
6) Good knowledge of data warehouse modeling.
7) Report to and support management on day-to-day activities and deliveries.
8) Follow a maker-checker approach during development.
Experience:
Job location:
Duration:
Joining:
Position: Hadoop Admin
Must Have Technical Skills: Hadoop Admin
Good to have Technical Skills: Linux Admin, ETL
· Extensive experience with RedHat Linux and Cloudera is mandatory.
· Experience in installing, configuring, upgrading and managing Hadoop environment.
· Responsible for deployments and for monitoring capacity, performance, and troubleshooting issues.
· Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
· End-to-end performance tuning of the clusters.
· Maintains and administers computing environments including computer hardware, systems software, applications software, and all configurations.
· Defines monitoring procedures; evaluates, diagnoses, and establishes work plans to resolve system issues.
· Working knowledge of the entire Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
· Knowledge of private and public cloud computing and virtualization platforms.
No. of resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months