Position: NiFi / Big Data Developer
Must Have Technical Skills: NiFi
Good to have Technical Skills: Python, ETL
Preferred Industry Experience: Telecom
· Extensive experience with Apache NiFi for setting up data pipelines.
· Hands-on experience using controller services and processors to set up ETL frameworks in Apache NiFi.
· Extensive experience in Python.
· Good understanding of Spark, Spark Streaming and PySpark.
· Good understanding of Big Data components.
Number of Resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
Visualization expert:
· Resources with 3–5 years of experience creating user interfaces using Power BI and R-Shiny.
· Familiarity with basic analytics concepts such as predictive modelling, optimization and simulation.
· Knowledge of Python would be desirable
· Knowledge of Trade Promotion Optimization would be a plus.
· Excellent written and verbal communication.
· Good organizational skills.
· Ability to work as part of a team.
Experience:
Job location:
Duration:
Joining:
Looking for a visual representation of our analytics from the past 3 years to understand customer behaviour. We will provide complete access to our analytics account. Do let me know if you need any specific details to present this report.
Tableau Developers:
Tableau Developer Responsibilities:
Tableau Developer Requirements:
Experience:
Job location:
Duration:
Joining:
Position: Hadoop Admin
Must Have Technical Skills: Hadoop Admin
Good to have Technical Skills: Linux Admin, ETL
· Extensive experience with RedHat Linux and Cloudera is mandatory.
· Experience in installing, configuring, upgrading and managing Hadoop environment.
· Responsible for deployments and for monitoring capacity, performance and troubleshooting issues.
· Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
· End-to-end performance tuning of the clusters.
· Maintains and administers computing environments including computer hardware, systems software, applications software, and all configurations.
· Defines procedures for monitoring; evaluates and diagnoses system issues and establishes work plans to resolve them.
· Working knowledge of the Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark and Spark Streaming.
· Knowledge of private and public cloud computing and virtualization platforms.
Number of Resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
SAS BI Consultant: