Position: Hadoop Admin
Must Have Technical Skills: Hadoop Admin
Good to have Technical Skills: Linux Admin, ETL
· Extensive experience with Red Hat Linux and Cloudera is mandatory.
· Experience in installing, configuring, upgrading and managing Hadoop environments.
· Responsible for deployments, and for monitoring capacity, performance and troubleshooting issues.
· Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
· End-to-end performance tuning of the clusters.
· Maintain and administer computing environments, including computer hardware, systems software, applications software and all configurations.
· Define procedures for monitoring; evaluate, diagnose and establish work plans to resolve system issues.
· Working knowledge of the entire Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark and Spark Streaming.
· Knowledge of private and public cloud computing and virtualization platforms.
Nos of Resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
Background & Objective:
Our client is looking to create online education programs in the hospitality space and wants a comprehensive ed-tech platform. The required solution has three parts:
1. Web Development
2. Mobile App Development
3. Learning Management System (LMS) Development
Requirements:
We invite bids from prospective partners who have the capability to develop the solution comprising the above three modules. Prospective bidders are required to include the following in their bids:
1. Functional Capabilities
2. Technology Stack: We'll need details of the technology to be used for each of the three modules. For the mobile and web apps, partners may suggest their own recommendations; however, for the LMS the preference is for Moodle.
3. Timelines & Project Schedule
4. Support for Delivery
5. Costs: One Time and Post Production Support.
Please get in touch through chat should you need any more details.
We have two plants at which we manufacture over 500 items. Operations run in 2 shifts at both plants. Both plants are capable of manufacturing all 500 items, but the cost to manufacture differs between the plants.
Our senior management wants a dashboard that gives visibility into:
1. What is made at which plant
2. Item-wise costs to manufacture
3. The best production plan, ensuring products are made at the cheapest price
4. Ability to change the plan and see the impact
Experts who can help, please get in touch.
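The "best production plan" requirement above amounts to a cost-minimisation problem. Since no capacity or shift constraints are stated, a minimal sketch (hypothetical item names and per-unit costs, assuming each item simply goes to whichever plant makes it cheaper) could look like this:

```python
# Minimal cost-minimising production plan sketch: with no capacity limits
# given, each item is assigned to the plant with the lower per-unit cost.
# Item names and costs below are hypothetical placeholders.
per_unit_cost = {
    "item_a": {"plant_1": 12.0, "plant_2": 10.5},
    "item_b": {"plant_1": 7.0, "plant_2": 8.2},
    "item_c": {"plant_1": 15.0, "plant_2": 15.0},
}

def cheapest_plan(costs):
    """Return {item: (plant, cost)}, choosing the lowest-cost plant per item."""
    return {
        item: min(plants.items(), key=lambda kv: kv[1])
        for item, plants in costs.items()
    }

plan = cheapest_plan(per_unit_cost)
total = sum(cost for _, cost in plan.values())
```

If shift capacities or volume limits per plant apply, this turns into a linear program rather than a per-item greedy choice; the dashboard's "change the plan" feature would then re-solve with the edited constraints.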
Position: Azure Snowflake
Azure: Hands-on experience in ADF (Azure Data Factory)
Nos of Resources required: 1 to 2
Work location: As of now Remote (Bangalore)
Experience: 5 yrs – 6 yrs
Mobilization Period in weeks: 2 weeks
Duration: 6 to 12 months
Position: NiFi /Big Data Developer
Must Have Technical Skills: NiFi
Good to have Technical Skills: Python, ETL
Preferred Industry Experience: Telecom
· Extensive experience with NiFi for setting up data pipelines.
· Hands-on experience in using controllers and processors to set up an ETL framework in Apache NiFi.
· Extensive experience in Python.
· Good understanding of Spark, Spark Streaming and PySpark.
· Good understanding of Big Data components
Nos of Resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months