AWS Developer
Nos of Resources required: 5 to 6
Work location: Remote (Bangalore/Mumbai)
Qualification: BTech
Experience: 4 yrs – 8 yrs
Mobilization Period in weeks: 2 weeks
Duration: 6 to 12 months
Position: PySpark Developer
Must Have Technical Skills: PySpark
Good to have Technical Skills: Spark, DWH
· Hands-on experience in PySpark, NumPy, Python lists, pandas, and graph plotting.
· Excellent knowledge of Python frameworks and of Airflow/Luigi/Dagster/Apache Beam.
· Strong Python skills across data-processing and visualization libraries.
· Good SQL knowledge.
· Ability to perform multiple tasks in a continually changing environment.
· Exposure to Hive/HBase/Redis.
· Good knowledge of Data warehousing.
· Experience reading from Parquet and other file formats.
· Worked on Hive and HBase.
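For illustration, the kind of aggregation the role involves can be sketched as follows. This is a minimal, illustrative example only (toy data, pandas standing in for a PySpark DataFrame; the PySpark equivalent is noted in a comment):

```python
# Illustrative sketch: a simple group-and-sum, the bread-and-butter
# operation for a PySpark/pandas developer. Toy data, not from the project.
import pandas as pd

sales = pd.DataFrame({
    "region": ["north", "south", "north", "south"],
    "amount": [100, 150, 200, 50],
})

# Group by region and total the amounts -- the pandas analogue of
# df.groupBy("region").agg(F.sum("amount")) in PySpark.
totals = sales.groupby("region")["amount"].sum().to_dict()
print(totals)  # {'north': 300, 'south': 200}
```

The same pattern scales from an in-memory DataFrame to a distributed PySpark job reading Parquet.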
Nos of Resources required: 3 to 4
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
Position: NiFi /Big Data Developer
Must Have Technical Skills: NiFi
Good to have Technical Skills: Python, ETL
Preferred Industry Experience: Telecom
· Extensive experience with NiFi for setting up data pipelines.
· Hands-on experience using controller services and processors to set up an ETL framework in Apache NiFi.
· Extensive experience in Python
· Good understanding of Spark, Spark Streaming, and PySpark.
· Good understanding of Big Data components.
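For context, the extract → transform → load flow that NiFi processors (e.g. GetFile → ConvertRecord → PutDatabaseRecord) wire up visually can be sketched in plain Python. This is a hypothetical, self-contained example, not project code:

```python
# Minimal ETL sketch: the extract -> transform -> load pattern a NiFi
# flow implements, written as plain Python generators on toy records.
def extract(rows):
    """Pretend source: yield raw CSV-like records, stripped of whitespace."""
    for row in rows:
        yield row.strip()

def transform(records):
    """Parse and normalise each record into a dict."""
    for rec in records:
        name, value = rec.split(",")
        yield {"name": name.lower(), "value": int(value)}

def load(records, sink):
    """Append cleaned records to an in-memory 'sink' (stand-in for a DB)."""
    sink.extend(records)
    return sink

sink = load(transform(extract(["Alpha,1 ", "BETA,2"])), [])
print(sink)  # [{'name': 'alpha', 'value': 1}, {'name': 'beta', 'value': 2}]
```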
Nos of Resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
We have two plants at which we manufacture over 500 items. Operations run in 2 shifts at both plants. Each plant can manufacture all 500 items, but the manufacturing cost differs between the two plants.
Our senior management wants a dashboard that gives visibility into:
1. What is made at which plant
2. Item-wise manufacturing costs
3. Best production plan that will ensure products are made at the cheapest price
4. Ability to change the plan and see the impact
Experts who can help, please get in touch.
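The core of requirement 3 (cheapest production plan) can be sketched as picking, per item, the cheaper of the two plants. The figures below are hypothetical and the sketch ignores real-world constraints such as plant capacity and shifts, which would need a proper linear-programming model:

```python
# Hedged sketch with made-up costs: choose the cheaper plant per item.
costs = {
    # item: (cost at plant A, cost at plant B) -- hypothetical figures
    "item1": (10.0, 12.0),
    "item2": (9.0, 7.5),
    "item3": (4.0, 4.0),
}

# Assign each item to whichever plant makes it more cheaply (ties go to A).
plan = {item: ("A" if a <= b else "B") for item, (a, b) in costs.items()}
total = sum(min(a, b) for a, b in costs.values())
print(plan)   # {'item1': 'A', 'item2': 'B', 'item3': 'A'}
print(total)  # 21.5
```

With capacity or shift limits added, this becomes an assignment/LP problem rather than a simple per-item minimum.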
We want to increase conversions among the unique visitors who come to our website each day. We capture basic prospecting data about our visitors and would like to leverage that data to increase our leads and sales. We are looking for agencies that could deliver a tangible improvement through a short-term project; subsequently, we could convert this into a long-term retainer-based contract.
Position: Hadoop Admin
Must Have Technical Skills: Hadoop Admin
Good to have Technical Skills: Linux Admin, ETL
· Extensive experience with RedHat Linux and Cloudera is mandatory.
· Experience in installing, configuring, upgrading and managing Hadoop environment.
· Responsible for deployments, and for monitoring and troubleshooting capacity and performance issues.
· Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
· End-to-end performance tuning of the clusters.
· Maintains and administers computing environments including computer hardware, systems software, applications software, and all configurations.
· Defines monitoring procedures; evaluates and diagnoses system issues and establishes work plans to resolve them.
· Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
· Knowledge of private and public cloud computing and virtualization platforms.
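As an illustration of the capacity-monitoring duty above, here is a hypothetical check (invented node names and thresholds, not tied to any real cluster) of the kind an admin would automate:

```python
# Illustrative capacity rule: flag any node whose disk usage exceeds a
# threshold. Node names, numbers, and the 80% limit are all hypothetical.
def over_threshold(usage_by_node, limit=0.80):
    """Return the sorted names of nodes whose used/total ratio exceeds `limit`."""
    return sorted(node for node, (used, total) in usage_by_node.items()
                  if used / total > limit)

nodes = {"dn1": (850, 1000), "dn2": (400, 1000), "dn3": (810, 1000)}
print(over_threshold(nodes))  # ['dn1', 'dn3']
```

In practice the usage figures would come from the cluster manager's metrics API rather than a hard-coded dict.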
Nos of Resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months