Hire an ideal data scientist for your project
For ubbjk
Fixed Price - Est. Time: 12

We need a proficient PySpark resource with good experience.

Fixed Price - Est. Time: 6 months

Position: NiFi / Big Data Developer

Must Have Technical Skills: NiFi

Good to have Technical Skills: Python, ETL

Preferred Industry Experience: Telecom

· Extensive experience with NiFi for setting up data pipelines.

· Hands-on experience using controllers and processors to set up an ETL framework in Apache NiFi.

· Extensive experience in Python.

· Good understanding of Spark, Spark Streaming, and PySpark.

· Good understanding of Big Data components.

Number of resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4–5 years

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 1 month

Please see the attached SOW. We are seeking talented CRO (conversion rate optimization) freelancers who can help us improve conversion rates for new product sales. We'd like to try out the services on a project basis and then scale up to an ongoing assignment on a retainer basis.

Only qualified freelancers should apply.

Fixed Price - Est. Time: 1 month

We have two plants at which we manufacture over 500 items. Operations at both plants run in two shifts. Both plants can manufacture all 500 items, but the manufacturing cost differs between the plants.

Our senior management wants a dashboard that gives visibility into:

1. What is made at which plant

2. Item-wise manufacturing costs

3. The best production plan, ensuring products are made at the lowest cost

4. Ability to change the plan and see the impact on costs
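At its simplest, the "best production plan" in point 3 is a per-item cost comparison across the two plants. A minimal sketch in Python — the item names, plant names, and costs below are made-up placeholders, not real data:

```python
def best_plan(costs):
    """costs: {item: {plant: unit_cost}} -> {item: (cheapest_plant, unit_cost)}"""
    # For each item, pick the plant whose unit cost is lowest.
    return {item: min(plant_costs.items(), key=lambda kv: kv[1])
            for item, plant_costs in costs.items()}

# Illustrative cost table (placeholder values).
costs = {
    "item-001": {"Plant A": 12.50, "Plant B": 11.80},
    "item-002": {"Plant A": 7.20, "Plant B": 7.90},
}
plan = best_plan(costs)
# plan["item-001"] -> ("Plant B", 11.8)
# plan["item-002"] -> ("Plant A", 7.2)
```

A real plan would also have to respect shift capacity and demand volumes at each plant; that version is a small linear program rather than a per-item minimum.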

Experts who can help, please get in touch.

Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

· Extensive experience with Red Hat Linux and Cloudera is mandatory.

· Experience installing, configuring, upgrading, and managing Hadoop environments.

· Responsible for deployments, capacity and performance monitoring, and troubleshooting.

· Works closely with data scientists and data engineers to ensure smooth operation of the platform.

· End-to-end performance tuning of the clusters.

· Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations.

· Defines monitoring procedures; evaluates and diagnoses system issues and establishes work plans to resolve them.

· Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.

· Knowledge of private and public cloud computing and virtualization platforms.

Number of resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4–5 years

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months