Hire the ideal data scientist for your project
Related Work
Fixed Price - Est. Time: 12 months

We are looking to augment our team with an ML Engineer. We are a retail house and need an ML Engineer to train models for our upcoming e-commerce venture.

Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must-Have Technical Skills: Hadoop Admin

Good-to-Have Technical Skills: Linux Admin, ETL

  • Extensive experience with Red Hat Linux and Cloudera is mandatory.

  • Experience in installing, configuring, upgrading, and managing Hadoop environments.

  • Responsible for deployments and for monitoring capacity, performance, and troubleshooting issues (a hedged monitoring sketch follows this list).

  • Work closely with data scientists and data engineers to ensure the smooth operation of the platform.

  • End-to-end performance tuning of the clusters.

  • Maintain and administer computing environments, including computer hardware, systems software, application software, and all configurations.

  • Define procedures for monitoring; evaluate and diagnose system issues and establish work plans to resolve them.

  • Working knowledge of the entire Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.

  • Knowledge of private and public cloud computing and virtualization platforms.
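
To make the capacity-monitoring duty concrete, here is a minimal sketch, assuming a Python wrapper around the standard `hdfs dfsadmin -report` command; the 80% threshold and the parsed field are illustrative assumptions, not requirements from this listing.

```python
import subprocess
import sys

# Assumed alert threshold for illustration only; tune per cluster.
USAGE_THRESHOLD_PCT = 80.0

def dfs_used_percent() -> float:
    """Parse the cluster summary of `hdfs dfsadmin -report` for DFS usage."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in report.splitlines():
        # The cluster summary contains a line like "DFS Used%: 42.17%";
        # we return on the first match, before the per-datanode sections.
        if line.startswith("DFS Used%"):
            return float(line.split(":")[1].strip().rstrip("%"))
    raise RuntimeError("DFS Used% not found in dfsadmin report")

if __name__ == "__main__":
    used = dfs_used_percent()
    print(f"DFS used: {used:.2f}%")
    if used > USAGE_THRESHOLD_PCT:
        sys.exit(f"WARNING: DFS usage above {USAGE_THRESHOLD_PCT}% threshold")
```

In practice a check like this would feed an alerting system (e.g., cron plus email, or a Cloudera Manager alert) rather than print to stdout.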

No. of resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization period: 1 week

Duration: 3 to 6 months  

Sample project
Fixed Price - Est. Time: 12 months

AWS Developer

  • Should have strong knowledge of AWS cloud libraries
  • Proficiency in developing, deploying, and debugging cloud-based applications using AWS services: EC2, ECS, EKS, Lambda, DynamoDB, SQS, Cognito, and CloudFormation
  • Knowledge of Java, Spring Boot, Spring Integration, and Node.js
  • Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
  • Ability to use a CI/CD pipeline to deploy applications on AWS
  • Ability to write code using AWS security best practices (e.g., not embedding secret and access keys in the code, instead using IAM roles; see the sketch after this list)
  • Hands-on experience working in an agile/iterative development environment
  • Develop and deploy AWS cloud-based solutions, services, and interfaces
  • Participate in all phases of software engineering, including requirements, design, coding, and testing
  • Design and implement product features in collaboration with product managers and stakeholders
  • Design reusable components, frameworks, libraries, or microservices
  • Participate in an Agile/Scrum methodology to deliver high-quality software releases every two weeks
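
As a hedged illustration of the credentials bullet above, the sketch below uses boto3 (assumed here, since the listing does not name an SDK) and deliberately passes no keys, so the SDK resolves credentials from its default provider chain, including an attached IAM role. The queue name is hypothetical.

```python
import boto3

# Credentials are intentionally NOT hardcoded. boto3 resolves them through
# its default provider chain: environment variables, shared config files,
# or the IAM role attached to the EC2 instance / ECS task / Lambda function.
sqs = boto3.client("sqs", region_name="us-east-1")

# Hypothetical queue, for illustration only.
queue_url = sqs.create_queue(QueueName="demo-queue")["QueueUrl"]
sqs.send_message(QueueUrl=queue_url, MessageBody="sent with IAM-role credentials")

# The anti-pattern the listing warns against (do NOT do this):
# boto3.client("sqs", aws_access_key_id="AKIA...", aws_secret_access_key="...")
```

On EC2, ECS, or Lambda, attaching an IAM role with an SQS policy is what lets this key-free client call succeed.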

No. of resources required: 5 to 6

Work location: Remote (Bangalore/Mumbai)

Qualification: BTech

Experience: 4 yrs – 8 yrs

Mobilization period: 2 weeks

Duration: 6 to 12 months  

Fixed Price - Est. Time: 3 months

Power BI and Reporting:

1) Proficient in building Power BI dashboards, SSAS cubes, and data warehouse solutions using Microsoft BI tools and industry best practices

2) Understand and deliver business scenarios for improving the BI platform from the client's perspective

3) Manage client interactions and expectations, considering all aspects of building a long-term, sustainable solution

4) Create polished dashboard designs, SSAS models, and database data models; suggest ETL data flows; and support the team's development work

5) Manage onsite and offshore assignments

6) Good knowledge of data warehouse modeling

7) Report to and support management on day-to-day activities and deliveries

8) Follow a maker-checker approach during development

Experience:

  • 2 to 3 years

Job location:

  • Bangalore/Chennai/Hyderabad (WFH at the moment)

Duration:

  • 3 months. May get extended.

Joining:

  • Immediate
Fixed Price - Est. Time: 12 months

Position: Azure PySpark

PySpark is the primary skill; data warehousing and Azure experience are an added advantage.

  • Must have low-level design and development skills; should be able to design a solution for given use cases.

  • Agile delivery: must be able to demonstrate design and code on a daily basis.

  • Must be an experienced PySpark developer with Scala coding skills; the primary skill is PySpark.

  • Must have experience in designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling (a hedged sketch follows this list).

  • Good experience with unit testing, integration testing, and UAT support.

  • Able to design and code reusable components and functions.

  • Should be able to review designs and code, and provide review comments with justification.

  • Zeal to learn and adopt new tools and technologies.

  • Good to have: experience with DevOps and CI/CD.
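
A minimal sketch of the dynamic-parameter-passing and error/exception-handling points, assuming a PySpark batch job invoked by an orchestrator; the argument names, paths, and filter column are illustrative only, not from this listing.

```python
import argparse
import sys

from pyspark.sql import SparkSession

def main() -> int:
    # Dynamic parameter passing: runtime values arrive as CLI arguments
    # (e.g., from an orchestrator such as Airflow or ADF), not hardcoded.
    parser = argparse.ArgumentParser()
    parser.add_argument("--input-path", required=True)
    parser.add_argument("--output-path", required=True)
    parser.add_argument("--run-date", required=True)
    args = parser.parse_args()

    spark = SparkSession.builder.appName("sample-batch-job").getOrCreate()
    try:
        df = spark.read.parquet(args.input_path)
        # Illustrative transform; real logic would live in reusable functions.
        result = df.filter(df["event_date"] == args.run_date)
        result.write.mode("overwrite").parquet(args.output_path)
        return 0
    except Exception as exc:  # error/exception handling with a clear exit code
        print(f"Job failed for run_date={args.run_date}: {exc}", file=sys.stderr)
        return 1
    finally:
        spark.stop()

if __name__ == "__main__":
    sys.exit(main())
```

The nonzero exit code is what lets the orchestrator detect failure and trigger retries or alerts; audit-trail writes would slot into the same try/except structure.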

No. of resources required: 1 to 2

Work location: Bangalore

Experience: 8 yrs – 9 yrs

Mobilization period: 2 weeks