Related Work
Fixed Price - Est. Time: 6 months,

Position: NiFi /Big Data Developer 

Must Have Technical Skills: NiFi

Good to have Technical Skills: Python, ETL

Preferred Industry Experience: Telecom

·         Extensive experience with NiFi for setting up data pipelines

·         Hands-on experience using controllers and processors to set up an ETL framework in Apache NiFi

·         Extensive experience in Python

·         Good understanding of Spark, Spark Streaming & PySpark

·         Good understanding of Big Data components

No. of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12 months,

AWS Developer

  • Should have strong knowledge of AWS Cloud libraries
  • Proficiency in developing, deploying, and debugging cloud-based applications using AWS Services - EC2, ECS, EKS, Lambda, DynamoDB, SQS, Cognito, CloudFormation
  • Knowledge of Java, Spring Boot, Spring Integration, Node.js
  • Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
  • Ability to use a CI/CD pipeline to deploy applications on AWS
  • Ability to write code using AWS security best practices (e.g., not using secret and access keys in the code, instead using IAM roles)
  • Hands-on experience working in an agile/iterative development environment
  • Develop and deploy AWS cloud-based solutions, services, and interfaces
  • Participate in all phases of software engineering, including requirements, design, coding, and testing
  • Design and implement product features in collaboration with product managers and stakeholders
  • Design reusable components, frameworks, libraries, and microservices
  • Participate in an Agile/Scrum methodology to deliver high-quality software releases every 2 weeks
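
The security best practice called out above (no secret or access keys in code; rely on IAM roles) can be sketched as follows. This is a minimal illustration using only the standard library; in practice an AWS SDK such as boto3 resolves the same credential chain automatically, and the function name here is an illustrative assumption, not a real API.

```python
import os

# Illustrative sketch: never embed AWS access keys in source code.
# On EC2/ECS/Lambda, an attached IAM role supplies credentials
# automatically (via instance metadata or environment variables),
# so application code should only ever read them from the environment.

def resolve_credentials(env=os.environ):
    """Return explicit credentials from the environment, or None to
    signal that the SDK should fall back to the IAM-role chain."""
    key = env.get("AWS_ACCESS_KEY_ID")
    secret = env.get("AWS_SECRET_ACCESS_KEY")
    if key and secret:
        return {"access_key": key, "secret_key": secret}
    # No explicit keys found: let the SDK use the attached IAM role.
    return None

# With nothing in the environment, the function signals "use the
# IAM role" rather than failing or falling back to hard-coded keys.
creds = resolve_credentials(env={})
```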

No. of Resources required: 5 to 6

Work location: Remote (Bangalore/Mumbai)

Qualification: BTech

Experience: 4 yrs – 8 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months  

Fixed Price - Est. Time: 12 months,

SAS Data Quality Consultant:

  • The Data Quality developer will be responsible for analyzing and understanding data sources and end-user requirements using SAS DQ.
  • Must be aware of Data Quality dimensions and their implementation in the SAS DQ tool.
  • No. of Positions: 2
  • Role & Responsibilities - Dev and Testing Skills
  • Experience - 3-6 years
  • Location – Delhi NCR
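
The data quality dimensions mentioned above (completeness, uniqueness, validity, and so on) are implemented in SAS DQ through its own transforms; as a language-neutral illustration of what such checks compute, here is a small Python sketch. The sample records, field names, and regex rule are hypothetical.

```python
# Illustrative checks for three common data-quality dimensions:
# completeness (non-missing values), uniqueness (no duplicate keys),
# and validity (values conform to an expected pattern).
# Sample data and rules are hypothetical, not from any real source.
import re

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "not-an-email"},
]

def completeness(rows, field):
    """Share of rows where the field is present and non-null."""
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def uniqueness(rows, field):
    """Share of distinct values among all values of the field."""
    values = [r[field] for r in rows]
    return len(set(values)) / len(values)

def validity(rows, field, pattern):
    """Share of non-null values fully matching a regex pattern."""
    values = [r[field] for r in rows if r.get(field) is not None]
    return sum(1 for v in values if re.fullmatch(pattern, v)) / len(values)

print(completeness(records, "email"))                      # 2 of 3 present
print(uniqueness(records, "id"))                           # ids 1, 2, 2
print(validity(records, "email", r"[^@]+@[^@]+\.[^@]+"))   # 1 of 2 valid
```
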
Fixed Price - Est. Time: 12 months,

SAS Data Integration Consultant:

  • Graduate or postgraduate in Engineering, or a postgraduate in a non-engineering discipline
  • Certification in Base SAS/ Advanced SAS will be an added advantage
  • SAS Base
  • SAS DI
  • Good working knowledge of Base and Advanced SAS
  • No. of Positions: 3
  • Role & Responsibilities - Dev and Testing Skills
  • Experience - 3-6 years
  • Location – Delhi NCR
Fixed Price - Est. Time: 3 months,

Power BI and Reporting:

1) Proficient in building Power BI dashboards, SSAS cubes, and data warehouse solutions using Microsoft BI tools and industry best practices

2) Understand and deliver business scenarios for improvement on the BI platform from the client's perspective

3) Manage client interaction and expectations, considering all aspects of building a long-term, sustainable solution

4) Create dashboard designs, SSAS models, and database data models; suggest ETL data flows; and support the team in development

5) Manage onsite and offshore assignments

6) Good knowledge of data warehouse modeling

7) Report to and support management on day-to-day activities and deliveries

8) Follow a maker-checker approach in development
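
The data warehouse modeling knowledge called out above typically centers on dimensional (star-schema) design: fact tables joined to dimension tables and aggregated the way an SSAS cube or Power BI model would. A minimal Python illustration with hypothetical sales data:

```python
# Minimal star-schema illustration: a fact table keyed to a product
# dimension, aggregated by a dimension attribute (category).
# All tables and figures are hypothetical sample data.
dim_product = {1: {"name": "Widget", "category": "Hardware"},
               2: {"name": "Gadget", "category": "Hardware"},
               3: {"name": "Suite",  "category": "Software"}}

fact_sales = [  # one row per sale: product_key plus a measure
    {"product_key": 1, "amount": 100.0},
    {"product_key": 2, "amount": 250.0},
    {"product_key": 3, "amount": 400.0},
    {"product_key": 1, "amount": 50.0},
]

def sales_by_category(facts, dim):
    """Join facts to the product dimension and sum amounts by category."""
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))
# {'Hardware': 400.0, 'Software': 400.0}
```
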

Experience:

  • 2 to 3 years

Job location:

  • Bangalore/Chennai/Hyderabad (WFH at the moment)

Duration:

  • 3 months. May get extended.

Joining:

  • Immediate
Fixed Price - Est. Time: 12 months,

Position: Azure PySpark Developer

Primary skill is PySpark; data warehousing and Azure experience will be an added advantage.

·         Must have low-level design and development skills; should be able to design a solution for a given use case

·         Agile delivery - must be able to show design and code on a daily basis

·         Must be an experienced PySpark developer with Scala coding skills

·         Must have experience designing job orchestration, sequencing, metadata, audit trails, dynamic parameter passing, and error/exception handling

·         Good experience with unit testing, integration testing, and UAT support

·         Able to design and code reusable components and functions

·         Should be able to review design and code, and provide review comments with justification

·         Zeal to learn and adopt new tools/technologies

·         Good to have: experience with DevOps and CI/CD
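
The orchestration requirements above (metadata-driven sequencing, audit trail, dynamic parameter passing, error/exception handling) can be sketched as a small runner. In a real Azure PySpark project each step would submit a Spark job; plain Python callables stand in here so the pattern is self-contained, and all step names and parameters are illustrative assumptions.

```python
# Sketch of metadata-driven job orchestration with an audit trail,
# dynamic parameter passing, and error/exception handling.
# Step functions stand in for what would be Spark job submissions.
from datetime import datetime, timezone

def extract(params):
    return f"extracted {params['source']}"

def transform(params):
    if params.get("fail"):
        raise ValueError("bad input partition")
    return "transformed"

# Metadata: ordered steps with per-step parameters (in practice this
# would live in a config table or file, not in code).
pipeline = [
    {"name": "extract", "fn": extract, "params": {"source": "sales_raw"}},
    {"name": "transform", "fn": transform, "params": {}},
]

def run(pipeline):
    """Run steps in sequence, recording one audit row per step.
    A failing step is recorded in the audit trail and stops the run."""
    audit = []
    for step in pipeline:
        row = {"step": step["name"],
               "started": datetime.now(timezone.utc).isoformat()}
        try:
            row["result"] = step["fn"](step["params"])
            row["status"] = "SUCCESS"
        except Exception as exc:  # audit the failure, then stop the run
            row["status"], row["error"] = "FAILED", str(exc)
            audit.append(row)
            break
        audit.append(row)
    return audit

audit = run(pipeline)
print([(r["step"], r["status"]) for r in audit])
```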

No. of Resources required: 1 to 2

Work location: Bangalore

Experience: 8 yrs – 9 yrs

Mobilization Period in weeks: 2 weeks