Hire an ideal data scientist for your project
  • 300+ Data Scientists
  • 475+ Users
Related Work
Fixed Price - Est. Time: 12 months

AWS Developer

  • Should have strong knowledge of AWS Cloud libraries
  • Proficiency in developing, deploying, and debugging cloud-based applications using AWS Services - EC2, ECS, EKS, Lambda, DynamoDB, SQS, Cognito, CloudFormation
  • Knowledge of Java, Spring Boot, Spring Integration, and Node.js
  • Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
  • Ability to use a CI/CD pipeline to deploy applications on AWS
  • Ability to write code using AWS security best practices (e.g., not embedding secret and access keys in the code, instead using IAM roles; see the sketch after this list)
  • Hands-on experience working in an agile/iterative development environment
  • Develop and deploy AWS cloud-based solutions, services, and interfaces
  • Participate in all phases of software engineering, including requirements, design, coding, and testing
  • Design and implement product features in collaboration with product managers and stakeholders
  • Design reusable components, frameworks, libraries, and microservices
  • Participate in an Agile/Scrum methodology to deliver high-quality software releases every 2 weeks
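
As a minimal sketch of the IAM-role requirement above: with boto3, the default credential chain resolves the IAM role attached to the EC2 instance, ECS task, or Lambda function at runtime, so no access keys ever appear in the code. The region, table name, and queue URL below are hypothetical placeholders.

```python
import boto3

# No access keys appear anywhere in this code: boto3's default credential
# chain picks up the IAM role attached to the EC2 instance, ECS task, or
# Lambda function at runtime.
dynamodb = boto3.resource("dynamodb", region_name="ap-south-1")
sqs = boto3.client("sqs", region_name="ap-south-1")

def record_event(event_id: str, payload: str) -> None:
    """Persist an event to DynamoDB, then notify a queue (hypothetical names)."""
    table = dynamodb.Table("events")  # hypothetical table name
    table.put_item(Item={"event_id": event_id, "payload": payload})
    sqs.send_message(
        QueueUrl="https://sqs.ap-south-1.amazonaws.com/123456789012/events",  # placeholder
        MessageBody=event_id,
    )
```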

No. of Resources required: 5 to 6

Work location: Remote (Bangalore/Mumbai)

Qualification: BTech

Experience: 4 yrs – 8 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months  

Fixed Price - Est. Time: 12 months

SAS BI Consultant:

  • Rich experience with SAS BI tools
  • SAS Web Report Studio, SAS OLAP Cubes, SAS Information Maps (IMAP), SAS Management Console, SAS Information Delivery Portal (IDP), SAS Add-In for Microsoft Office (AMO), SAS Stored Processes
  • Knowledge of Base SAS
  • Shell scripting, DML, DDL, data warehousing
  • Design reports using SAS BI
  • No. of positions: 3
  • Role & Responsibilities - Development and Testing
  • Experience - 3 to 6 years
  • Location – Delhi NCR
Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

  • Extensive experience with Red Hat Linux and Cloudera is mandatory
  • Experience in installing, configuring, upgrading, and managing Hadoop environments
  • Responsible for deployments, and for monitoring capacity, performance, and troubleshooting issues (a monitoring sketch follows this list)
  • Work closely with data scientists and data engineers to ensure the smooth operation of the platform
  • End-to-end performance tuning of the clusters
  • Maintain and administer computing environments, including computer hardware, system software, application software, and all configurations
  • Define procedures for monitoring; evaluate, diagnose, and establish work plans to resolve system issues
  • Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming
  • Knowledge of private and public cloud computing and virtualization platforms
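
As a minimal sketch of the capacity/performance monitoring duty above, the following Python snippet polls the YARN ResourceManager REST API (`/ws/v1/cluster/metrics`). The ResourceManager host is a hypothetical placeholder; the field names follow the Hadoop YARN REST API.

```python
import requests

RM_URL = "http://resourcemanager.example.com:8088"  # hypothetical RM host

def cluster_health() -> None:
    """Print basic capacity figures from the ResourceManager metrics endpoint."""
    resp = requests.get(f"{RM_URL}/ws/v1/cluster/metrics", timeout=10)
    resp.raise_for_status()
    m = resp.json()["clusterMetrics"]
    print(f"Active nodes:    {m['activeNodes']}")
    print(f"Unhealthy nodes: {m['unhealthyNodes']}")
    print(f"Running apps:    {m['appsRunning']}")
    print(f"Memory (MB):     {m['allocatedMB']} used / {m['totalMB']} total")

if __name__ == "__main__":
    cluster_health()
```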

No. of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12 months

SAS Analytics Consultant: 

  • Identifies trends, patterns, and critical business insights using SAS
  • Analytical model development, e.g., predictive modelling, logistic regression, linear regression, forecasting, time series, etc. (see the illustrative sketch after this list)
  • Manage all data for specific modules of analytical projects
  • Contribute to building the analytical construct directed towards solving the business problem
  • Work with the business team to specify analysis tasks to generate insights
  • Design and execute the analysis tasks
  • No. of positions: 2
  • Role & Responsibilities - Development and Testing
  • Experience - 3 to 6 years
  • Location – Delhi NCR
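
To illustrate the model-development bullet above: in this role the work would be done in SAS (e.g., PROC LOGISTIC), but to keep all sketches in this document in one language, here is a minimal logistic-regression example in Python with scikit-learn on synthetic data.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for business data (e.g., a churn or default flag).
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Test AUC: {auc:.3f}")
```
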
Fixed Price - Est. Time: 12 months

Position: Support Engineer

Must Have Technical Skills: Azure Data Factory

Good to have Technical Skills: Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS

Preferred Industry Experience: Manufacturing

Role:

Monitor batch pipelines in the Data Analytical Platform and provide workarounds for problems.

Upon failures, troubleshoot issues, provide workarounds, and determine root cause. (A minimal monitoring sketch follows.)
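
A minimal sketch of such monitoring, assuming ADF and the azure-identity / azure-mgmt-datafactory Python packages; the subscription ID, resource group, and factory names are hypothetical placeholders.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-data-platform"                       # hypothetical
FACTORY_NAME = "adf-batch"                                # hypothetical

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Query the last 24 hours of pipeline runs and report any failures.
now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    RESOURCE_GROUP,
    FACTORY_NAME,
    RunFilterParameters(
        last_updated_after=now - timedelta(hours=24),
        last_updated_before=now,
    ),
)
for run in runs.value:
    if run.status == "Failed":
        print(f"{run.pipeline_name} ({run.run_id}): {run.message}")
```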

Desired Qualification:

  • 0 to 3 years of data engineering (DE) or batch support experience
  • Must be willing to work entirely in night shifts
  • Ability to work on a task independently or with minimal supervision
  • Knowledge of a data orchestration tool, preferably Azure Data Factory (ADF)
  • Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS

No. of Resources required: 1 to 2

Work location: Remote

Qualification: BE, BTech, MBA, MCA; Computer Science background preferred

Experience: 0-3 Years

Mobilization Period in weeks: 1 week

Duration: 6 to 12 months