Related Work
Fixed Price - Est. Time: 12 months,

SAS Data Integration Consultant:

  • Graduate or postgraduate in Engineering, or a postgraduate in a non-engineering discipline
  • Certification in Base SAS / Advanced SAS will be an added advantage
  • SAS Base
  • SAS DI
  • Good knowledge of Base and Advanced SAS
  • No. of positions: 3
  • Role & Responsibilities: Development and testing skills
  • Experience: 3-6 years
  • Location: Delhi NCR
Fixed Price - Est. Time: 2 months,

We need to predict demand for a particular category of merchandise. At present we have no reliable estimate of how much to produce.
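
As a rough illustration of the kind of forecasting involved, here is a minimal sketch that fits a simple exponential-smoothing forecast to a year of monthly sales. It assumes historical monthly sales figures are available; the column names and numbers are hypothetical placeholders, not actual project data.

```python
import pandas as pd

# Hypothetical monthly sales history for one merchandise category
# (column names and figures are illustrative placeholders).
history = pd.DataFrame({
    "month": pd.date_range("2023-01-01", periods=12, freq="MS"),
    "units_sold": [120, 135, 128, 150, 160, 155, 170, 175, 168, 180, 190, 185],
})

# Simple exponential smoothing: each smoothed value blends the latest
# observation with the previous smoothed value, weighted by alpha.
alpha = 0.3  # smoothing factor; in practice tuned on held-out months
smoothed = history["units_sold"].ewm(alpha=alpha, adjust=False).mean()

# The last smoothed value serves as the one-step-ahead demand forecast.
print(f"Forecast demand for next month: {smoothed.iloc[-1]:.0f} units")
```

A production model would also need to account for seasonality, promotions, and stock-outs, but the basic workflow is the same.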

Fixed Price - Est. Time: 6 months,

Our client, a leading finance company, is looking to augment its team with candidates who have the following skills.

Job Description

  1. 3-5 years of experience with Cognos 10.x/11.x
  2. Good knowledge of SQL
  3. Experience creating reports (crosstab, list) in Report Studio
  4. Hands-on experience with Cognos Framework Manager
  5. Hands-on experience with Cognos administration activities
  6. Sound knowledge of data warehousing concepts
  7. The budget listed is indicative only.

Location: Delhi NCR region. (Onsite)

Contract period: Min 6 months 

We are available over chat to address any queries.

Fixed Price - Est. Time: 12,

We need a proficient PySpark resource with good experience.
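
For context, a typical task on such an engagement might look like the minimal PySpark sketch below, which reads a transactions file and aggregates revenue per category. The file path and column names (transactions.csv, category, amount) are hypothetical stand-ins, not details from this project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a local Spark session (cluster settings would differ in production).
spark = SparkSession.builder.appName("revenue_by_category").getOrCreate()

# Hypothetical input: a CSV of transactions with 'category' and 'amount' columns.
transactions = spark.read.csv("transactions.csv", header=True, inferSchema=True)

# Total revenue per category, largest first.
revenue_by_category = (
    transactions
    .groupBy("category")
    .agg(F.sum("amount").alias("revenue"))
    .orderBy(F.desc("revenue"))
)

revenue_by_category.show()
spark.stop()
```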

Fixed Price - Est. Time: 12 months,

SAS Data Quality Consultant:

  • The Data Quality developer will be responsible for analyzing and understanding data sources and end-user requirements using SAS DQ.
  • Must be aware of Data Quality dimensions and their implementation in the SAS DQ tool.
  • No. of positions: 2
  • Role & Responsibilities: Development and testing skills
  • Experience: 3-6 years
  • Location: Delhi NCR
Fixed Price - Est. Time: 6 months,

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

  • Extensive experience with RedHat Linux and Cloudera is mandatory.
  • Experience in installing, configuring, upgrading, and managing Hadoop environments.
  • Responsible for deployments and for monitoring capacity, performance, and troubleshooting issues.
  • Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
  • End-to-end performance tuning of the clusters.
  • Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations.
  • Defines procedures for monitoring; evaluates and diagnoses system issues and establishes work plans to resolve them.
  • Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
  • Knowledge of private and public cloud computing and virtualization platforms.

No. of resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4-5 years

Mobilization period: 1 week

Duration: 3 to 6 months