Hire an ideal data scientist for your project
  • 300+ Data Scientists
  • 475+ Users
Related Work
Fixed Price - Est. Time: 12 months

SAS BI Consultant:

  • Rich experience with SAS BI tools
  • SAS Web Report Studio, SAS OLAP Cube Studio, SAS Information Map Studio, SAS Management Console, SAS Information Delivery Portal, SAS Add-In for Microsoft Office (AMO), SAS Stored Processes
  • Knowledge of Base SAS
  • Shell scripting, DML, DDL, data warehousing
  • Designing reports using SAS BI
  • No. of positions: 3
  • Role & responsibilities: development and testing
  • Experience: 3–6 years
  • Location: Delhi NCR
Fixed Price - Est. Time: 12 months

Position: Azure PySpark

Core skill: PySpark; data warehousing and Azure experience will be an added advantage.

·         Must have low-level design and development skills; should be able to design a solution for given use cases.

·         Agile delivery: must be able to demonstrate design and code on a daily basis.

·         Must be an experienced PySpark developer with Scala coding skills.

·         Must have experience in designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling.

·         Good experience with unit testing, integration testing, and UAT support.

·         Able to design and code reusable components and functions.

·         Should be able to review designs and code, and provide review comments with justification.

·         Zeal to learn and adopt new tools and technologies.

·         Good to have: experience with DevOps and CI/CD.
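
The orchestration requirements above (dynamic parameter passing, an audit trail, and error/exception handling) can be sketched generically in plain Python; the step functions and file path here are hypothetical stand-ins for the client's actual PySpark jobs, not their real code:

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("job_audit")

def run_step(name, func, params, audit):
    """Run one job step with dynamic parameters, recording an audit-trail entry."""
    entry = {"step": name, "params": params, "start": time.time()}
    try:
        entry["result"] = func(**params)   # dynamic parameter passing via **kwargs
        entry["status"] = "SUCCESS"
    except Exception as exc:               # error/exception handling: record, then re-raise
        entry["status"] = "FAILED"
        entry["error"] = str(exc)
        raise
    finally:
        entry["end"] = time.time()
        audit.append(entry)                # every attempt is logged, success or failure
        log.info("step=%s status=%s", name, entry["status"])
    return entry["result"]

# hypothetical steps standing in for extract/transform jobs
def extract(path):
    return [1, 2, 3]  # pretend rows read from `path`

def transform(rows, factor):
    return [r * factor for r in rows]

audit_trail = []
rows = run_step("extract", extract, {"path": "/data/in.csv"}, audit_trail)
out = run_step("transform", transform, {"rows": rows, "factor": 10}, audit_trail)
```

In a real PySpark deployment the same pattern would wrap Spark jobs, with the audit trail written to a table rather than an in-memory list.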

No. of resources required: 1 to 2

Work location: Bangalore

Experience: 8–9 years

Mobilization period: 2 weeks

Fixed Price - Est. Time: 6 months

Our client, a leading finance company, is looking to augment their team with suitable candidates having the following skills.

Job Description

  1. 3–5 years of experience in Cognos 11.x/10.x
  2. Good knowledge of SQL
  3. Knowledge of creating reports (crosstab, list) in Report Studio
  4. Hands-on experience with Cognos Framework Manager
  5. Hands-on experience with Cognos administration activities
  6. Sound knowledge of data warehousing concepts
  7. The stated budget is indicative only

Location: Delhi NCR region (onsite)

Contract period: minimum 6 months

We are available over chat to address any queries.

Fixed Price - Est. Time: 3 months

Our client, a home appliances manufacturer, wants to understand the past effectiveness of its marketing spend and make informed decisions about future spend. We are looking for providers who can assist in establishing the impact of key marketing variables.
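
Establishing the impact of a marketing variable typically reduces to regressing an outcome (e.g. sales) on spend. A minimal single-variable ordinary-least-squares sketch, with synthetic numbers purely for illustration (not client data):

```python
# Single-variable OLS: slope = cov(x, y) / var(x), intercept from the means.
def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))
    alpha = my - beta * mx
    return alpha, beta

# Synthetic example where sales = 50 + 2 * spend exactly, so the fit is exact.
spend = [10, 20, 30, 40]
sales = [70, 90, 110, 130]
alpha, beta = ols(spend, sales)  # → alpha = 50.0, beta = 2.0
```

A real marketing-mix analysis would extend this to multiple variables (media channels, price, seasonality) with a proper regression library, but the interpretation is the same: the coefficient on each spend variable estimates its marginal impact on the outcome.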

Fixed Price - Est. Time: 3 months

Web Scraping Programmers:

·        As a Python Developer, your role is to apply your skills to fetch data from multiple online sources, cleanse it, and build APIs on top of it

·        Develop a deep understanding of our vast data sources on the web and know exactly how, when, and which data to scrape, parse, and store

·        Work closely with Database Administrators to store data in SQL and NoSQL databases

·        Develop frameworks for automating and maintaining the constant flow of data from multiple sources

·        Work independently with little supervision to research and test innovative solutions


Skills and Qualifications:

·        Strong coding experience in Python (knowledge of Java, Javascript is a plus)

·        Experience with SQL databases

·        Experience with multi-processing, multi-threading and AWS/Azure

·        Strong knowledge of scraping libraries such as Requests and BeautifulSoup in Python, Web-Harvest, and others

·        Previous experience with web crawling is a must
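
The fetch-parse-store pipeline described above can be sketched with the Python standard library alone; the HTML snippet below is a literal string so the example is self-contained, whereas a real pipeline would fetch it over HTTP (e.g. with the Requests library the posting mentions):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href attributes from <a> tags — the parse step of a scraping pipeline."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Stand-in for a fetched page; in practice this would come from requests.get(url).text
html = '<ul><li><a href="/a">A</a></li><li><a href="/b">B</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
# parser.links == ["/a", "/b"]
```

In production, dedicated frameworks (BeautifulSoup for parsing, Scrapy for crawling) replace this hand-rolled parser, and the extracted records feed the SQL/NoSQL storage and API layers the role describes.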

Experience:

  • 1 to 2 years

Job location:

  • Bangalore / Chennai / Hyderabad

Duration:

  • 3 months; may be extended

Joining:

  • Immediate

Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

·         Extensive experience with Red Hat Linux and Cloudera is mandatory.

·         Experience in installing, configuring, upgrading, and managing Hadoop environments.

·         Responsible for deployments and for monitoring capacity, performance, and troubleshooting issues.

·         Work closely with data scientists and data engineers to ensure the smooth operation of the platform.

·         End-to-end performance tuning of the clusters.

·         Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations.

·         Defines procedures for monitoring; evaluates and diagnoses system issues and establishes work plans to resolve them.

·         Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.

·         Knowledge of private and public cloud computing and virtualization platforms.

No. of resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4–5 years

Mobilization period: 1 week

Duration: 3 to 6 months