Related Work
Fixed Price - Est. Time: 1 month,

Please see the attached SOW. We are seeking talented CRO freelancers who can help us improve our conversion rates for new product sales. We'd like to try out the services on a project basis and then scale up to an ongoing assignment on a retainer basis.

Only qualified freelancers should apply.

Fixed Price - Est. Time: 12 months,

SAS Data Integration Consultant:

  • Graduate or postgraduate in Engineering, or a postgraduate in a non-engineering discipline
  • Certification in Base SAS / Advanced SAS will be an added advantage
  • SAS Base
  • SAS DI
  • Good working knowledge of Base and Advanced SAS
  • Number of positions: 3
  • Role & Responsibilities: development and testing
  • Experience: 3-6 years
  • Location: Delhi NCR
Fixed Price - Est. Time: 6 months,

Our client, a leading finance company, is looking to augment their team with candidates who have the following skills.

Job Description

  1. 3-5 years of experience with Cognos 11.x/10.x
  2. Good knowledge of SQL
  3. Knowledge of creating reports (crosstab, list) in Report Studio
  4. Hands-on experience with Cognos Framework Manager
  5. Hands-on experience with Cognos administration activities
  6. Sound knowledge of data warehousing concepts
  7. The listed budget is indicative only

Location: Delhi NCR region. (Onsite)

Contract period: Min 6 months 

We are available over chat to address any queries.

Fixed Price - Est. Time: 3 months,

Tableau Developers:

Tableau Developer Responsibilities:

  • Developing, maintaining, and managing advanced reporting, analytics, dashboards and other BI solutions.
  • Performing and documenting data analysis, data validation, and data mapping/design.
  • Reviewing and improving existing systems and collaborating with teams to integrate new systems.
  • Conducting unit tests and developing database queries to analyze the effects and troubleshoot any issues.
  • Creating tools to store data within the organization.

Tableau Developer Requirements:

  • Degree in Mathematics, Computer Science, Information Systems, or related field.
  • Relevant work experience.
  • A solid understanding of SQL, relational databases, and normalization.
  • Proficiency in use of query and reporting analysis tools.
  • Competency in Excel (macros, pivot tables, etc.)
  • Extensive experience in developing, maintaining, and managing Tableau-driven dashboards and analytics, plus working knowledge of Tableau administration/architecture.

Experience:

  • 2 to 3 years

Job location:

  • Bangalore/Chennai/Hyderabad (WFH at the moment)

Duration:

  • 3 months. May get extended.

Joining:

  • Immediate
Fixed Price - Est. Time: 12 months,

Position: Azure PySpark

Primary skill is PySpark; data warehousing and Azure experience will be an added advantage.

  • Must have low-level design and development skills; should be able to design a solution for a given use case.
  • Agile delivery: must be able to demonstrate design and code on a daily basis.
  • Must be an experienced PySpark developer with Scala coding skills.
  • Must have experience designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling.
  • Good experience with unit testing, integration testing, and UAT support.
  • Able to design and code reusable components and functions.
  • Should be able to review designs and code and provide review comments with justification.
  • Zeal to learn and adopt new tools and technologies.
  • Good to have: experience with DevOps and CI/CD.

Number of resources required: 1 to 2

Work location: Bangalore

Experience: 8-9 years

Mobilization Period in weeks: 2 weeks

Fixed Price - Est. Time: 6 months,

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

  • Extensive experience with Red Hat Linux and Cloudera is mandatory.
  • Experience installing, configuring, upgrading, and managing Hadoop environments.
  • Responsible for deployments and for monitoring capacity, monitoring performance, and troubleshooting issues.
  • Works closely with data scientists and data engineers to ensure the smooth operation of the platform.
  • End-to-end performance tuning of the clusters.
  • Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations.
  • Defines procedures for monitoring; evaluates and diagnoses system issues and establishes work plans to resolve them.
  • Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
  • Knowledge of private and public cloud computing and virtualization platforms.

Number of resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4-5 years

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months