Fixed Price - Est. Time: 3 months

Tableau Developers:

Tableau Developer Responsibilities:

  • Developing, maintaining, and managing advanced reporting, analytics, dashboards and other BI solutions.
  • Performing and documenting data analysis, data validation, and data mapping/design.
  • Reviewing and improving existing systems and collaborating with teams to integrate new systems.
  • Conducting unit tests and developing database queries to analyze the effects of changes and troubleshoot issues.
  • Creating tools to store data within the organization.

Tableau Developer Requirements:

  • Degree in Mathematics, Computer Science, Information Systems, or related field.
  • Relevant work experience.
  • A solid understanding of SQL, relational databases, and normalization.
  • Proficiency in use of query and reporting analysis tools.
  • Competency in Excel (macros, pivot tables, etc.).
  • Extensive experience in developing, maintaining, and managing Tableau-driven dashboards and analytics, plus working knowledge of Tableau administration/architecture.
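As a minimal sketch of the SQL/normalization skill listed above: two normalized tables joined into the kind of aggregate query a Tableau data source typically sits on top of. The schema and data here are hypothetical, using Python's built-in sqlite3 module.

```python
import sqlite3

# Two normalized tables (customers, orders) instead of one wide,
# denormalized table. Table and column names are made up for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        amount REAL
    );
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 1, 250.0), (2, 1, 100.0), (3, 2, 75.0);
""")

# Aggregate order totals per customer -- a report-style query that could
# feed a dashboard rather than storing precomputed totals.
rows = cur.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY total DESC
""").fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 75.0)]
conn.close()
```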

Experience:

  • 2 to 3 years

Job location:

  • Bangalore/Chennai/Hyderabad (WFH at the moment)

Duration:

  • 3 months. May get extended.

Joining:

  • Immediate
Fixed Price - Est. Time: 3 months

Data Scientist:

  • 3 to 5 years of experience in data science, building predictive models
  • Knowledge of optimization and simulation
  • Good proficiency in R and Python
  • Knowledge of Trade Promotion Optimization would be a plus
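The "building predictive models" skill above can be sketched at its simplest: an ordinary least-squares fit of a line in pure Python, with no ML libraries assumed. The data points are made up for illustration.

```python
# Closed-form ordinary least-squares fit of y = a*x + b.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Hypothetical training data, roughly following y = 2x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.1, 8.0]
a, b = fit_line(xs, ys)

def predict(x):
    return a * x + b

print(round(predict(5.0), 2))  # prints 10.0
```

Real engagements would of course use R or Python ML stacks, but the fit-then-predict shape is the same.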

Experience:

  • 3 to 5 years

Job location:

  • Bangalore/Chennai/Hyderabad (WFH at the moment)

Duration:

  • 3 months. May get extended.

Joining:

  • Immediate
Fixed Price - Est. Time: 12 months

SAS Data Integration Consultant:

  • Graduate or postgraduate in Engineering, or a postgraduate in a non-engineering discipline
  • Certification in Base SAS/Advanced SAS will be an added advantage
  • SAS Base
  • SAS DI
  • Good basic knowledge of Base and Advanced SAS
  • Number of positions: 3
  • Role & responsibilities: development and testing skills
  • Experience: 3 to 6 years
  • Location: Delhi NCR
Fixed Price - Est. Time: 12 months

SAS Data Quality Consultant:

  • The Data Quality developer will be responsible for analyzing and understanding data sources and end-user requirements using SAS DQ.
  • Must be aware of Data Quality dimensions and their implementation in SAS DQ tool.
  • Number of positions: 2
  • Role & responsibilities: development and testing skills
  • Experience: 3 to 6 years
  • Location: Delhi NCR
Fixed Price - Est. Time: 3 months

Background & Objective:

Our client is looking to create online education programs in the hospitality space and needs a comprehensive ed-tech platform. The required solution has three parts:

1. Web Development

2. Mobile App Development

3. Learning Management System (LMS) Development

Requirements:

We invite bids from prospective partners with the capability to develop a solution comprising the above three modules. Prospective bidders are required to include the following in their bids:

1. Functional Capabilities.

2. Technology Stack: We'll need details of the technology to be used for each of the three modules. For the mobile and web apps, partners may suggest their own recommendations; for the LMS, the preference is Moodle.

3. Timelines & Project Schedule

4. Support for Delivery

5. Costs: one-time and post-production support.

Please get in touch through chat should you need any more details.

Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

  • Extensive experience with RedHat Linux and Cloudera is mandatory.
  • Experience in installing, configuring, upgrading, and managing Hadoop environments.
  • Responsible for deployments, and for monitoring capacity and performance and troubleshooting issues.
  • Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
  • End-to-end performance tuning of the clusters.
  • Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations.
  • Defines procedures for monitoring; evaluates and diagnoses system issues and establishes work plans to resolve them.
  • Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
  • Knowledge of private and public cloud computing and virtualization platforms.
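The capacity-monitoring duty above can be sketched as a small check over the summary that `hdfs dfsadmin -report` prints. The report text here is an abbreviated, made-up sample; on a real cluster the admin would capture the live output (e.g. via a cron job) rather than hard-code it.

```python
import re

# Abbreviated, hypothetical excerpt of `hdfs dfsadmin -report` output.
sample_report = """\
Configured Capacity: 1000000000000 (931.32 GB)
DFS Used: 850000000000 (791.62 GB)
DFS Remaining: 150000000000 (139.70 GB)
"""

def dfs_used_fraction(report):
    """Return the fraction of configured DFS capacity currently used."""
    capacity = int(re.search(r"Configured Capacity: (\d+)", report).group(1))
    used = int(re.search(r"DFS Used: (\d+)", report).group(1))
    return used / capacity

frac = dfs_used_fraction(sample_report)
if frac > 0.80:  # alert threshold is an assumption, not a Hadoop default
    print(f"WARNING: DFS {frac:.0%} full")  # prints "WARNING: DFS 85% full"
```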

Number of resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 to 5 years

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months