Hire the ideal data scientist for your project
  • 300+ Data Scientists
  • 475+ Users
Related Work
Fixed Price - Est. Time: 1 week

Looking for a visual representation of our analytics from the past 3 years to understand customer behaviour. We will provide complete access to our analytics account. Do let me know if you need any specific details to present this report.

Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

  • Extensive experience with RedHat Linux and Cloudera is mandatory.
  • Experience in installing, configuring, upgrading, and managing Hadoop environments.
  • Responsible for deployments, and for monitoring capacity and performance and troubleshooting issues.
  • Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
  • End-to-end performance tuning of the clusters.
  • Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations.
  • Defines procedures for monitoring; evaluates and diagnoses system issues and establishes work plans to resolve them.
  • Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
  • Knowledge of private and public cloud computing and virtualization platforms.
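A small sketch of the capacity-monitoring duty above, assuming the standard `hdfs dfsadmin -report` output (its cluster-wide `DFS Used%:` line); the 80% threshold is an illustrative choice, not a Cloudera default.

```python
import re
import subprocess

def dfs_used_percent(report: str) -> float:
    """Extract the cluster-wide 'DFS Used%' figure from a dfsadmin report."""
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if match is None:
        raise ValueError("no 'DFS Used%' line found in report")
    return float(match.group(1))

def capacity_ok(threshold: float = 80.0) -> bool:
    """Run `hdfs dfsadmin -report` and return True while usage is below threshold."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    return dfs_used_percent(report) < threshold
```

In practice a check like this would be scheduled (cron, or a Cloudera Manager alert) rather than run by hand.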

No. of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12 months

Position: Azure PySpark

PySpark is the primary skill; data warehousing and Azure experience will be an add-on.

  • Must have low-level design and development skills; should be able to design a solution for given use cases.
  • Agile delivery: must be able to demonstrate design and code on a daily basis.
  • Must be an experienced developer in PySpark and Scala coding; the primary skill is PySpark.
  • Must have experience in designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling.
  • Good experience with unit testing, integration testing, and UAT support.
  • Able to design and code reusable components and functions.
  • Should be able to review designs and code and provide review comments with justification.
  • Zeal to learn and adopt new tools/technologies.
  • Good to have: experience with DevOps and CI/CD.
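The orchestration requirements above (dynamic parameter passing, audit trail, error/exception handling) can be sketched as a thin job wrapper. The job function, parameter names, and audit-record fields here are illustrative assumptions, not a prescribed design; in a real pipeline `job` would build a SparkSession and run the PySpark transformations.

```python
import logging
import time
from typing import Any, Callable, Dict

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("orchestrator")

def run_job(job: Callable[[Dict[str, Any]], Any], params: Dict[str, Any]) -> Dict[str, Any]:
    """Run one pipeline step with an audit record and error handling.

    `params` carries dynamic, per-run parameters (e.g. load date, source path)
    so the same job code serves many runs without edits.
    """
    audit = {"job": getattr(job, "__name__", "job"), "params": params,
             "start": time.time(), "status": "RUNNING"}
    try:
        audit["result"] = job(params)   # with PySpark, this triggers the Spark actions
        audit["status"] = "SUCCEEDED"
    except Exception as exc:            # fail the step but keep the audit trail intact
        audit["status"] = "FAILED"
        audit["error"] = repr(exc)
        log.error("job %s failed: %s", audit["job"], exc)
    audit["end"] = time.time()
    return audit

# Illustrative job: in practice this would read, transform, and write DataFrames.
def load_orders(params: Dict[str, Any]) -> str:
    if "load_date" not in params:
        raise ValueError("load_date is required")
    return f"loaded orders for {params['load_date']}"
```

An external scheduler (Oozie, ADF, Airflow) would call `run_job` per step and persist the returned audit record.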

No. of Resources required: 1 to 2

Work location: Bangalore

Experience: 8 yrs – 9 yrs

Mobilization Period in weeks: 2 weeks

Fixed Price - Est. Time: 1

React web app.
Fixed Price - Est. Time: 12 months

Position: Azure Snowflake

  • Demonstrated record of successfully completing multiple complex technical projects and creating high-level solution designs and architectures, including class, sequence, and deployment infrastructure diagrams.
  • Take ownership of technical solutions from design and architecture perspective for projects in presales phase as well as on-going projects.
  • Prior experience with application delivery using an Onshore/Offshore model
  • Experience with gathering end user requirements and writing technical documentation
  • Time management and multitasking skills to effectively meet deadlines under time-to-market pressure
  • Suggest innovative solutions based on new technologies and latest trends to sales team.
  • Review the architectural/technological solutions for ongoing projects and ensure the right choice of solution.
  • Work closely with the sales team and clients to understand their business, capture requirements, identify pain areas, and accordingly propose an ideal solution and win the business.

Azure: Hands-on experience in ADF (Azure Data Factory)

  • Hands-on experience in big data and the Hadoop ecosystem
  • Exposure to Azure service categories such as PaaS components and IaaS subscriptions
  • Ability to design and develop ingestion and processing frameworks for ETL applications
  • Hands-on experience in PowerShell scripting and deployment on Azure
  • Experience in performance tuning and memory configuration
  • Should be adaptable to learning and working on new technologies
  • Should have good written and spoken communication skills
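For concreteness, a minimal sketch of the ADF work above in the Data Factory pipeline JSON authoring format: one Copy activity moving data from a Blob dataset to a SQL sink. The pipeline and dataset names are placeholders, and a real pipeline would also define linked services, triggers, and parameters.

```json
{
  "name": "CopyBlobToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlob",
        "type": "Copy",
        "inputs":  [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkSqlDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```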

No. of Resources required: 1 to 2

Work location: As of now Remote (Bangalore)

Experience: 5 yrs – 6 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months 

Fixed Price - Est. Time: 12 months

AWS Developer

  • Should have strong knowledge of AWS Cloud libraries
  • Proficiency in developing, deploying, and debugging cloud-based applications using AWS Services - EC2, ECS, EKS, Lambda, DynamoDB, SQS, Cognito, CloudFormation
  • Knowledge of Java, Spring Boot, Spring Integration, and Node.js
  • Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
  • Ability to use a CI/CD pipeline to deploy applications on AWS
  • Ability to write code using AWS security best practices (e.g., not using secret and access keys in the code, instead using IAM roles)
  • Hands-on experience working in an agile/iterative development environment
  • Develop and deploy AWS cloud-based solutions, services, and interfaces
  • Participate in all phases of software engineering, including requirements, design, coding, and testing.
  • Design and implement product features in collaboration with product managers and stakeholders.
  • Design reusable components, frameworks, and libraries or microservices.
  • Participate in an Agile/Scrum methodology to deliver high-quality software releases every 2 weeks
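As a concrete instance of the Lambda and security bullets above, a minimal Python handler sketch; the event shape and function name are illustrative assumptions. Note that no access keys appear in the code: deployed on AWS, any SDK client would pick up temporary credentials from the function's IAM execution role.

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler: return a greeting for the caller.

    No secret or access keys are embedded here; on AWS, credentials come
    from the function's IAM execution role, per security best practice.
    """
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Locally this is just a function call; on AWS the same code would sit behind API Gateway or an SQS trigger.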

No. of Resources required: 5 to 6

Work location: Remote (Bangalore/Mumbai)

Qualification: BTech

Experience: 4 yrs – 8 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months