Related Work
Fixed Price - Est. Time: 12 months

Position: Azure Snowflake

  • Demonstrated ability to deliver multiple complex technical projects and to create the high-level design and architecture of a solution, including class, sequence, and deployment infrastructure diagrams.
  • Take ownership of technical solutions, from a design and architecture perspective, for projects in the presales phase as well as ongoing projects.
  • Prior experience with application delivery using an onshore/offshore model.
  • Experience gathering end-user requirements and writing technical documentation.
  • Time-management and multitasking skills to meet deadlines under time-to-market pressure.
  • Suggest innovative solutions to the sales team based on new technologies and the latest trends.
  • Review the architectural and technological solutions for ongoing projects and ensure the right solution is chosen.
  • Work closely with the sales team and clients to understand their business, capture requirements, identify pain points, and propose an ideal solution to win the business.

Azure:

  • Hands-on experience in ADF (Azure Data Factory)

  • Hands-on experience in Big Data and the Hadoop ecosystem
  • Exposure to Azure service categories such as PaaS components and IaaS subscriptions
  • Ability to design and develop ingestion and processing frameworks for ETL applications
  • Hands-on experience in PowerShell scripting and deployment on Azure
  • Experience in performance tuning and memory configuration
  • Adaptable to learning and working with new technologies
  • Good written and spoken communication skills

No. of resources required: 1 to 2

Work location: Remote for now (Bangalore)

Experience: 5 yrs – 6 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months 

Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

  • Extensive experience with Red Hat Linux and Cloudera is mandatory.
  • Experience installing, configuring, upgrading, and managing Hadoop environments.
  • Responsible for deployments, and for monitoring capacity and performance and troubleshooting issues.
  • Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
  • End-to-end performance tuning of the clusters.
  • Maintain and administer computing environments, including computer hardware, systems software, applications software, and all configurations.
  • Define procedures for monitoring; evaluate and diagnose system issues and establish work plans to resolve them.
  • Working knowledge of the entire Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
  • Knowledge of private and public cloud computing and virtualization platforms.

No. of resources required: 2 to 3

Work location: Remote

Qualification: B.Tech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12

We need a proficient PySpark resource with good experience.

Fixed Price - Est. Time: 1 month

We need help with our inventory planning. More details will be shared. Interested parties, please contact us.

Fixed Price - Est. Time: 1 month

Please see the attached SOW. We are seeking talented CRO freelancers who can help us improve our conversion rates for new-product sales. We'd like to try out the services on a project basis and then scale up to an ongoing assignment on a retainer basis.

Only qualified freelancers should apply.