Related Work
Fixed Price - Est. Time: 12 months

Position: Azure Snowflake

  • Demonstrated ability to successfully deliver multiple complex technical projects and to create the high-level design and architecture of the solution, including class, sequence, and deployment/infrastructure diagrams.
  • Take ownership of technical solutions from a design and architecture perspective, both for projects in the presales phase and for ongoing projects.
  • Prior experience with application delivery using an onshore/offshore model.
  • Experience with gathering end-user requirements and writing technical documentation.
  • Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.
  • Suggest innovative solutions to the sales team based on new technologies and the latest trends.
  • Review the architectural/technological solutions for ongoing projects and ensure the right choice of solution.
  • Work closely with the sales team and clients to understand their business, capture requirements, identify pain points, propose an ideal solution accordingly, and win the business.

Azure:

  • Hands-on experience in ADF (Azure Data Factory)
  • Hands-on experience with Big Data and the Hadoop ecosystem
  • Exposure to Azure service categories such as PaaS components and IaaS subscriptions
  • Ability to design and develop ingestion and processing frameworks for ETL applications (see the sketch after this list)
  • Hands-on experience in PowerShell scripting and deployment on Azure
  • Experience in performance tuning and memory configuration
  • Adaptable to learning and working with new technologies
  • Good written and spoken communication skills
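
As a rough illustration of the ingestion and processing framework mentioned above, here is a minimal PySpark sketch; the storage account name (examplestorage), container names, and column handling are hypothetical placeholders, and in an ADF-driven setup a job like this would typically be orchestrated as a Databricks or Synapse Spark activity.

```python
# Minimal ingestion/processing sketch for an Azure ETL pipeline (illustrative only).
# The storage account, container names, and dataset layout are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-ingestion-sketch").getOrCreate()

# Read raw CSV files landed in a (hypothetical) ADLS Gen2 "raw" container.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/sales/")
)

# Light standardisation: normalise column names and stamp the load date.
curated = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .withColumn("load_date", F.current_date())
)

# Persist curated data as partitioned Parquet for downstream consumers.
(
    curated.write
    .mode("overwrite")
    .partitionBy("load_date")
    .parquet("abfss://curated@examplestorage.dfs.core.windows.net/sales/")
)
```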

No. of Resources required: 1 to 2

Work location: Remote as of now (Bangalore)

Experience: 5 yrs – 6 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months 

Fixed Price - Est. Time: 6 months

Position: PySpark Developer

Must Have Technical Skills: PySpark

Good to have Technical Skills: Spark, DWH

·         Hands-on experience in PySpark, NumPy, lists, Pandas, and graph plotting.

·         Excellent knowledge of Python frameworks and of Airflow/Luigi/Dagster/Apache Beam.

·         Strong Python skills across data and visualization libraries.

·         Good SQL knowledge.

·         Ability to perform multiple tasks in a continually changing environment.

·         Exposure to Hive/HBase/Redis.

·         Good knowledge of data warehousing.

·         Experience reading from Parquet and other formats (see the sketch after this list).

·         Hands-on work with Hive and HBase.
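
As a rough illustration of the Parquet and Pandas expectations above, the following is a minimal sketch; the dataset path and the order_date/amount columns are hypothetical, and plotting assumes matplotlib is installed.

```python
# Minimal sketch of reading Parquet with PySpark and handing a small,
# aggregated result to Pandas for plotting. The dataset path and the
# order_date/amount columns are hypothetical; plotting needs matplotlib.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("parquet-read-sketch").getOrCreate()

# Read a (hypothetical) partitioned Parquet dataset.
orders = spark.read.parquet("/data/warehouse/orders/")

# Aggregate in Spark so only a small result reaches the driver.
daily = (
    orders.groupBy("order_date")
          .agg(F.sum("amount").alias("total_amount"))
          .orderBy("order_date")
)

# Convert the aggregated result to Pandas for inspection and plotting.
daily_pd = daily.toPandas()
daily_pd.plot(x="order_date", y="total_amount")
```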

 

No. of Resources required: 3 to 4

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 6 months

Position: NiFi/Big Data Developer

Must Have Technical Skills: NiFi

Good to have Technical Skills: Python, ETL

Preferred Industry Experience: Telecom

·         Extensive experience with NiFi for setting up data pipelines.

·         Hands-on experience using controller services and processors to set up an ETL framework in Apache NiFi.

·         Extensive experience in Python.

·         Good understanding of Spark, Spark Streaming & PySpark (see the sketch after this list).

·         Good understanding of Big Data components.
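
To illustrate the Spark Streaming point above, here is a minimal PySpark Structured Streaming sketch that consumes events a NiFi flow might publish to Kafka; the broker address, topic name, and output paths are hypothetical, and the spark-sql-kafka connector package must be available on the Spark classpath.

```python
# Minimal PySpark Structured Streaming sketch (illustrative only). The Kafka
# broker, topic, and output paths are hypothetical, and the job must be
# launched with the spark-sql-kafka connector package on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("nifi-downstream-stream-sketch").getOrCreate()

# Consume events that an upstream NiFi flow publishes to Kafka.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "cdr-events")
    .load()
)

# Kafka values arrive as bytes; cast to string for downstream parsing.
parsed = events.select(F.col("value").cast("string").alias("raw_event"))

# Append the events to Parquet with a checkpoint for fault tolerance.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/landing/cdr_events/")
    .option("checkpointLocation", "/data/checkpoints/cdr_events/")
    .start()
)
query.awaitTermination()
```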

No. of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 3 months

Power BI and Reporting:

1) Proficient in building Power BI dashboards, SSAS cubes, and data warehouse solutions using Microsoft BI tools and industry best practices.

2) Understand and deliver business scenarios for improving the BI platform from the client's perspective.

3) Manage client interaction and expectations, considering all aspects of building a long-term, sustainable solution.

4) Create effective dashboard designs, SSAS models, and database data models; suggest ETL data flows (see the sketch after this list); and support the team on development work.

5) Manage onsite and offshore assignments.

6) Good knowledge of data warehouse modeling.

7) Report to and support management on day-to-day activities and deliveries.

8) Follow a maker-checker approach during development.
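
As one concrete example of tying the ETL data flow to the Power BI layer, the sketch below triggers a dataset refresh through the Power BI REST API once a load completes; the workspace and dataset IDs are placeholders, and acquiring the Azure AD access token (e.g. via MSAL) is omitted.

```python
# Minimal sketch: trigger a Power BI dataset refresh once an ETL load
# finishes, via the Power BI REST API. The workspace/dataset IDs and the
# access token are placeholders; in practice the token would be obtained
# from Azure AD (e.g. via MSAL) with the appropriate Power BI scopes.
import requests

WORKSPACE_ID = "<workspace-guid>"    # placeholder
DATASET_ID = "<dataset-guid>"        # placeholder
ACCESS_TOKEN = "<aad-access-token>"  # placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes"
)

# A 202 Accepted response means the refresh has been queued.
response = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
response.raise_for_status()
print("Refresh queued:", response.status_code)
```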

Experience:

  • 2 to 3 years

Job location:

  • Bangalore/Chennai/Hyderabad (WFH at the moment)

Duration:

  • 3 months. May get extended.

Joining:

  • immediate
Fixed Price - Est. Time: 6 months

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

·         Extensive experience with Red Hat Linux and Cloudera is mandatory.

·         Experience in installing, configuring, upgrading, and managing Hadoop environments.

·         Responsible for deployments, and for monitoring capacity, performance, and troubleshooting issues (see the sketch after this list).

·         Work closely with data scientists and data engineers to ensure the smooth operation of the platform.

·         End-to-end performance tuning of the clusters.

·         Maintain and administer computing environments, including computer hardware, systems software, applications software, and all configurations.

·         Define procedures for monitoring; evaluate and diagnose system issues and establish work plans to resolve them.

·         Working knowledge of the entire Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.

·         Knowledge of private and public cloud computing and virtualization platforms.
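
As a small illustration of the capacity and health monitoring mentioned above, the sketch below polls the NameNode JMX endpoint; the hostname is a placeholder, port 9870 assumes a Hadoop 3.x NameNode (older releases expose the web UI on 50070), and the 80% threshold is an arbitrary example.

```python
# Minimal capacity/health check against the NameNode JMX endpoint.
# The hostname is a placeholder; port 9870 assumes a Hadoop 3.x NameNode
# (older releases expose the web UI on 50070). The 80% threshold is an
# arbitrary example value.
import requests

JMX_URL = (
    "http://namenode.example.com:9870/jmx"
    "?qry=Hadoop:service=NameNode,name=FSNamesystemState"
)

bean = requests.get(JMX_URL, timeout=10).json()["beans"][0]

capacity_total = bean["CapacityTotal"]
capacity_used = bean["CapacityUsed"]
live_nodes = bean["NumLiveDataNodes"]
dead_nodes = bean["NumDeadDataNodes"]

used_pct = 100.0 * capacity_used / capacity_total
print(f"HDFS used: {used_pct:.1f}% of {capacity_total / 1e12:.1f} TB")
print(f"DataNodes: {live_nodes} live, {dead_nodes} dead")

if used_pct > 80 or dead_nodes > 0:
    print("WARNING: cluster needs attention")
```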

No. of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months