Position: Support Engineer
Must Have Technical Skills: Azure Data Factory
Good to have Technical Skills: Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS
Preferred Industry Experience: Manufacturing
Role:
Monitor batch pipelines in the Data Analytical Platform and provide workarounds for problems.
Troubleshoot issues upon failure, provide workarounds, and determine root cause (see the monitoring sketch below).
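For illustration only, a minimal sketch of the kind of failed-run monitoring this role covers, using the azure-mgmt-datafactory Python SDK; the subscription, resource group, and factory names are placeholder assumptions, not values from this engagement.

# Minimal sketch: list ADF pipeline runs that failed in the last 24 hours.
# Subscription, resource group, and factory names below are placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
    # Restrict the query to failed runs; "Status"/"Equals" are the documented
    # extensible-enum string values for this filter.
    filters=[RunQueryFilter(operand="Status", operator="Equals", values=["Failed"])],
)

runs = client.pipeline_runs.query_by_factory("<resource-group>", "<factory-name>", filters)
for run in runs.value:
    # Pipeline name, run id, and error message are the starting point for triage.
    print(run.pipeline_name, run.run_id, run.message)

In practice a support engineer would feed output like this into alerting or a ticketing workflow rather than printing it.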
Desired Qualification:
· 0-3 years of data engineering (DE) or batch support experience.
· Must be willing to work entirely on the night shift.
· Ability to work on a task independently or with minimal supervision.
· Knowledge of Data Orchestration Tool, preferably ADF.
· Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS.
· Work location: Remote.
Number of resources required: 1 to 2
Work location: Remote
Qualification: BE, BTech, MBA, or MCA; Computer Science background preferred
Experience: 0-3 Years
Mobilization Period in weeks: 1 week
Duration: 6 to 12 months
Machine Learning Programmers:
Responsibilities
Requirements
Experience:
Job location:
Duration:
Joining:
Visualization Expert:
· Resources with 3-5 years of experience creating user interfaces using Power BI and R Shiny.
· Familiarity with basic analytics concepts such as predictive modelling, optimization, and simulation.
· Knowledge of Python would be desirable
· Knowledge of Trade Promotion Optimization would be a plus.
· Excellent written and verbal communication.
· Good organizational skills.
· Ability to work as part of a team.
Experience:
Job location:
Duration:
Joining:
Position: Hadoop Admin
Must Have Technical Skills: Hadoop Admin
Good to have Technical Skills: Linux Admin, ETL
· Extensive experience with RedHat Linux and Cloudera is mandatory.
· Experience in installing, configuring, upgrading, and managing Hadoop environments.
· Responsible for deployments and for monitoring capacity, performance, and troubleshooting issues (a basic health-check sketch follows after this list).
· Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
· End-to-end performance tuning of the clusters.
· Maintains and administers computing environments including computer hardware, systems software, applications software, and all configurations.
· Defines monitoring procedures; evaluates and diagnoses system issues and establishes work plans to resolve them.
· Working knowledge of the entire Hadoop ecosystem, including HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
· Knowledge of private and public cloud computing and virtualization platforms.
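A minimal sketch, assuming the hdfs and yarn command-line clients are available on a cluster gateway node, of the routine health check implied by the monitoring duties above; the commands and their use here are illustrative, not a prescribed runbook.

# Routine health-check sketch: surface HDFS DataNode and YARN NodeManager status.
import subprocess

def run(cmd):
    # Run a command and return its stdout, raising if it exits non-zero.
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

# HDFS capacity and DataNode report (live/dead nodes, remaining space).
print(run(["hdfs", "dfsadmin", "-report"]))

# YARN NodeManager list across all states (RUNNING, UNHEALTHY, LOST, ...).
print(run(["yarn", "node", "-list", "-all"]))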
Number of resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4-5 Years
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
SAS Data Integration Consultant:
Web Scraping Programmers:
· As a Python developer, your role is to fetch data from multiple online sources, cleanse it, and build APIs on top of it
· Develop a deep understanding of our vast web data sources and know exactly how, when, and which data to scrape, parse, and store
· Work closely with Database Administrators to store data in SQL and NoSQL databases
· Develop frameworks for automating and maintaining the constant flow of data from multiple sources
· Work independently with little supervision to research and test innovative solutions
Skills and Qualifications:
· Strong coding experience in Python (knowledge of Java, Javascript is a plus)
· Experience with SQL databases
· Experience with multi-processing, multi-threading and AWS/Azure
· Strong knowledge of scraping tools such as Python (Requests, BeautifulSoup), Web-Harvest, and others (see the sketch after this list)
· Previous experience with web crawling is a must
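A minimal sketch of the fetch-and-parse work described above, using Requests and BeautifulSoup; the URL and the CSS selector are placeholder assumptions, not a real data source for this role.

# Fetch a page, parse it, and lightly cleanse the extracted text before storage.
# The URL and the "h2.title" selector are placeholders.
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com/listings", timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")
titles = [tag.get_text(strip=True) for tag in soup.select("h2.title")]
print(titles)

A production crawler would add retries, rate limiting, and persistence to the SQL/NoSQL stores mentioned above.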
Experience:
Job location:
Duration:
Joining: