Web Scraping Programmers:
· As a Python Developer, your role is to apply your skill set to fetch data from multiple online sources, cleanse it, and build APIs on top of it
· Develop a deep understanding of our vast data sources on the web and know exactly how, when, and which data to scrape, parse, and store
· Work closely with Database Administrators to store data in SQL and NoSQL databases
· Develop frameworks for automating and maintaining the constant flow of data from multiple sources
· Work independently with little supervision to research and test innovative solutions
Skills and Qualifications:
· Strong coding experience in Python (knowledge of Java and JavaScript is a plus)
· Experience with SQL databases
· Experience with multi-processing, multi-threading and AWS/Azure
· Strong knowledge of scraping libraries and frameworks such as Python's Requests and BeautifulSoup, Web-Harvest, and others (a minimal example follows this list)
· Previous experience with web crawling is a must
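A minimal sketch of the kind of fetch-and-parse work described above, assuming a hypothetical target page and CSS selector; the real sources, parsing rules, and storage layer would be defined per project:

import requests
from bs4 import BeautifulSoup

# Hypothetical target URL and selector -- illustration only.
URL = "https://example.com/listings"

def fetch_titles(url: str) -> list[str]:
    """Fetch a page and extract item titles with Requests + BeautifulSoup."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Assume each item title sits in an <h2 class="title"> element.
    return [h2.get_text(strip=True) for h2 in soup.select("h2.title")]

if __name__ == "__main__":
    for title in fetch_titles(URL):
        print(title)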
Experience:
Job location:
Duration:
Joining:
Please see the attached SOW. We are seeking talented CRO freelancers who can help us improve conversion rates for new product sales. We'd like to try out the services on a project basis and then scale up to an ongoing assignment on a retainer basis.
Only qualified freelancers should apply.
Data Scientist:
• 3-5 years of experience in Data Science building predictive models (see the sketch after this list)
• Knowledge of Optimization and Simulation
• Good proficiency in R and Python
• Knowledge of Trade Promotion Optimization would be a plus
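A minimal sketch of the kind of predictive-model build implied above, in Python with scikit-learn on synthetic data; the actual features, target, and algorithm would come from the engagement:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in data: 500 samples, 4 numeric features, continuous target.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))
y = X @ np.array([1.5, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))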
Experience:
Job location:
Duration:
Joining:
Position: Hadoop Admin
Must Have Technical Skills: Hadoop Admin
Good to have Technical Skills: Linux Admin, ETL
· Extensive experience with RedHat Linux and Cloudera is mandatory.
· Experience in installing, configuring, upgrading and managing Hadoop environment.
· Responsible for deployments, and for monitoring capacity, performance, and troubleshooting issues (see the monitoring sketch after this list).
· Work closely with data scientists and data engineers to ensure the smooth operation of the platform.
· End-to-end performance tuning of the clusters.
· Maintains and administers computing environments including computer hardware, systems software, applications software, and all configurations.
· Defines procedures for monitoring; evaluates, diagnoses, and establishes work plans to resolve system issues.
· Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.
· Knowledge of private and public cloud computing and virtualization platforms.
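A minimal capacity-monitoring sketch in Python, assuming a Hadoop 3.x NameNode reachable at the hypothetical host namenode.example.com on the default web port 9870; it reads the standard FSNamesystem JMX metrics:

import json
import urllib.request

# Hypothetical NameNode host; Hadoop 3.x serves JMX metrics on the NameNode web port (9870 by default).
JMX_URL = "http://namenode.example.com:9870/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem"

def hdfs_capacity() -> dict:
    """Read cluster capacity figures from the NameNode JMX endpoint."""
    with urllib.request.urlopen(JMX_URL, timeout=10) as resp:
        beans = json.load(resp)["beans"]
    fs = beans[0]  # the FSNamesystem bean matched by the query
    return {
        "total_bytes": fs["CapacityTotal"],
        "used_bytes": fs["CapacityUsed"],
        "remaining_bytes": fs["CapacityRemaining"],
    }

if __name__ == "__main__":
    print(hdfs_capacity())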
No. of resources required: 2 to 3
Work location: Remote
Qualification: BTech
Experience: 4 yrs – 5 yrs
Mobilization Period in weeks: 1 week
Duration: 3 to 6 months
Tableau Developers:
Tableau Developer Responsibilities:
Tableau Developer Requirements:
Experience:
Job location:
Duration:
Joining:
SAS Data Integration Consultant: