Remote Data Mining And Management Job In Data Science And Analytics

Scraping is cool — Shakespeare


Building an in-house SG Dev team to solidify our ongoing project on the biggest, baddest, most gangsta scraper ever.


Not for the faint of heart.
About the recruiter
Member since Mar 14, 2020
Erik Nerum
from Central, Uganda

Skills & Expertise Required

Appium, Web Crawling, Machine Learning, Scrapy, Selenium

Candidate shortlisted and hired. Hiring open till Apr 28, 2024

Work from Anywhere

40 hrs / week

Hourly Type

Remote Job

Cost: $26.83

Similar Projects

Collect data about Zoom webinars in certain categories

I'm looking for a company or an individual that will collect all the Zoom webinars posted on Eventbrite, Meetup, Facebook Events, and others into one list, including a link, the emails of organizers, the emails of participants, and the subject.

I need a li…
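A minimal sketch of the kind of collector this project describes, assuming a hypothetical listing URL and CSS selectors; real Eventbrite/Meetup/Facebook pages differ and often require their official APIs or a browser-driven stack such as Selenium for dynamic content:

```python
import csv

import requests
from bs4 import BeautifulSoup

# Hypothetical search URL -- stand-in for the real event listing pages.
LISTING_URL = "https://www.example.com/events?q=zoom+webinar"

def collect_webinars(url):
    """Scrape one listing page into rows of (subject, link, organizer)."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for card in soup.select(".event-card"):             # hypothetical selector
        title = card.select_one(".event-title")         # hypothetical selector
        link = card.select_one("a")
        organizer = card.select_one(".organizer-name")  # hypothetical selector
        rows.append([
            title.get_text(strip=True) if title else "",
            link["href"] if link else "",
            organizer.get_text(strip=True) if organizer else "",
        ])
    return rows

with open("webinars.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["subject", "link", "organizer"])
    writer.writerows(collect_webinars(LISTING_URL))
```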

Fuzzy matching exercise - I need to fuzzy match names from a land records database to an existing table I have created.

I am an academic working on a research project that uses data from a land records database. I need to standardize the names in the database, and hence I need to fuzzy match names to one another (e.g. "bnk of America" is the same as "Bank of America").
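A minimal sketch of this kind of name standardization, using Python's standard-library difflib; the reference table and threshold are illustrative assumptions, and a dedicated library such as rapidfuzz would scale better over a full land-records database:

```python
from difflib import SequenceMatcher

# Hypothetical standardized reference table (illustrative only).
standard_names = ["Bank of America", "Wells Fargo", "JPMorgan Chase"]

def best_match(raw_name, candidates, threshold=0.8):
    """Return the candidate most similar to raw_name, or None if every score is below threshold."""
    scored = [(SequenceMatcher(None, raw_name.lower(), c.lower()).ratio(), c)
              for c in candidates]
    score, match = max(scored)
    return match if score >= threshold else None

print(best_match("bnk of America", standard_names))  # -> Bank of America
```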

Develop NLP SaaS to Analyze Insights from Online Data

Seeking to develop a cloud-based financial software program that utilizes natural language processing (NLP) techniques to extract key information and insights from data obtained via web scraping online sources, including news articles, documents, vid…
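A minimal sketch of the NLP extraction step only (the cloud/SaaS layer is out of scope here), assuming spaCy and its small English model are installed; the sample sentence is made up for illustration:

```python
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_insights(article_text):
    """Pull financially relevant named entities out of scraped article text."""
    doc = nlp(article_text)
    return [(ent.text, ent.label_) for ent in doc.ents
            if ent.label_ in {"ORG", "MONEY", "DATE", "PERCENT"}]

# Made-up sample sentence for illustration.
sample = "Acme Corp reported revenue of $1.2 billion on March 3, up 8% from last year."
print(extract_insights(sample))
```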

Extract Content from PDF

I want someone to create a script/application that is able to extract content from a PDF. The PDF will contain content/customer data such as name, address, etc. We should be able to extract the content and put it into CSV or JSON or whatever. I will prefer…
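A minimal sketch of one way such a script could look, assuming the pypdf library and a "Name:"/"Address:" label layout inside the PDF; the regexes would have to be adapted to the real documents:

```python
import csv
import re

from pypdf import PdfReader  # assumes: pip install pypdf

def extract_customers(pdf_path, csv_path):
    """Pull 'Name:' / 'Address:' lines out of a PDF and write them to a CSV file."""
    reader = PdfReader(pdf_path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)
    # Hypothetical field labels -- adapt the regexes to the real document layout.
    names = re.findall(r"Name:\s*(.+)", text)
    addresses = re.findall(r"Address:\s*(.+)", text)
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "address"])
        writer.writerows(zip(names, addresses))

extract_customers("customers.pdf", "customers.csv")
```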

Web Crawling

We need a web crawler to index websites so that we can use a single search page and find results from all the pages we have crawled.
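A minimal sketch of a crawler plus inverted index of the sort this project asks for, assuming requests and BeautifulSoup; a production version would respect robots.txt, rate-limit requests, and persist the index in a search backend rather than in memory:

```python
from collections import defaultdict, deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl_and_index(seed_url, max_pages=50):
    """Breadth-first crawl of a single site, building a word -> set(urls) inverted index."""
    index = defaultdict(set)
    domain = urlparse(seed_url).netloc
    seen, queue = {seed_url}, deque([seed_url])
    crawled = 0
    while queue and crawled < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        crawled += 1
        soup = BeautifulSoup(html, "html.parser")
        # Index every word on the page against its URL.
        for word in soup.get_text(" ").lower().split():
            index[word].add(url)
        # Follow same-domain links we have not seen yet.
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return index

# The single search page then becomes a lookup into the index:
index = crawl_and_index("https://example.com")
print(sorted(index.get("example", set())))  # pages containing the word "example"
```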