Remote Data Mining And Management Job In Data Science And Analytics

ETL Expert with experience in Google and Social APIs + Knime


We are looking for an ETL Expert who can help us automate data acquisition. We have the plans, connections, and principles in place; we need someone who can replicate and implement them. The tools you HAVE to be comfortable with are:
- MySQL
- Google APIs
- Social Media APIs
- Knime (a node-based ETL tool that any ETL person can learn in a few hours)
- Python for scripting and scraping (unless you can build it in Knime)
- Google Cloud APIs
- JDBC basics
- OAuth
- Spotfire as a bonus

Most people will know all of the above except Knime. Knime is simply how we manage and update our workflows. If you cannot use Knime, or are not willing to learn it, we will not be working with you. We are happy to pay for a few hours on the project to cover the learning curve, but it is a MUST!
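To give a sense of the kind of pipeline step involved, here is a minimal Python sketch of the extract-and-load pattern the stack above implies: flatten an API payload into rows and build a parameterized MySQL insert. The payload shape, field names, and table/column names are illustrative assumptions, not any real Google or social API schema.

```python
import json

# Hypothetical sample payload, standing in for an API response.
# The field names here are assumptions, not a real API schema.
SAMPLE_RESPONSE = json.dumps({
    "items": [
        {"id": "a1", "metrics": {"views": 120, "likes": 9}},
        {"id": "b2", "metrics": {"views": 45, "likes": 3}},
    ]
})

def extract_rows(payload: str) -> list:
    """Flatten the nested API payload into tuples ready for a MySQL insert."""
    data = json.loads(payload)
    return [
        (item["id"], item["metrics"]["views"], item["metrics"]["likes"])
        for item in data["items"]
    ]

def build_insert(table: str, columns: list) -> str:
    """Build a parameterized INSERT statement (placeholders, never string
    interpolation, so the driver handles escaping)."""
    placeholders = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"

rows = extract_rows(SAMPLE_RESPONSE)
sql = build_insert("social_metrics", ["post_id", "views", "likes"])
# With a live connection: cursor.executemany(sql, rows)
```

The same flatten-then-load step could equally be built as a JSON Path + DB Writer node chain in Knime; the sketch only illustrates the shape of the work.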


About the recruiter
Member since Aug 28, 2017
Lucas
from Madrid, Spain

Candidate shortlisted and hired
Hiring open until Oct 19, 2018

Work from Anywhere

40 hrs / week

Hourly Type

Remote Job

Cost: $12.51


Similar Projects

Looking for Python programmer to write downloader for Google Open Images based on image type

Hello,

We are looking to hire a Python programmer for a small project-based commission: writing a downloader for Google Open Images.

Details are available in private, thank you.

European Fake Licence Plates Generator in Python

Hey,

I need a generator to create a dataset for European Licence Plate.

If you are not from Europe, please check the link below for examples of European licence plates:

https://en.wikipedia.org/wiki/Vehicle_registration_plates_of_Europe ...read more

Web Scraping/Data Extraction Needed - Fourth World

We are looking for a web scraper who can gather data for our front-end designer of our website. This will be the first iteration of the website, and we need an individual to get us out to market as we develop out the rest of the website. We have the...read more

scraper part 2

1. Take the CSV called count_fast (the scraper's output file) and use it as the ingest file.
2. Replace the column named videocount with the timecode at which the scraper is launched (so on each launch the header is the new date).
3. Also ...read more
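Steps 1-2 above can be sketched in a few lines of standard-library Python: read the count_fast CSV and rewrite its header so the videocount column carries the launch timecode. The file and column names come from the post; the timestamp format and in-memory usage example are assumptions.

```python
import csv
import io
from datetime import datetime

def relabel_header(src, dst, launch_time):
    """Copy the count_fast CSV from src to dst, renaming the 'videocount'
    header to the launch timecode (steps 1-2 of the post)."""
    reader = csv.reader(src)
    writer = csv.writer(dst)
    header = next(reader)
    stamp = launch_time.strftime("%Y-%m-%d %H:%M")  # assumed timecode format
    writer.writerow([stamp if col == "videocount" else col for col in header])
    writer.writerows(reader)  # data rows pass through unchanged

# Usage with an in-memory stand-in for count_fast.csv:
src = io.StringIO("url,videocount\nhttp://example.com,42\n")
dst = io.StringIO()
relabel_header(src, dst, datetime(2018, 10, 1, 9, 30))
```

On each launch the scraper would call this with `datetime.now()`, so the header always reflects the run date, as step 2 requires.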