Remote Data Mining And Management Job In Data Science And Analytics

Build Out an Advanced Analytics and BI Platform


I will share my use case and am looking to build out an Advanced Analytics and BI platform that is cost-effective yet viable and scalable, working with the shortlisted candidates.

The current use case is to collect data from CRM service providers such as Salesforce and Microsoft CRM 365, transform it into meaningful data by logically joining the different entities received, and then persist it into a Data Warehouse. Business Intelligence dashboards will be developed and integrated with the Data Warehouse, performing analysis through analytical queries (joining, grouping, sorting, etc.).

Along with this, an Advanced Analytics Platform will be developed, on which the Data Science team will first perform basic analysis and then build and train their Machine Learning models for Predictive Analytics and Recommendation Engines.
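
As an illustration of that last step, below is a minimal sketch of training a recommendation model with Spark MLlib's ALS on top of warehouse data. The table and column names are assumptions, not part of the spec, and the account/product ids are assumed to be integers as ALS requires.

from pyspark.sql import SparkSession
from pyspark.ml.recommendation import ALS

spark = (SparkSession.builder
         .appName("recommender")
         .enableHiveSupport()
         .getOrCreate())

# Hypothetical warehouse table: one row per (account, product) pair
# with an engagement count used as implicit feedback.
interactions = spark.table("dw.account_product_interactions")

als = ALS(userCol="account_id", itemCol="product_id",
          ratingCol="interaction_count",
          implicitPrefs=True, coldStartStrategy="drop")
model = als.fit(interactions)

# Top-5 product recommendations per account, feeding the Recommendation Engine.
model.recommendForAllUsers(5).show(truncate=False)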

Create the following modules:

1. Data Sources Management
Using this module, the user will be able to configure the data sources from which data needs to be collected. Once the user configures a data source, a data ingestion job will be submitted to Apache Gobblin, and Gobblin will start collecting data from that source.
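
For reference, a Gobblin ingestion job of this kind is described by a job configuration file. Below is a minimal sketch assuming Gobblin's Salesforce connector; the connector class, entity, credentials, and output format are assumed values to be adapted per data source.

# Hypothetical Gobblin job file, e.g. salesforce_accounts.pull
# (connector class, entity, and credentials are assumed values)
job.name=SalesforceAccountsIngest
job.group=CRMIngestion
source.class=org.apache.gobblin.salesforce.SalesforceSource
source.entity=Account
source.conn.username=integration-user@example.com
source.conn.password=${SFDC_PASSWORD}
extract.table.type=SNAPSHOT_ONLY
writer.output.format=PARQUET
data.publisher.type=org.apache.gobblin.publisher.BaseDataPublisher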

2. Real-Time Analytics
Once ingested data is available on Kafka, Spark Structured Streaming will be used to process and transform it in a distributed way and write the results to MariaDB ColumnStore.
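
A minimal sketch of that streaming path, assuming a Kafka topic named crm-events and a target table crm_events (topic name, event schema, and connection details are placeholders; the MariaDB JDBC driver jar is assumed to be on the Spark classpath):

from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("crm-realtime").getOrCreate()

# Assumed shape of the CRM events arriving on Kafka.
schema = (StructType()
          .add("account_id", StringType())
          .add("event_type", StringType())
          .add("event_time", TimestampType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "kafka:9092")
          .option("subscribe", "crm-events")
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

def write_to_columnstore(batch_df, batch_id):
    # MariaDB ColumnStore is reached over plain JDBC from Spark.
    (batch_df.write
     .format("jdbc")
     .option("url", "jdbc:mariadb://mariadb:3306/analytics")
     .option("dbtable", "crm_events")
     .option("user", "spark")
     .option("password", "secret")
     .mode("append")
     .save())

query = events.writeStream.foreachBatch(write_to_columnstore).start()
query.awaitTermination()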

3. Data Lake
A Data Lake will be required because a lot of data will be ingested, while the Data Warehouse will hold only the cleaned and transformed version of the data. All data collected from the CRM data sources will arrive in JSON format, so Apache Gobblin will convert the JSON data into Parquet format before loading it into the Data Lake, for better I/O and low-latency reads.
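
In the pipeline this conversion happens inside Gobblin, but its effect is equivalent to the following PySpark sketch (the Minio endpoint, credentials, and bucket layout are assumptions; hadoop-aws must be on the classpath for s3a access):

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("json-to-parquet")
         # s3a settings pointing Spark at the Minio endpoint (assumed values)
         .config("spark.hadoop.fs.s3a.endpoint", "http://minio:9000")
         .config("spark.hadoop.fs.s3a.access.key", "minio")
         .config("spark.hadoop.fs.s3a.secret.key", "secret")
         .config("spark.hadoop.fs.s3a.path.style.access", "true")
         .getOrCreate())

# Read the raw JSON as ingested, rewrite it as Parquet for low-latency reads.
raw = spark.read.json("s3a://datalake/raw/salesforce/accounts/")
raw.write.mode("overwrite").parquet("s3a://datalake/parquet/salesforce/accounts/")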

4. Data Processing
For Data Processing, Apache Spark will be used which is distributed data processing
engine and Data Processing Jobs will be scheduled using Apache Airflow and it will
read latest data from Data Lake and apply required transformations and then persist the
data to Data Warehouse.
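
A minimal sketch of such a scheduled job, assuming the apache-airflow-providers-apache-spark package is installed; the DAG id, schedule, and application path are placeholders:

from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="lake_to_warehouse",
    start_date=datetime(2019, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    # Submits the (hypothetical) PySpark job that reads the latest data
    # from the Data Lake, transforms it, and persists it to the warehouse.
    transform = SparkSubmitOperator(
        task_id="transform_crm_entities",
        application="/jobs/transform_crm_entities.py",
        conn_id="spark_default",
    )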

5. Data Warehousing
For the Data Warehouse, Hive on Minio will be used, and the file format will be Parquet. Hive will act as the metastore: schemas for the various tables will be defined in it, and each table will point to its corresponding Minio storage location.
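
Defining such a table could look like the sketch below, issued through a Hive-enabled SparkSession; the database, table, and bucket names are assumptions:

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("warehouse-ddl")
         .enableHiveSupport()   # registers tables in the Hive metastore
         .getOrCreate())

spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS dw.accounts (
        account_id   STRING,
        account_name STRING,
        created_at   TIMESTAMP
    )
    STORED AS PARQUET
    LOCATION 's3a://warehouse/accounts/'
""")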

Business Intelligence
Both querying engines, i.e. Spark SQL on Hive and MariaDB ColumnStore, support JDBC. So any BI tool can connect to them using standard JDBC connections, execute analytical queries, and create various charts/graphs.
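
To illustrate what a BI tool does over such a connection, here is a sketch using the jaydebeapi Python library against the Spark Thrift Server (which speaks the HiveServer2 JDBC protocol); host, port, credentials, jar path, and table names are all assumptions:

import jaydebeapi

# Connect the way any JDBC client would; the standalone Hive JDBC jar
# path and the credentials below are placeholders.
conn = jaydebeapi.connect(
    "org.apache.hive.jdbc.HiveDriver",
    "jdbc:hive2://spark-thrift:10000/default",
    ["bi_user", "secret"],
    "/drivers/hive-jdbc-standalone.jar",
)
cur = conn.cursor()
# An analytical query of the joining/grouping/sorting kind the dashboards run.
cur.execute("""
    SELECT a.account_name, COUNT(*) AS opportunities
    FROM dw.opportunities o
    JOIN dw.accounts a ON o.account_id = a.account_id
    GROUP BY a.account_name
    ORDER BY opportunities DESC
""")
print(cur.fetchall())
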
About the recruiter
Member since Sep 13, 2017
Lori Layne
from Alabama, United States

Skills & Expertise Required

Apache Hive Apache Kafka Apache Spark Backend Rest API Minio 

Candidate shortlisted and hired
Hiring open till - Jun 23, 2019

Work from Anywhere

40 hrs / week

Fixed Type

Remote Job

Cost: $13,910.54

