Test Automation for Website and Mobile and SoapUI For API

By Rajesh B.

Software Test Automation Engineer (QA)
India
Cost: $103.45
Delivery: 7 days

Description


Hi,

I am a freelancer with more than 1.5 years of experience using the latest tools in the software testing domain. I work as an SDET (Software Development Engineer in Test), also known as a test automation engineer.

I have gained solid testing experience in trending domains such as e-commerce, ERP (Enterprise Resource Planning), and digital marketing, and I work well with these types of websites.

Work Experience:

I have been working as a software automation test engineer for more than a couple of years, and continue to do so today. I have strong product knowledge across domains that are trending now, such as ERP, e-commerce, and digital marketing.




I work closely with Selenium WebDriver, a much-loved open-source tool for test automation engineers like me.

In test automation there are different framework approaches to follow. I have mostly used a data-driven framework on projects, as suggested by the client, and I have also developed scripts using a hybrid framework (module-driven with TestNG).
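A minimal sketch of what a data-driven TestNG test can look like (the class name, URL, locators, and test data below are illustrative placeholders, not from a real project):

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.*;

public class LoginDataDrivenTest {
    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        driver = new ChromeDriver();   // browser under test
    }

    // Each row is one data set; in a real project the rows usually come from Excel or a CSV file.
    @DataProvider(name = "loginData")
    public Object[][] loginData() {
        return new Object[][] {
            {"user1", "correctPass", true},
            {"user2", "wrongPass", false},
        };
    }

    @Test(dataProvider = "loginData")
    public void loginTest(String username, String password, boolean shouldSucceed) {
        driver.get("https://example.com/login");   // placeholder URL
        driver.findElement(By.id("username")).sendKeys(username);
        driver.findElement(By.id("password")).sendKeys(password);
        driver.findElement(By.id("submit")).click();
        boolean dashboardShown = !driver.findElements(By.id("dashboard")).isEmpty();
        Assert.assertEquals(dashboardShown, shouldSucceed);
    }

    @AfterClass
    public void tearDown() {
        driver.quit();
    }
}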


Recently I started working with a Behaviour-Driven Development (BDD) framework using Cucumber in a Maven project. Compared with the earlier frameworks it is very nice, especially because clients can read and understand the scripts easily, so it is a good, trending choice for future implementations.
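A minimal Cucumber (Cucumber-JVM) sketch; the Gherkin scenario that a client would read is shown as a comment above the Java step definitions, and the URL and locators are placeholders:

// Scenario from a .feature file, readable by non-technical clients:
//   Scenario: Successful login
//     Given the user is on the login page
//     When the user logs in with valid credentials
//     Then the dashboard is displayed

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;

public class LoginSteps {
    private final WebDriver driver = new ChromeDriver();

    @Given("the user is on the login page")
    public void userIsOnLoginPage() {
        driver.get("https://example.com/login");   // placeholder URL
    }

    @When("the user logs in with valid credentials")
    public void userLogsIn() {
        driver.findElement(By.id("username")).sendKeys("demoUser");
        driver.findElement(By.id("password")).sendKeys("demoPass");
        driver.findElement(By.id("submit")).click();
    }

    @Then("the dashboard is displayed")
    public void dashboardIsDisplayed() {
        Assert.assertFalse(driver.findElements(By.id("dashboard")).isEmpty());
        driver.quit();
    }
}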



I can also provide web-service testing with Groovy and SoapUI for both SOAP-based and REST-based services, driven by WSDL (Web Services Description Language) and WADL (Web Application Description Language) service URLs.
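To illustrate the kind of check a SoapUI test step (or its Groovy assertion) performs, here is a minimal sketch in Java of calling a REST endpoint and validating the response; the endpoint URL and expected field are placeholders, and the same assertion style applies to SOAP responses built from a WSDL:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import org.testng.Assert;
import org.testng.annotations.Test;

public class RestServiceTest {

    @Test
    public void statusAndBodyCheck() throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/orders/123"))   // placeholder REST URL
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        Assert.assertEquals(response.statusCode(), 200);              // service responded successfully
        Assert.assertTrue(response.body().contains("\"orderId\""));   // expected field present (placeholder)
    }
}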

While developing automation scripts I use different tools and plugins to automate any type of website, whether built with HTML, Java, .NET, PHP, Ajax, Flux, or non-HTML pages, as well as Windows-based applications (1-tier, 2-, 3-, and n-tier apps). To automate these kinds of websites I mostly work with Selenium, Java Robot, Sikuli, the Actions class, JavaScript, Maven, TestNG, Cucumber, Java OOP, keyboard functions, SoapUI, Groovy, and so on.

I have good experience using Appium for mobile .apk automation testing. I can create virtual Android devices to automate your .apk (mobile application) on the latest API levels (recent Android versions such as 7.0, 6.0, 5.0, and so on), and I can run the same tests on a real device using the Appium server and UI Automator Viewer. I also have good experience launching the application between computer, mobile device, and server using commands: I can install your .apk onto the device from the server and perform the testing with the help of UI Automator Viewer.
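A minimal sketch of starting an Appium session against an Android emulator or real device (the device name, .apk path, and element id are placeholders; the Appium server is assumed to be running locally):

import io.appium.java_client.android.AndroidDriver;
import java.net.URL;
import org.openqa.selenium.By;
import org.openqa.selenium.remote.DesiredCapabilities;

public class ApkSmokeTest {
    public static void main(String[] args) throws Exception {
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability("platformName", "Android");
        caps.setCapability("deviceName", "emulator-5554");    // virtual (AVD) or real device
        caps.setCapability("automationName", "UiAutomator2");
        caps.setCapability("app", "/path/to/app.apk");        // placeholder path; Appium installs the .apk on the device

        // Connect to the locally running Appium server
        AndroidDriver driver = new AndroidDriver(new URL("http://127.0.0.1:4723/wd/hub"), caps);

        // Element ids are found with UI Automator Viewer or Appium Inspector
        driver.findElement(By.id("com.example.app:id/login_button")).click();   // placeholder element id

        driver.quit();
    }
}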

I can also provide detailed log reports for both website testing and .apk testing.


Work Style:


Based on the website and the client's needs, I will create test suites that cover all the functionality, with successive steps in each suite. Using Selenium, Java, Sikuli, and similar tools, I implement the scripts by following, step by step, each action performed in the system.

Each test can be run individually, module by module, or I can write a build file so that the entire project runs with a single click. Here I follow the client's suggestions and needs and take the specific actions required to cover all the functionality of the system.
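As an illustration of a single entry point that runs the whole project, here is a minimal sketch of invoking TestNG programmatically; the test classes named here are placeholders, and in practice the run is usually driven by a testng.xml suite file or a Maven/Jenkins build:

import org.testng.TestNG;

public class RunAllSuites {
    public static void main(String[] args) {
        TestNG testng = new TestNG();
        // Register every module's test class so the entire project runs in one go
        testng.setTestClasses(new Class[] { LoginDataDrivenTest.class, RestServiceTest.class });
        testng.run();
    }
}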

All the developed scripts can be re-run for retesting and regression testing, and they are easy to update when new enhancements arrive or changes are needed in the existing scripts; they also remain easy for a newcomer to understand.

Preferred and well-versed technologies:

Eclipse (Java), IntelliJ IDEA
Selenium WebDriver, Sikuli, Cucumber (BDD)
TestNG, Maven, Jenkins, Appium, SoapUI, Groovy



- Quality Testing
- Layout and Visual Design
- Navigation & Information Architecture (Flow of Project)
- Writing & Content Quality

- Output log files sent through email
- Error logs
- Detailed documents for work effort and tracking details


Feel free to contact me with any questions before buying an Hourlie.

What does the seller need from the buyer to get started?

If Manual Testing:
1. The URL of the website
2. Requirements document
3. Test cases and sample test data, if the client has them

If Automation Testing:
1. The URL of the website
2. Test cases or a document to clarify the script flow and cover all functionality
3. Client suggestions on the framework process to follow, if the client has any

Web Service / API / SOA:

1. WSDL or WADL URL (SOAP or REST)
2. A functionality document so I know what to test

Mobile App:
1. App package details
2. Device OS name and Version
3. Test Environment details



Articles Related To Selenium


The importance of extracting information from the web is becoming increasingly loud and clear. Every few weeks, I find myself in a situation where we need to extract information from the web to build a machine learning model. We have to pull a large amount of information from websites, and we would like to do it as quickly as possible. How would we do it without manually going to every website and collecting the data? Web scraping simply makes this job easier and faster.

Why is web scraping needed?

Web scraping is used to collect large amounts of information from websites. But why would someone need to collect such large amounts of data from websites? Let's look at the applications of web scraping:

  1. Price Comparison: Services such as ParseHub use web scraping to collect data from online shopping websites and use it to compare the prices of products.
  2. Social Media Scraping: Web scraping is used to collect data from Social Media websites such as Twitter to find out what’s trending.
  3. Email address gathering: Many companies that use email as a medium for marketing use web scraping to collect email IDs and then send bulk emails.
  4. Research and Development: Web scraping is used to collect a large set of data (Statistics, General Information, Temperature, etc.) from websites, which are analyzed and used to carry out Surveys or for R&D.
  5. Job listings: Details regarding job openings and interviews are collected from different websites and then listed in one place so that they are easily accessible to the user.

 

Web scraping is an automated method used to extract large amounts of data from websites. The data on websites is unstructured. Web scraping helps collect this unstructured data and store it in a structured form. There are different ways to scrape websites, such as online services, APIs, or writing your own code.

Why Python is best for Web Scraping

Features of Python which makes it more suitable for web scraping:

  1. Ease of Use: Python is simple to code. You do not have to add semi-colons “;” or curly-braces “{}” anywhere. This makes it less messy and easy to use.
  2. Large Collection of Libraries: Python has a huge collection of libraries such as NumPy, Matplotlib, Pandas, etc., which provide methods and services for various purposes. Hence, it is suitable for web scraping and for further manipulation of the extracted data.
  3. Dynamically typed: In Python, you don’t have to define datatypes for variables, you can directly use the variables wherever required. This saves time and makes your job faster.
  4. Easily Understandable Syntax: Python syntax is easily understandable mainly because reading a Python code is very similar to reading a statement in English. It is expressive and easily readable, and the indentation used in Python also helps the user to differentiate between different scope/blocks in the code.
  5. Small code, large task: Web scraping is used to save time. But what's the use if you spend more time writing the code? Well, you don't have to. In Python, you can write a small amount of code to do large tasks. Hence, you save time even while writing the code.
  6. Community: What if you get stuck while writing the code? You don't have to worry. Python has one of the biggest and most active communities, where you can seek help.

How does web scraping work

To extract data using web scraping with Python, you need to follow these basic steps:

  1. Find the URL that you want to scrape
  2. Inspecting the Page
  3. Find the data you want to extract
  4. Write the code
  5. Run the code and extract the data
  6. Store the data in the required format

Example: Scraping a website to get product details

Pre-requisite:

  • Python 2.x or Python 3.x
  • Selenium Library
  • BeautifulSoup Library
  • Pandas Library
  1. We are going to scrape an online shopping website to extract the price, name, and rating of products; go to the products URL.
  2. The data is usually nested in tags. So, we inspect the page to examine, under which tag the information we would like to scrape is nested. To inspect the page, just right click on the element and click on “Inspect”. When you click on the “Inspect” tab, you will see a “Browser Inspector Box” open.
  3. Let's extract the Price, Name, and Rating, each of which is nested in its own "div" tag.
  4. Write code:

# Let us import all the necessary libraries
from selenium import webdriver
from bs4 import BeautifulSoup   # BeautifulSoup is imported from the bs4 package
import pandas as pd

driver = webdriver.Chrome("/usr/lib/chromium-browser/chromedriver")

products = []  # List to store the name of the product
prices = []    # List to store the price of the product
ratings = []   # List to store the rating of the product

driver.get("Product_URL")  # URL of the products page you want to scrape

content = driver.page_source
soup = BeautifulSoup(content, "html.parser")

# The class names below ('…') are placeholders; use the ones you found while inspecting the page
for a in soup.findAll('a', href=True, attrs={'class': '…'}):
    name = a.find('div', attrs={'class': '…'})
    price = a.find('div', attrs={'class': '…'})
    rating = a.find('div', attrs={'class': '…'})
    products.append(name.text)
    prices.append(price.text)    # without this the three lists would have different lengths
    ratings.append(rating.text)

driver.quit()

df = pd.DataFrame({'Product Name': products, 'Price': prices, 'Rating': ratings})
df.to_csv('products.csv', index=False, encoding='utf-8')

 

When you run the code, a file named "products.csv" is created; this file contains the extracted data.
