Recently I came across a tool that takes care of many of the issues you usually face while scraping websites. The tool is called Scraper API, and it provides an easy-to-use REST API to scrape different kinds of websites (simple, JS-enabled, CAPTCHA-protected, etc.) with ease. Before I proceed further, allow me to introduce Scraper API.
What is Scraper API?
If you visit their website, you'll find their mission statement:
Scraper API handles proxies, browsers, and CAPTCHAs, so you can get the HTML from any web page with a simple API call!
As the statement suggests, it offers everything you need to deal with the issues you usually come across while writing your own scrapers.
Scraper API provides a REST API that can be consumed in any language. Since this post is about Python, I'll mainly be using the requests library to consume it.
You must first sign up with them; in return, they will provide you an API key to use their platform. They offer 1,000 free API calls, which is enough to test the platform. Beyond that, they offer different plans, from starter to enterprise, which you can view here.
Let's try a simple example, which is also given in the documentation.
API_KEY='<YOUR API KEY>'
Assuming you are registered, you can find your API key on the dashboard and start working right away. When you run the program, it shows the IP address of your request.
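A minimal sketch of such a program, assuming the api.scraperapi.com endpoint with api_key and url query parameters (as described in their documentation), and using http://httpbin.org/ip as a convenient target that echoes back the caller's IP:

```python
from urllib.parse import urlencode

API_KEY = "<YOUR API KEY>"  # from your Scraper API dashboard


def build_request_url(api_key, target_url):
    """Build the Scraper API endpoint URL with the required query parameters."""
    payload = {"api_key": api_key, "url": target_url}
    return "http://api.scraperapi.com/?" + urlencode(payload)


if __name__ == "__main__":
    import requests

    # httpbin.org/ip echoes back the IP the request came from, so this
    # prints the proxy IP that Scraper API routed the request through.
    r = requests.get(build_request_url(API_KEY, "http://httpbin.org/ip"))
    print(r.text)
```

The actual network call is kept behind the `__main__` guard, so you can reuse `build_request_url` anywhere you need the proxied URL.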
Notice that it returns a new IP address every time. Cool, isn't it?
There are some scenarios where you'd like to keep the same proxy, to give the impression that a single user is visiting different parts of the website. For that, you can pass a session_number parameter in the payload variable above.
Run it a few times and you'll notice the same proxy IP in every response.
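A sketch of the same call with session_number added to the payload (the endpoint and parameter names are taken from their documentation; the session number 123 is an arbitrary choice):

```python
from urllib.parse import urlencode

API_KEY = "<YOUR API KEY>"


def build_session_url(api_key, target_url, session_number):
    """Build a Scraper API URL that pins all requests with the same
    session_number to the same proxy IP."""
    payload = {
        "api_key": api_key,
        "url": target_url,
        "session_number": session_number,
    }
    return "http://api.scraperapi.com/?" + urlencode(payload)


if __name__ == "__main__":
    import requests

    # All three requests share session_number=123, so the echoed IP
    # should stay the same across them.
    for _ in range(3):
        r = requests.get(build_session_url(API_KEY, "http://httpbin.org/ip", 123))
        print(r.text)
```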
Creating an OLX Scraper
Like previous scraping-related posts, I am going to pick OLX again. The idea is to iterate over the listing page first and then scrape the individual items.
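A hedged sketch of how this could look when every page is fetched through Scraper API and parsed with BeautifulSoup. The OLX listing URL and the CSS selectors below are placeholders, not taken from the original post; inspect the real markup and adjust them:

```python
import requests
from bs4 import BeautifulSoup

API_KEY = "<YOUR API KEY>"
ENDPOINT = "http://api.scraperapi.com/"


def fetch(url):
    """Fetch a page through Scraper API so proxies and CAPTCHAs are handled."""
    return requests.get(ENDPOINT, params={"api_key": API_KEY, "url": url}).text


def extract_price(html):
    """Pull the price out of an item page.
    NOTE: 'span.price' is a placeholder selector -- check the actual
    OLX markup and change it accordingly."""
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.select_one("span.price")
    return tag.get_text(strip=True) if tag else None


if __name__ == "__main__":
    # Hypothetical flow: grab the listing page, then scrape a few items.
    listing_html = fetch("https://www.olx.com.pk/mobile-phones/")
    soup = BeautifulSoup(listing_html, "html.parser")
    for link in soup.select("a.item-link")[:5]:  # placeholder selector
        print(extract_price(fetch(link["href"])))
```

Keeping the parsing in a separate `extract_price` function makes it easy to test against saved HTML without hitting the network.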
I am using BeautifulSoup to parse the HTML. I have only extracted the price here, because the purpose is to demonstrate the API itself rather than BeautifulSoup. You should see my post here in case you are new to scraping with Python.
Oh, and if you sign up here with my referral link or enter the promo code adnan10, you will get a 10% discount. If you don't get the discount, just let me know via email on my site and I'll be sure to help you out.
In the coming days, I'll be writing more posts about Scraper API, covering further features.
I am planning to write a book about web scraping in Python. Click here to give your feedback.