Web Scraping API Introduction
The Decodo Web Scraping API provides fully managed web data extraction, handling proxies, browsers, CAPTCHAs, and anti-bot mechanisms so you can focus on using the data.
Documentation Index
Fetch the complete documentation index at: https://help.decodo.com/llms.txt
Use this file to discover all available pages before exploring further.
Pricing Structure
Web Scraping API offers flexible pricing depending on your scraping volume and needs. The price of a scraping job is affected by two parameter selections:
- Proxy pool. In the Universal scraper, you can choose between the Standard and Premium proxy pools. Standard is suitable for simple web pages, so requests are cost-optimized, while Premium can bypass advanced anti-bot protections. Keep in mind that Target Templates use Premium proxies by default.
- JavaScript rendering. In the Universal scraper and most Target Templates, you can choose whether JS should be rendered with the request. Enabling rendering increases the request cost, since it consumes additional resources.
Each Self Service plan offers different rates per 1K successful requests. The specific rate depends on the combination of proxy pool and JS rendering used:
| Proxy Pool | JS rendering | Description |
|---|---|---|
| Standard | - | Designed for extraction from simple static web pages with low to moderate security. Available for the Universal scraper, with 8 geo locations. |
| Standard | headless parameter set to html or png | Designed for scraping websites that load content dynamically via JavaScript, where advanced anti-bot masking is not required. Available for the Universal scraper, with 8 geo locations. |
| Premium | - | Designed for extracting data from websites with advanced anti-bot protection; handles complex security challenges to ensure consistent access to protected static content. Used for all dedicated Target Templates and available for the Universal scraper. |
| Premium | headless parameter set to html or png | Designed for extracting data from websites with advanced anti-bot protection; handles complex security challenges to ensure consistent access to protected dynamic content. Available for all dedicated Target Templates and the Universal scraper. |
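The two pricing parameters above can be sketched as fields in a job payload. This is a minimal illustration only: the field names (`url`, `headless`, and the hypothetical `premium_proxy`) are assumptions, not the documented API; check the dashboard Request Examples for the exact names your plan uses.

```python
def build_payload(url: str, premium: bool = False, render_js: bool = False) -> dict:
    """Assemble a scraping-job payload for one of the four pricing tiers.

    Field names here are illustrative assumptions, not the official schema.
    """
    payload = {"url": url}
    if premium:
        # Premium pool: bypasses advanced anti-bot protection (higher rate)
        payload["premium_proxy"] = True
    if render_js:
        # Enable JS rendering; "html" or "png" selects the output type
        payload["headless"] = "html"
    return payload

# Cheapest tier: Standard pool, no JS rendering
basic = build_payload("https://example.com")
# Most capable tier: Premium pool with JS rendering
full = build_payload("https://example.com", premium=True, render_js=True)
```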
API Playground
Not sure which parameters suit your scraping needs? Try for free before you commit! Validate your parameters and preview real-time results using our Scraper API Playground.
- The API Playground is limited to 10 requests per day.
How it works
Start collecting data with the easy-to-use dashboard scraping solution:
- Choose a target – select the website or data source you need to scrape from our extensive templates list, or choose the universal Web target for any others, then type your URL.
- Specify your parameters – choose your JS rendering and parsing options and customize request parameters, such as language and location.
- Use the built-in scraper – ready-made scraper handles proxies, browsers, and anti-bot bypassing automatically.
- Receive clean data – get structured output in HTML or JSON format, ready for analysis.
- You can copy cURL, Python, and Node examples straight from the dashboard Request Examples for easy integration.
- Explore your usage statistics in the Dashboard: see traffic breakdowns by domain, success rates, and hourly, daily, and monthly insights.
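The steps above boil down to one authenticated POST per scraping job. The sketch below prepares such a request using only the standard library; the endpoint URL and body field names are assumptions for illustration, so for a real integration copy the exact cURL/Python snippet from the dashboard Request Examples.

```python
import base64
import json
import urllib.request

# Hypothetical endpoint for illustration; use the URL from your dashboard.
API_ENDPOINT = "https://scrape.example-decodo-endpoint.com/v2/scrape"


def build_request(url: str, user: str, password: str,
                  headless: str = "") -> urllib.request.Request:
    """Prepare (but do not send) a scraping-job request with Basic auth."""
    body = {"url": url}
    if headless:
        # "html" or "png" enables JS rendering (assumed field name)
        body["headless"] = headless
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        API_ENDPOINT,
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )


req = build_request("https://example.com", "USER", "PASS", headless="html")
# Sending is left to the caller, e.g.:
# with urllib.request.urlopen(req) as resp:
#     data = json.loads(resp.read())
```

The response body would then contain the scraped page as HTML or parsed JSON, depending on the options chosen in step 2.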
We Do Not Support Scraping Post-login Data
Scraping data after logging in often involves accessing personal or sensitive information, which many websites explicitly forbid in their terms of service. To help you stay compliant with your targeted sites and avoid potential legal issues, we focus on providing tools for scraping only publicly available data.
Support
Need help or just want to say hello? Our support is available 24/7.
You can also reach us anytime via email at support@decodo.com.
Feedback
Can’t find what you’re looking for? Request an article!
Have feedback? Share your thoughts on how we can improve.