Here are the features of Netpeak Spider.
The 'Windows' and 'macOS' columns indicate the version of the program in which each feature first appeared.
'Declined' refers to features that are no longer supported, while 'planned' denotes features to be added in the future.
A complete list of changes is available in the changelog.
Application automatic update
We regularly update our products, and installing updates is a breeze. Launch Netpeak Spider and it will automatically update to the latest version. You don’t need to download anything from the website or reinstall.
Use the app on two devices
One user can connect two devices to the account. This is convenient if you use two machines: one at home and one at work, or a PC and a laptop.
Netpeak Spider supports English, Ukrainian, and Russian.
Change parameters during crawling
Turn certain parameters on and off before and during crawling. This allows you to reduce crawling time and the size of the project.
The parameters can also be switched off while analyzing the data to leave only the data you need in the table.
Check 100+ technical issues
Netpeak Spider is first and foremost a tool for in-depth technical audits: check for broken links, duplicate pages and meta tags, and canonical and pagination issues. These and dozens of other issues are highlighted by color in the sidebar, as well as in the main table. The program stores complete documentation on every issue: its description, the threat it presents, how to fix it, and even a list of useful links.
Enhanced issue description
In the ‘Info’ panel at the bottom, you will find comprehensive documentation on each issue: its description, the threat it poses, how to fix it, and some useful links.
The ‘Site structure’ tab contains the full tree structure of a website, allowing you to view the pages in any directory.
The dashboard is presented in two forms depending on the crawling stage:
Internal Database for Working with Huge Number of URLs
Netpeak Spider works with an internal database, which is especially useful when you need to analyze millions of URLs. Right in the program, you can open any parameter from the database (incoming / outgoing / internal / external links, images, H1-H6 headings, canonical tags, redirects, issues) for a particular URL, for selected or filtered URLs, or for all results in the current table. It can also be done literally in one click via the context menu.
Crawl website from initial URL
Crawl a list of URLs
Crawl websites using basic authentication
Quick search allows finding a specific URL or parameter value in program tables.
You can set any filtering rules and combine them using the ‘AND’ and ‘OR’ operators. By default, all reports in the program are built on top of filtering (the ‘Dashboard’ and other reports in the sidebar), so don’t hesitate to use filters and create segments based on them for further data analysis.
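The idea of combining filter rules with ‘AND’ / ‘OR’ can be sketched in a few lines of Python. This is a hypothetical illustration of the concept, not Netpeak Spider's actual filtering engine; the field names and rows are made up.

```python
# A minimal sketch of combining filter rules with 'AND' / 'OR' into a
# single predicate, applied to rows of crawl data (hypothetical fields).

def make_filter(rules, mode="AND"):
    """rules: list of (field, predicate) pairs; mode: 'AND' or 'OR'."""
    combine = all if mode == "AND" else any

    def matches(row):
        return combine(pred(row.get(field)) for field, pred in rules)

    return matches

pages = [
    {"url": "/a", "status": 200, "title_len": 70},
    {"url": "/b", "status": 404, "title_len": 15},
]

# Keep pages that returned 200 AND have a title longer than 60 characters.
f = make_filter([("status", lambda v: v == 200),
                 ("title_len", lambda v: v > 60)], mode="AND")
filtered = [p["url"] for p in pages if f(p)]
```

Switching `mode` to `"OR"` would keep any row that satisfies at least one rule, which is exactly how the two operators differ.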
When crawling is finished, you can change the data view in the program, limiting it to a particular segment (similar to segmentation in Google Analytics). Set a segment manually or use any available filter as a segment (use the corresponding button above the main table).
Segmentation is a powerful feature that has many uses in the program and helps you analyze data from a completely different angle. Keep in mind that using segments affects all reports in the program, including ‘Issues’, ‘Overview’, ‘Site structure’, ‘Dashboard’, etc.
Data saving and copying
The program backs up the collected data automatically. This is useful when there is a risk of a sudden computer shutdown and data loss. Additionally, after crawling is over, you can save the project to have quick access to the crawled data, share the file with colleagues, and even open the list of URLs in Netpeak Checker (yup, our tools are cross-integrated, duh).
Use the good ol' Ctrl+C hotkeys, or the 'Copy' option in the context menu. Oh, and make sure to try the extended copy button in the sidebar: in just one click you can copy the contents of the 'Issues', 'Overview', 'Site structure', and 'Scraping' tabs to the clipboard.
This is an advanced feature for professionals with numerous projects: using it, you can open the program in additional windows and work in each of them simultaneously.
Custom website scraping and data extraction
Custom search of source code / text using four search types: ‘Contains’, ‘RegExp’, ‘CSS Selector’, or ‘XPath’.
It allows you to scrape websites' source code and extract data from web pages.
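To make the search types concrete, here is a small stdlib-only sketch of three of them applied to a made-up snippet of markup. CSS selectors (and robust XPath over real-world HTML) need a third-party parser such as lxml, so they are only noted in the comments; the class names and values below are illustrative, not from the program.

```python
import re
import xml.etree.ElementTree as ET

html = """<html><body>
  <h1>Product page</h1>
  <span class="price">19.99</span>
</body></html>"""

# 'Contains': a plain substring search in the source code.
contains_hit = "price" in html

# 'RegExp': extract the price with a regular expression.
price = re.search(r'class="price">([\d.]+)<', html).group(1)

# 'XPath': this sample markup happens to be well-formed XML, so
# ElementTree's limited XPath support is enough; real-world HTML
# usually needs lxml, which also adds CSS selector support.
root = ET.fromstring(html)
heading = root.find(".//h1").text
```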
Copy / paste scraping parameters
The 'Copy parameters' and 'Paste parameters' buttons on the 'Scraping' tab in the crawling settings allow you to export or import scraping settings in JSON format to / from the clipboard.
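The round trip through the clipboard amounts to serializing the settings to JSON and parsing them back. The structure below is hypothetical; the real JSON produced by the 'Copy parameters' button may look different, so this only illustrates the export / import idea.

```python
import json

# Hypothetical settings structure -- not the program's actual schema.
settings = {
    "scraping": [
        {"name": "Price", "type": "CSS Selector", "expression": ".price"},
        {"name": "SKU", "type": "RegExp", "expression": r"sku:(\w+)"},
    ]
}

exported = json.dumps(settings, indent=2)   # what would go to the clipboard
imported = json.loads(exported)             # what 'Paste parameters' restores
```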
Internal PageRank calculation
The internal PageRank calculation tool will help you get deep insights into your project:
While creating this tool, we used official Google documentation and patents.
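The core of PageRank from those documents is a simple iterative computation: each page's rank is a damped share of the ranks of the pages linking to it. The sketch below is a textbook power-iteration version over a tiny made-up link graph; Netpeak Spider's exact implementation details are not public, so treat this as an illustration of the formula, not the tool itself.

```python
# Power-iteration PageRank over an internal link graph (classic formula
# with damping factor 0.85, as in the original Google paper).

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += share
        rank = new
    return rank

graph = {"/": ["/about", "/blog"], "/about": ["/"], "/blog": ["/", "/about"]}
ranks = pagerank(graph)
```

In this toy graph the homepage collects the most link weight, which is the kind of insight the tool surfaces: pages whose internal linking gives them disproportionately little (or much) rank.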
Source code and HTTP headers analysis
You don’t have to open a page in a browser to quickly analyze its source code – just use the built-in tool in Netpeak Spider. It lets you check HTTP request and response headers, redirect data, GET parameters in the URL, and the extracted text without the HTML code, which is especially useful for text analysis.
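"Extracted text without the HTML code" boils down to walking the markup and keeping only the text nodes, skipping script and style content. Here is a minimal stdlib sketch of that idea; a production crawler would also handle malformed markup, entities, and whitespace far more carefully.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> content."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

parser = TextExtractor()
parser.feed("<html><body><script>var x=1;</script>"
            "<h1>Hello</h1><p>World</p></body></html>")
text = " ".join(parser.chunks)
```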
Sitemap generation (XML, Image, HTML)
XML Sitemap validation
The validator in Netpeak Spider finds more than 30 issues in Sitemap files and Sitemap index files. The tool automatically detects Sitemap index files, unzips .gz archives, and helps you use sitemaps for your projects to the fullest extent.
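Two of those validator behaviors – unzipping a .gz sitemap and checking that every `<url>` entry carries a valid `<loc>` – can be sketched with the standard library. The sitemap content below is made up, and a real validator covers 30+ checks; this shows only the mechanics.

```python
import gzip
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_xml = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>ftp://example.com/bad</loc></url>
</urlset>"""

# Sitemaps are often delivered gzipped; decompress before parsing.
gzipped = gzip.compress(sitemap_xml)
raw = gzip.decompress(gzipped)

root = ET.fromstring(raw)
issues = []
for url in root.findall("sm:url", NS):
    loc = url.findtext("sm:loc", default="", namespaces=NS)
    if not loc.startswith(("http://", "https://")):
        issues.append(loc)  # flag non-HTTP(S) <loc> values
```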
Validation of Image Sitemap files
SEO Audit Data Enrichment
Upload your data from Netpeak Checker or other services, merge and compare them to carry out more complex SEO audits. You can work with imported parameters just like you do with the main ones: filter, segment and analyze them.
Set up simultaneous crawling of multiple domains. The algorithm is simple: upload the list of domains, run simultaneous crawling, and analyze parameters and issues in a single report.
Custom HTTP request headers
Set up custom HTTP request headers on the 'HTTP headers' tab of the program settings.
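What such a setting does under the hood is attach extra headers to every request the crawler sends. A minimal sketch with the standard library, assuming a custom User-Agent and Accept-Language (the values are examples, not the program's defaults):

```python
import urllib.request

# Build a request carrying custom headers, as a crawler might when the
# 'HTTP headers' settings specify them. The request is only constructed
# here, not sent; urllib.request.urlopen(req) would perform it.
req = urllib.request.Request(
    "https://example.com/",
    headers={
        "User-Agent": "MyCrawler/1.0",
        "Accept-Language": "en-US,en;q=0.9",
    },
)
```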
Data export in Netpeak Spider is limited only by your imagination :) Data is exported in the same form as it’s shown in the results table, including sorting, grouping, and column width and order. You can also set parameters for the reports.
With a special export menu, in just one click you can export all available reports, a set of specific reports, or a particular report you need for further work – to set a task for a developer or report to your clients.
Basic technical SEO audit in PDF
In just two clicks, you can export a technical SEO audit report in PDF. It shows the key information required for a site audit – simply add your own recommendations and you can send it to your client or colleagues for further implementation.
This feature allows you to get the best of 'two worlds' in Netpeak Spider: the in-depth analysis and customization of the desktop tool, and the results visualization like in the most advanced online products.
White label reports with SEO audit
Remove Netpeak Software branded elements from the PDF report with the technical SEO audit, and add your own logo and necessary contact details in just a few clicks.
White label reports will help increase brand awareness and interest potential clients at the SEO services pre-sales stage. Also, they’ll help save time / energy and earn more for:
Google Analytics 4
Google Search Console
Import of Search Queries from Google Search Console
Export of reports to Google Drive & Sheets
Advanced Results Table
The table in Netpeak Spider is highly optimized for working with heavy data (100,000 pages and more).
These features make it special:
Table with infinite scroll
Table with pagination
Data sorting by multiple columns
‘Filter by value’ option
Opportunity to use hotkeys, including adding URLs from the clipboard to the table via Ctrl+V or Drag'n'Drop;
Check for duplicates
If your website is multilingual, check whether all hreflang attributes are set correctly.
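One of the most basic hreflang checks is collecting the `<link rel="alternate" hreflang="…">` annotations and verifying that each has both a language code and an href. Here is a stdlib-only sketch of that check over made-up markup; a full audit would also validate language codes and reciprocal links between pages.

```python
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    """Collects (hreflang, href) pairs from <link rel="alternate"> tags."""

    def __init__(self):
        super().__init__()
        self.alternates = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates.append((a["hreflang"], a.get("href")))

page = """<head>
<link rel="alternate" hreflang="en" href="https://example.com/en/">
<link rel="alternate" hreflang="uk" href="https://example.com/uk/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
</head>"""

collector = HreflangCollector()
collector.feed(page)
missing_href = [lang for lang, href in collector.alternates if not href]
```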
Validate AMP pages
The crawler flags AMP pages that do not meet the requirements of the AMP Project with a 'Bad AMP HTML format' issue, and also finds indexed AMP pages.
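Per the AMP Project spec, an AMP page declares itself with an `amp` (or ⚡) attribute on the `<html>` tag, so detection starts from that attribute. A minimal sketch of that first step (the full format validation the crawler performs goes far beyond this):

```python
from html.parser import HTMLParser

class AmpDetector(HTMLParser):
    """Flags a page as AMP if its <html> tag carries an 'amp' attribute."""

    def __init__(self):
        super().__init__()
        self.is_amp = False

    def handle_starttag(self, tag, attrs):
        if tag == "html":
            names = {name for name, _ in attrs}
            self.is_amp = "amp" in names or "⚡" in names

amp_page = AmpDetector()
amp_page.feed('<html amp lang="en"><body>Hi</body></html>')

plain_page = AmpDetector()
plain_page.feed('<html lang="en"><body>Hi</body></html>')
```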
Detect Structured Data
With this new feature, you can detect which pages have structured data and which ones lack it even though it should be there. You can also check which structured data formats (JSON-LD, Microdata, RDFa) and types the pages contain.
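For the JSON-LD format, detection means finding `<script type="application/ld+json">` blocks and reading their `@type` values; Microdata and RDFa live in regular HTML attributes and need attribute scans instead. A stdlib sketch of the JSON-LD case over made-up markup:

```python
import json
from html.parser import HTMLParser

class JsonLdCollector(HTMLParser):
    """Parses the contents of <script type="application/ld+json"> blocks."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            self.blocks.append(json.loads(data))

page = """<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "Widget"}
</script>
</head></html>"""

c = JsonLdCollector()
c.feed(page)
types = [block["@type"] for block in c.blocks]
```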
Automatic data backup during/after the crawling
If crawling is interrupted – even if the forces of fate are against you – the next time you run the program, it will open a temporary project saved during the last backup in Netpeak Spider. You can save this project to your computer or continue crawling the website.
The program automatically backs up data:
You can check the spelling on the entire page and separately in the title, description, image alt tags, and H1-H6 headings. In short, the spell checker covers all the important places on your website.
Create templates of parameters, settings, filters and segments.