Here are the features of Netpeak Spider.
The 'Windows' and 'macOS' columns indicate the program version in which each feature first appeared.
'Declined' refers to features that are no longer supported, while 'planned' denotes features to be added in the future.
A complete list of changes is available in the changelog.
Feature | Windows | macOS |
---|---|---|
**Application automatic update.** We regularly update our products, and installing updates is a breeze: launch Netpeak Spider and it will automatically update to the latest version. You don’t need to download anything from the site or reinstall. | 2.0.0.0 | 0.1.0.0 |
**Use the app on two devices.** One user can connect two devices to the same account. This is convenient if you work on two machines: one at home and one at work, or a PC and a laptop. | 2.0.0.0 | 0.1.0.0 |
**Multilingual interface.** Netpeak Spider supports English, Ukrainian, and Russian. | 2.1.0.4 | 0.1.0.0 |
**Change parameters during crawling.** Turn certain parameters on and off before and during crawling to reduce crawling time and project size. Parameters can also be switched off while analyzing the data, to leave only the data you need in the table. | 3.0.0.0 | 0.1.0.0 |
**Check 100+ technical issues.** Netpeak Spider is first and foremost a tool for in-depth technical audits: it checks for broken links, duplicate pages and meta tags, and canonical and pagination issues. These and dozens of other issues are highlighted by color in the sidebar as well as in the main table. The program stores complete documentation on every issue: its description, the threat it presents, how to fix it, and even a list of useful links. | 2.1.0.0 | 0.1.0.0 |
**Enhanced issue description.** In the ‘Info’ panel at the bottom, you will find detailed documentation on each issue: its description, the threat it presents, how to fix it, and even some useful links. | 3.2.0.0 | 0.1.0.0 |
**Site structure.** The ‘Site structure’ tab contains the full tree structure of a website and allows you to view the pages in any directory. | 3.0.0.0 | 0.1.0.0 |
**Dashboard.** The dashboard is presented in two forms, depending on the crawling stage. | 3.0.0.0 | planned |
**Internal database for working with a huge number of URLs.** Netpeak Spider works with an internal database, which is especially useful when you need to analyze millions of URLs. Right in the program, you can open any parameter from the database (incoming / outgoing / internal / external links, images, H1-H6 headings, canonical tags, redirects, issues) for a particular URL, for selected or filtered URLs, or for all results in the current table. This can be done in literally one click via the context menu. | 3.0.0.0 | 0.1.0.0 |
**Crawl a website from an initial URL** | 2.1.0.0 | 0.1.0.0 |
**Crawl a list of URLs** | 2.1.0.8 | 0.1.0.0 |
**Virtual robots.txt** | 3.0.0.0 | 0.1.0.0 |
**Crawl websites using basic authentication** | 2.1.4.0 | 0.1.0.0 |
**JavaScript rendering.** This feature comes in handy when crawling websites that rely on JS scripts (meaning almost all sites on the Internet). If you crawl such a site without JS rendering, the crawler will be unable to detect data added with the help of JavaScript (links, descriptions, images, etc.), and therefore won't be able to analyze the pages correctly. To execute JavaScript when crawling client-side rendered sites, Netpeak Spider uses one of the latest versions of Chromium, which makes crawling as advanced and close to Googlebot as possible (see the rendering sketch after the table). | 3.2.0.0 | planned |
**Quick search.** Quick search allows you to find a specific URL or parameter value in the program tables. | 3.0.0.0 | 0.1.0.0 |
**Data filtering.** You can set any filtering rules and combine them using ‘AND’ and ‘OR’ operators. By default, all reports in the program are built on filtering (the ‘Dashboard’ and other reports in the sidebar), so don’t hesitate to use filters and create segments based on them for further data analysis. | 3.0.0.0 | 0.1.0.0 |
**Data segmentation.** When crawling is finished, you can change the data view in the program by limiting it to a particular segment (similar to segmentation in Google Analytics). Set a segment manually or use any available filter as a segment (via the corresponding button above the main table). Segmentation is a powerful feature with many uses in the program, and it helps you analyze data from a completely different angle. Keep in mind that segments affect all reports in the program, including ‘Issues’, ‘Overview’, ‘Site structure’, ‘Dashboard’, etc. | 3.0.0.0 | 0.1.0.0 |
**Data saving and copying.** The program backs up the collected data automatically, which is useful when there is a risk of a sudden computer shutdown and data loss. After crawling is over, you can save the project to have quick access to the crawled data, share the file with colleagues, and even open the list of URLs in Netpeak Checker (yup, our tools are cross-integrated). Use the good ol' Ctrl+C hotkey or the 'Copy' option in the context menu. Also try the extended copy button in the sidebar: in just one click, it copies the contents of the 'Issues', 'Overview', 'Site structure', and 'Scraping' tabs to the clipboard. | 2.0.0.0 | 0.1.0.0 |
**Multi-window mode.** This is an advanced feature for professionals with numerous projects: you can open the program in additional windows and work in each of them simultaneously. | 2.0.0.0 | planned |
**Custom website scraping and data extraction.** Custom search of source code / text using four search types: ‘Contains’, ‘RegExp’, ‘CSS Selector’, or ‘XPath’. It allows you to scrape websites' source code and extract data from web pages (see the extraction sketch after the table). | 2.1.1.4 | 0.1.0.0 |
**Copy / paste scraping parameters.** The 'Copy parameters' and 'Paste parameters' buttons on the 'Scraping' tab in the crawling settings let you export or import scraping settings in JSON format to / from the clipboard. | 3.9.0.0 | 0.1.0.0 |
**Internal PageRank calculation.** The internal PageRank calculation tool will help you get deep insights into the project. While creating this tool, we used official Google documentation and patents (see the PageRank sketch after the table). | 2.1.1.2 | planned |
**Source code and HTTP headers analysis.** You don’t have to open a page in a browser to quickly analyze its source code: just use the built-in tool in Netpeak Spider. It allows checking HTTP request and response headers, redirect data, GET parameters in the URL, and the extracted text without the HTML code, which is especially useful for text analysis (see the HTTP headers sketch after the table). | 2.1.1.0 | planned |
**Sitemap generation (XML, Image, HTML)** | 2.1.1.3 | planned |
**XML Sitemap validation.** The validator in Netpeak Spider finds more than 30 issues in sitemap files and sitemap index files. The tool automatically detects sitemap index files, unzips .gz archives, and helps you use sitemaps for your projects to the fullest extent (see the sitemap example after the table). | 2.1.0.7 | planned |
**Validation of Image Sitemap files** | 2.1.1.3 | planned |
**SEO audit data enrichment.** Upload your data from Netpeak Checker or other services, then merge and compare it to carry out more complex SEO audits. You can work with imported parameters just as you do with the built-in ones: filter, segment, and analyze them. | 3.11.0.1 | 0.1.0.0 |
**Multi-domain crawling.** Set up simultaneous crawling of multiple domains. The algorithm is simple: upload the list of domains, run the crawl, and analyze parameters and issues in a single report. | 3.5.0.0 | 0.1.0.0 |
**Custom HTTP request headers.** Set up custom HTTP request headers in the 'HTTP headers' tab in the program settings (see the HTTP headers sketch after the table). | 3.6.0.0 | 0.1.0.0 |
**Bulk export.** Data export in Netpeak Spider is limited only by your imagination :) Data is exported in the same form as it’s shown in the results table, including sorting, grouping, and column width and order. You can also set the parameters included in the reports. With the special export menu, in just one click you can export all available reports, a set of specific reports, or the particular report you need for further work, whether to set a task for a developer or to report to your clients. | 3.0.0.0 | 0.1.0.0 |
**Basic technical SEO audit in PDF.** In just two clicks, you can export a technical SEO audit report in PDF. It shows the key information required for a site audit: simply add your own recommendations and send it to your client or colleagues for implementation. This feature gives you the best of 'two worlds' in Netpeak Spider: the in-depth analysis and customization of a desktop tool, and results visualization like in the most advanced online products. | 3.2.0.0 | planned |
**White label reports with SEO audit.** Remove Netpeak Software branded elements from the PDF report with the technical SEO audit and add your own logo and contact details in just a few clicks. White label reports help increase brand awareness and interest potential clients at the SEO services pre-sales stage. They’ll also help you save time and effort, and earn more. | 3.4.0.0 | planned |
**Universal Analytics** | 3.3.0.0 | 0.1.0.0 |
**Google Analytics 4** | 3.12.0.0 | 0.1.0.0 |
**Google Search Console** | 3.3.0.0 | 0.1.0.0 |
**Import of search queries from Google Search Console** | 3.6.0.0 | 0.1.0.0 |
**Export of reports to Google Drive & Sheets** | 3.7.0.0 | 0.1.0.0 |
**Advanced results table.** The table in Netpeak Spider is highly optimized for working with heavy data (100,000 pages and more). The features listed in the following rows make it special: | 3.0.0.0 | 0.1.0.0 |
**Table with infinite scroll** | 3.0.0.0 | declined |
**Table with pagination** | declined | 0.1.0.0 |
**Table state** | 3.0.0.0 | planned |
**Move column** | 3.0.0.0 | 0.1.0.0 |
**Pin column** | 3.0.0.0 | planned |
**Data grouping** | 3.0.0.0 | planned |
**Data sorting** | 3.0.0.0 | 0.1.0.0 |
**Data sorting by multiple columns** | declined | 0.1.0.0 |
**Select cells** | 3.0.0.0 | declined |
**Select rows** | 3.0.0.0 | 0.1.0.0 |
**Select columns** | 3.0.0.0 | 0.1.0.0 |
**‘Filter by value’ option** | 3.1.0.0 | declined |
**Hotkeys.** Use hotkeys, including adding URLs from the clipboard to the table via Ctrl+V or with drag-and-drop. | 3.0.0.0 | planned |
**Context menu** | 3.0.0.0 | 0.1.0.0 |
**Check for duplicates** | | |
**Validate hreflangs.** If your website is multilingual, check whether all hreflang attributes are set correctly (see the hreflang example after the table). | 3.5.0.0 | 0.1.0.0 |
**Validate AMP pages.** The crawler flags AMP pages that do not meet the requirements of the AMP Project with the ‘Bad AMP HTML format’ issue, and also finds indexed AMP pages. | 2.1.4.0 | 0.1.0.0 |
**Detect structured data.** Detect which pages have structured data and which ones are missing it even though it should be there. You can also check which structured data formats (JSON-LD, Microdata, RDFa) and types the pages contain (see the JSON-LD example after the table). | 3.7.0.0 | 0.1.0.0 |
**Automatic data backup during / after crawling.** When you start a crawl, even if the forces of fate are against you, the next time you run the program it will open a temporary project saved during the last backup in Netpeak Spider. You can save this project to your computer or continue crawling the website. The program automatically backs up data during and after crawling. | 3.9.0.0 | 0.1.0.0 |
**Spell checking.** Check spelling on the entire page and separately in titles, descriptions, image alt attributes, and H1-H6 headings. In short, the spell checker covers all the important places on your website. | 3.9.0.0 | planned |
**Templates.** Create templates of parameters, settings, filters, and segments. | 3.0.0.0 | 0.1.0.0 |
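
Below are a few short, illustrative sketches for some of the features above. They show the general techniques involved, not Netpeak Spider's actual implementation; all URLs, pages, and data in them are placeholders.

JavaScript rendering. A minimal sketch of executing JavaScript with headless Chromium, here via the Playwright library, so that links added by scripts become visible to a crawler:

```python
# A minimal sketch of JS rendering with headless Chromium via Playwright
# (pip install playwright && playwright install chromium).
# Illustrative only; not Netpeak Spider's own code.
from playwright.sync_api import sync_playwright

def rendered_links(url: str) -> list[str]:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")  # let JS finish adding content
        # Collect hrefs that may only exist after JavaScript execution
        links = page.eval_on_selector_all("a[href]", "els => els.map(e => e.href)")
        browser.close()
        return links

if __name__ == "__main__":
    for link in rendered_links("https://example.com"):
        print(link)
```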
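Custom website scraping and data extraction. The four search types map onto standard extraction techniques; a sketch using the lxml library and a made-up HTML snippet:

```python
# A sketch of the four extraction types ('Contains', 'RegExp',
# 'CSS Selector', 'XPath') using lxml (pip install lxml cssselect).
import re
from lxml import html

page = """<html><body>
  <h1 class="title">Netpeak Spider</h1>
  <span class="price">$19.80</span>
</body></html>"""

tree = html.fromstring(page)

# 'Contains': plain substring search in the source code
print("Spider" in page)                                # True

# 'RegExp': extract values matching a regular expression
print(re.findall(r"\$\d+\.\d{2}", page))               # ['$19.80']

# 'CSS Selector': extract elements by CSS selector
print([e.text for e in tree.cssselect("span.price")])  # ['$19.80']

# 'XPath': extract values by XPath expression
print(tree.xpath("//h1[@class='title']/text()"))       # ['Netpeak Spider']
```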
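Internal PageRank calculation. The classic formula from Google's documentation is PR(u) = (1 - d) / N + d * Σ PR(v) / outdegree(v), with damping factor d. A minimal power-iteration sketch over a hypothetical internal link graph:

```python
# A minimal power-iteration sketch of the classic PageRank formula.
# The link graph below is hypothetical; in practice it comes from a crawl.

def pagerank(links: dict[str, list[str]], d: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iters):
        new = {}
        for p in pages:
            # Sum the rank flowing into p from every page that links to it
            inflow = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inflow
        pr = new
    return pr

links = {
    "/": ["/blog", "/pricing"],
    "/blog": ["/"],
    "/pricing": ["/", "/blog"],
}
for page, rank in sorted(pagerank(links).items(), key=lambda x: -x[1]):
    print(f"{page}: {rank:.3f}")
```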
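Custom HTTP request headers and headers analysis. A sketch of sending a custom request header and inspecting the response headers and redirect chain with the requests library:

```python
# A sketch of custom HTTP request headers plus response-header and
# redirect analysis (pip install requests). URL is a placeholder.
import requests

custom_headers = {"User-Agent": "Mozilla/5.0 (compatible; MyCrawler/1.0)"}
resp = requests.get("https://example.com", headers=custom_headers, allow_redirects=True)

print(resp.status_code)                   # final status code, e.g. 200
for hop in resp.history:                  # the redirect chain, if any
    print(hop.status_code, hop.url)
for name, value in resp.headers.items():  # HTTP response headers
    print(f"{name}: {value}")
```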
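Sitemap generation and validation. A well-formed XML sitemap follows the sitemaps.org protocol; a minimal valid file (with placeholder URLs) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```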
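Hreflang validation. Per Google's documentation, every language version of a page should list all of its alternates, typically including itself, plus an x-default fallback (placeholder URLs):

```html
<!-- hreflang annotations in the <head> of every language version -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="uk" href="https://example.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```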
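Structured data detection. JSON-LD, the most common of the three supported formats, is embedded in a page as a script block with schema.org types; a minimal example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Netpeak Spider features",
  "author": {"@type": "Organization", "name": "Netpeak Software"}
}
</script>
```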