Netpeak Spider features for Windows and macOS

Modified Fri, 22 Dec 2023 at 02:14 PM

This article lists the features of Netpeak Spider.

The ‘Windows’ and ‘macOS’ columns indicate the program version in which a given feature first appeared.

‘declined’ refers to features that are no longer supported, and ‘planned’ refers to features that will be added in the future.

The full list of changes can be found in the changelog.


Application automatic update

We regularly update our products, and installing updates is a breeze. Launch Netpeak Spider and it will automatically update to the latest version. You don’t need to download anything from the website or reinstall.

Use the app on two devices

One user can connect two devices to the account. This is convenient if you use two machines: one at home and one at work, or a PC and a laptop.

Multilingual interface

Netpeak Spider supports the English, Ukrainian, and Russian languages.

Change parameters during crawling

Turn certain parameters on and off before and during crawling. This allows you to reduce crawling time and the size of the project.

The parameters can also be switched off while analyzing the data to leave only the data you need in the table.

Check 100+ technical issues

Netpeak Spider is first and foremost a tool for an in-depth technical audit: check for broken links, duplicate pages and meta tags, canonical and pagination issues. These and dozens of other issues are highlighted by color in the sidebar, as well as in the main table. The program stores complete documentation on every issue: its description, the threat it poses, how to fix it, and even a list of useful links.

Enhanced issue description

In the ‘Info’ panel at the bottom, you will find comprehensive documentation on issues: a description, the threat, how to fix it, and even some useful links.

Site structure

The ‘Site structure’ tab contains the full tree structure of a website, allowing you to view the pages in any directory.


Dashboard

The dashboard is presented in two forms depending on the crawling stage:

  • During crawling, it shows a brief overview of the current settings. This way you can notice incorrect settings, stop the crawl, and make changes.
  • When crawling is complete, the dashboard shows graphs and charts with useful insights into crawled pages. They are interactive → click on any element and the corresponding results will be filtered.

Internal Database for Working with Huge Number of URLs

Netpeak Spider allows working with an internal database which is especially useful when you need to analyze millions of URLs. Right in the program, you can open any parameter from the database (incoming / outgoing / internal / external links, images, H1-H6 headings, canonical tags, redirects, issues) for a particular URL, selected or filtered URLs, or all results from the current table. It can also be done literally in one click via the context menu.

Crawl website from initial URL

Crawl a list of URLs

Virtual robots.txt

Crawl websites using basic authentication

JavaScript rendering

This feature comes in handy when crawling websites with JS scripts (meaning, almost all sites on the Internet). If you try crawling such a site without JS rendering, the crawler will be unable to detect data added with the help of JavaScript (links, descriptions, images, etc.), and therefore won't be able to analyze the pages correctly.

To execute JavaScript when crawling CSR sites, Netpeak Spider uses one of the latest versions of Chromium, which makes crawling as advanced and close to Googlebot as possible.
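
To illustrate why rendering matters, here is a minimal stdlib sketch (not Netpeak Spider's code): a static HTML parser only sees links present in the raw markup, while a link that JavaScript would create at runtime goes unnoticed.

```python
from html.parser import HTMLParser

# Hypothetical page: one link exists in the static HTML, another
# would only be created at runtime by the script below.
PAGE = """
<html><body>
  <a href="/static-page">Static link</a>
  <script>
    var a = document.createElement('a');
    a.href = '/js-page';
    document.body.appendChild(a);
  </script>
</body></html>
"""

class LinkExtractor(HTMLParser):
    """Collects href attributes from <a> tags in the raw HTML."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

parser = LinkExtractor()
parser.feed(PAGE)
print(parser.links)  # only the static link is found: ['/static-page']
```

Without a browser engine executing the script, '/js-page' never appears, which is exactly the data a JS-rendering crawler recovers.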

Quick search

Quick search allows finding a specific URL or parameter value in program tables.

Data filtering

You can set any filtering rules and combine them using ‘AND’ and ‘OR’ operators. By default, all reports in the program are built based on filtering (the ‘Dashboard’ and other reports in the sidebar), so don’t hesitate to use filters and create segments based on them for further data analysis.
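
Conceptually, combining filter conditions with ‘AND’ and ‘OR’ works like Python's all() and any(). The sketch below uses hypothetical page data for illustration, not the program's internals.

```python
# Hypothetical crawl results.
pages = [
    {"url": "/a", "status_code": 200, "title_length": 70},
    {"url": "/b", "status_code": 404, "title_length": 0},
    {"url": "/c", "status_code": 200, "title_length": 15},
]

# Two filter conditions to combine.
conditions = [
    lambda p: p["status_code"] == 200,
    lambda p: p["title_length"] > 60,
]

# 'AND': every condition must hold; 'OR': at least one must hold.
matches_all = [p["url"] for p in pages if all(c(p) for c in conditions)]
matches_any = [p["url"] for p in pages if any(c(p) for c in conditions)]

print(matches_all)  # ['/a']
print(matches_any)  # ['/a', '/c']
```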

Data segmentation

When crawling is finished, you can change a data view in the program limiting it to a particular segment (same as segmentation in Google Analytics). Set a segment manually or use any available filter as a segment (use the corresponding button above the main table).

Segmentation is a powerful feature that has lots of use in the program and helps analyze data from a completely different angle. Keep in mind that using segments affects all reports in the program, including ‘Issues’, ‘Overview’, ‘Site structure’, ‘Dashboard’, etc.

Data saving and copying

The program backs up the collected data automatically. This is useful when there is a risk of a sudden computer shutdown and data loss. Additionally, after the crawling is over, you can save the project to have quick access to the crawled data, share the file with colleagues, and even open the list of URLs in Netpeak Checker (yup, our tools are cross-integrated, duh).

Use the good ol’ Ctrl+C hotkey, or the ‘Copy’ option in the context menu. Oh, and make sure to try the extended copy button in the sidebar: in just one click you can copy the contents of the ‘Issues’, ‘Overview’, ‘Site structure’, and ‘Scraping’ tabs to the clipboard.

Multi-window mode

This is an advanced feature for professionals with numerous projects: using it, you can open the program in additional windows and work in each of them simultaneously. (planned)

Custom website scraping and data extraction

Custom search of the source code / text using one of four search types: ‘Contains’, ‘RegExp’, ‘CSS Selector’, or ‘XPath’.

It allows you to scrape websites' source code and extract data from web pages.
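
To illustrate what these search types do, here is a stdlib-only Python sketch on hypothetical page source. ‘CSS Selector’ is omitted because it requires a third-party library (e.g. lxml's cssselect or BeautifulSoup).

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical page source used for illustration.
HTML = """<html><body>
  <div class="price">19.99 USD</div>
  <div class="sku">SKU-12345</div>
</body></html>"""

# 'Contains': plain substring search.
found = "SKU-12345" in HTML
print(found)  # True

# 'RegExp': extract the price with a regular expression.
price = re.search(r"(\d+\.\d+)\s*USD", HTML).group(1)
print(price)  # 19.99

# 'XPath': ElementTree supports a limited XPath subset (the markup
# above happens to be well-formed XML; full XPath engines on real
# HTML are more forgiving).
root = ET.fromstring(HTML)
sku = root.find(".//div[@class='sku']").text
print(sku)  # SKU-12345
```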

Copy / paste scraping parameters

The 'Copy parameters' and 'Paste parameters' buttons on the 'Scraping' tab in the crawling settings allow you to export or import scraping settings in JSON format to / from the clipboard.

Internal PageRank calculation

The internal PageRank calculation tool will help you get deep insights into your project:

  • how link weight is distributed across the site and where it concentrates
  • what unimportant pages get extra link weight
  • what pages are 'dead ends' and simply ‘burn’ incoming link weight

While creating this tool, we used official Google documentation and patents.
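
For illustration, here is a simplified power-iteration PageRank on a hypothetical internal link graph, using the classic damping factor of 0.85; Netpeak Spider's exact implementation may differ.

```python
# Simplified internal PageRank sketch (classic formula, damping 0.85).
damping, iterations = 0.85, 50

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "dead-end": [],  # a 'dead end' page that burns incoming link weight
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(iterations):
    # Teleportation term, shared by every page.
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        if outgoing:
            # Each page splits its weight among its outgoing links.
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        else:
            # Dead ends: redistribute their weight evenly across the site.
            for target in pages:
                new_rank[target] += damping * rank[page] / len(pages)
    rank = new_rank

for page, value in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {value:.3f}")
```

The total weight stays constant (it sums to 1), so pages that attract many internal links end up with a visibly larger share than dead ends.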

Source code and HTTP headers analysis

You don’t have to open a page in a browser to quickly analyze its source code – just use the built-in tool in Netpeak Spider. It allows checking HTTP request and response headers, data on redirects, GET parameters in URL, and all extracted text without the HTML code, which is especially useful for text analysis.

Sitemap generation (XML, Image, HTML)

XML Sitemap validation

The validator in Netpeak Spider finds more than 30 issues in Sitemap files and Sitemap index files. The tool automatically detects Sitemap index files, unzips .gz archives, and helps you use sitemaps for your projects to the fullest extent.
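
As an illustration of the kind of checks such a validator performs, here is a minimal Python sketch that distinguishes a Sitemap index file from a regular sitemap and flags two common `<loc>` issues. The data is hypothetical and this is not the tool's actual rule set.

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical sitemap with one valid entry and two broken ones.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>not-an-absolute-url</loc></url>
  <url></url>
</urlset>"""

root = ET.fromstring(SITEMAP)

# A Sitemap index file has <sitemapindex> as its root element.
is_index = root.tag == f"{NS}sitemapindex"

issues = []
for url in root.iter(f"{NS}url"):
    loc = url.find(f"{NS}loc")
    if loc is None or not (loc.text or "").strip():
        issues.append("missing <loc> element")
    elif not loc.text.startswith(("http://", "https://")):
        issues.append(f"non-absolute URL: {loc.text}")

print(is_index)  # False
print(issues)    # two issues found
```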

Validation of Image Sitemap files

SEO Audit Data Enrichment

Upload your data from Netpeak Checker or other services, merge and compare them to carry out more complex SEO audits. You can work with imported parameters just like you do with the main ones: filter, segment and analyze them.


Multi-domain crawling

The ability to crawl multiple domains simultaneously. The algorithm is simple: upload the list of domains, run simultaneous crawling, and analyze parameters and issues in a single report.

Custom HTTP request headers

The ability to set up custom HTTP request headers on the 'HTTP headers' tab in the program settings.

Bulk export

Data export in Netpeak Spider is limited only by your imagination :) Data is exported in the same form as it’s shown in the results table, including sorting, grouping, column width, and order. You can also set parameters for the reports.

With a special export menu, in just one click you can export all available reports, a set of specific reports, or a particular report you need for further work – to set a task for a developer or report to your clients. 

Basic technical SEO audit in PDF

In just two clicks, you can export a technical SEO audit report in PDF. It shows the key information required for a site audit – simply add your own recommendations and you can send it to your client or colleagues for further implementation.

This feature allows you to get the best of 'two worlds' in Netpeak Spider: the in-depth analysis and customization of the desktop tool, and the results visualization like in the most advanced online products.

White label reports with SEO audit

The ability to remove Netpeak Software branded elements from the PDF report with the technical SEO audit and to add your own logo and necessary contact details in just a few clicks.

White label reports will help increase brand awareness and interest potential clients at the SEO services pre-sales stage. They will also help save time and effort, and earn more, for:

  • freelancers → by deepening clients’ loyalty and trust through personalized audits;
  • agency specialists → by including our reports into corresponding business processes;
  • in-house team members → as an additional value for internal reporting between departments or by hierarchy.

Universal Analytics

Google Analytics 4

Google Search Console

Import of Search Queries from Google Search Console

Export of reports to Google Drive & Sheets

Advanced Results Table

The table in Netpeak Spider is highly optimized for working with heavy data (100,000 pages and more).

Such features make it special:

  • Data sorting and grouping;
  • Adjusting columns: switching visibility on/off, shifting, attaching, resizing (with autosaving for future sessions);
  • Highlighting issues depending on their severity;
  • Advanced engagement with context menu: besides basic features, there’s fast access to the database, quick transition to the 'Source Code and HTTP Headers Analysis' tool, opportunity to open a selected URL in a variety of external services;
  • Opportunity to use hotkeys, including adding URLs from the clipboard to the table via Ctrl+V or using Drag'n'Drop;
  • Hints when entering a URL in the address bar (like in Google search).

Table with infinite scroll

Table with pagination


Table State

Move column

Pin column

Data grouping

Data sorting

Data sorting by multiple columns


Select cells

Select rows

Select columns

‘Filter by value’ option



Context menu

Check for duplicates

Validate hreflangs

If your website is multilingual, check whether all hreflang attributes are set correctly.
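
The core of such a check is reciprocity: if page A declares B as an alternate, B must declare A back. Here is a minimal sketch on hypothetical data, not the tool's actual validation logic.

```python
# Hypothetical hreflang map: page URL -> {language code: alternate URL}.
hreflang = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "uk": "https://example.com/uk/"},
    # The /uk/ page forgot the return link to /en/.
    "https://example.com/uk/": {"uk": "https://example.com/uk/"},
}

problems = []
for page, alternates in hreflang.items():
    for lang, alt_url in alternates.items():
        if alt_url == page:
            continue  # self-reference is fine (and recommended)
        back = hreflang.get(alt_url, {})
        if page not in back.values():
            problems.append(f"{alt_url} has no return hreflang to {page}")

print(problems)  # the missing /uk/ -> /en/ return link is flagged
```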

Validate AMP pages

The crawler flags AMP pages that do not meet the requirements of the AMP Project with a “Bad AMP HTML format” issue, and also finds indexed AMP pages.

Detect Structured Data

With this feature, you can detect which pages have structured data and which ones are missing it where it should be present. You can also check which structured data formats (JSON-LD, Microdata, RDFa) and types the pages contain.
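
For illustration, here is a minimal stdlib sketch that detects two of these formats, JSON-LD and Microdata, in hypothetical page source (RDFa detection is omitted for brevity):

```python
import json
from html.parser import HTMLParser

# Hypothetical page with a JSON-LD block and a Microdata marker.
PAGE = """<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "Widget"}
</script>
</head><body itemscope itemtype="https://schema.org/WebPage"></body></html>"""

class StructuredDataDetector(HTMLParser):
    """Detects JSON-LD blocks and Microdata markers in page source."""
    def __init__(self):
        super().__init__()
        self.json_ld = []           # parsed JSON-LD objects
        self.has_microdata = False  # any element with an itemscope attribute
        self._in_json_ld = False
        self._buffer = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("type") == "application/ld+json":
            self._in_json_ld = True
        if "itemscope" in attrs:
            self.has_microdata = True

    def handle_data(self, data):
        if self._in_json_ld:
            self._buffer += data

    def handle_endtag(self, tag):
        if tag == "script" and self._in_json_ld:
            self.json_ld.append(json.loads(self._buffer))
            self._buffer, self._in_json_ld = "", False

detector = StructuredDataDetector()
detector.feed(PAGE)
print([item["@type"] for item in detector.json_ld])  # ['Product']
print(detector.has_microdata)  # True
```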

Automatic data backup during/after the crawling

Even if the forces of fate are against you mid-crawl, the next time you run the program it will open a temporary project saved during the last backup in Netpeak Spider. You can save this project to your computer or continue crawling the website.

The program automatically backs up data:

  • At the time interval defined on the 'Settings' -> 'General' tab
  • When you stop (or pause) the crawl
  • When the crawl is complete


Spell checking

You can check the spelling on the entire page and separately in titles, descriptions, image alt attributes, and H1-H6 headings. In short, the spell checker covers all the important places on your website.



Create templates of parameters, settings, filters and segments.
