Netpeak Spider features on Windows and macOS

Modified on Fri, 22 Dec 2023 at 02:12 PM

Here is a list of Netpeak Spider's features.


The 'Windows' and 'macOS' values indicate the version of the program in which a given feature first appeared.


'Declined' refers to features that are no longer supported, while 'planned' denotes features to be added in the future.


A complete list of changes is available in the changelog.




Automatic application update


We regularly update our products, and installing updates is a breeze. Launch Netpeak Spider and it will automatically update to the latest version. You don't need to download anything from the website or reinstall.


Windows: 2.0.0.0 | macOS: 0.1.0.0

Use the app on two devices


One user can connect two devices to their account. This is convenient if you use two machines: one at home and one at work, or a PC and a laptop.


Windows: 2.0.0.0 | macOS: 0.1.0.0

Multilingual interface


Netpeak Spider supports the English, Ukrainian, and Russian languages.


Windows: 2.1.0.4 | macOS: 0.1.0.0

Change parameters during crawling


Turn certain parameters on and off before and during crawling. This allows you to reduce crawling time and the size of the project.


The parameters can also be switched off while analyzing the data to leave only the data you need in the table.


Windows: 3.0.0.0 | macOS: 0.1.0.0

Check 100+ technical issues


Netpeak Spider is first and foremost a tool for in-depth technical audits: it checks for broken links, duplicate pages and meta tags, and canonical and pagination issues. These and dozens of other issues are highlighted by color in the sidebar, as well as in the main table. The program stores complete documentation on every issue: its description, the threat it presents, how to fix it, and even a list of useful links.


Windows: 2.1.0.0 | macOS: 0.1.0.0

Enhanced issue description


In the ‘Info’ panel at the bottom, you will find comprehensive documentation on issues: a description, the threat, how to fix it, and even some useful links.


Windows: 3.2.0.0 | macOS: 0.1.0.0

Site structure


The ‘Site structure’ tab contains the full tree structure of a website, allowing you to view the pages in any directory.


Windows: 3.0.0.0 | macOS: 0.1.0.0

Dashboard


The dashboard is presented in two forms depending on the crawling stage:


  • During crawling, it shows a brief overview of the current settings, so you can spot incorrect settings, stop crawling, and make changes.
  • When crawling is complete, the dashboard shows graphs and charts with useful insights into the crawled pages. They are interactive → click on any element and the corresponding results will be filtered.
Windows: 3.0.0.0 | macOS: planned

Internal Database for Working with a Huge Number of URLs


Netpeak Spider works with an internal database, which is especially useful when you need to analyze millions of URLs. Right in the program, you can open any parameter from the database (incoming / outgoing / internal / external links, images, H1-H6 headings, canonical tags, redirects, issues) for a particular URL, for selected or filtered URLs, or for all results in the current table. This can be done in literally one click via the context menu.


Windows: 3.0.0.0 | macOS: 0.1.0.0

Crawl website from initial URL


Windows: 2.1.0.0 | macOS: 0.1.0.0

Crawl a list of URLs


Windows: 2.1.0.8 | macOS: 0.1.0.0

Virtual robots.txt


Windows: 3.0.0.0 | macOS: 0.1.0.0

Crawl websites using basic authentication


Windows: 2.1.4.0 | macOS: 0.1.0.0

JavaScript rendering


This feature comes in handy when crawling websites with JS scripts (meaning, almost all sites on the Internet). If you try crawling such a site without JS rendering, the crawler will be unable to detect data added with the help of JavaScript (links, descriptions, images, etc.), and therefore won't be able to analyze the pages correctly.


To execute JavaScript when crawling CSR sites, Netpeak Spider uses one of the latest versions of Chromium, which makes crawling as advanced and close to Googlebot as possible.
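
To make the idea concrete, here is a minimal sketch of JavaScript rendering with headless Chromium via the Playwright library (our choice for illustration only; Netpeak Spider ships its own Chromium internally):

```python
# A sketch of fetching a page's DOM after JavaScript has run,
# using Playwright's bundled Chromium (illustrative, not the program's code).
from playwright.sync_api import sync_playwright

def fetch_rendered_html(url: str) -> str:
    """Return the page HTML after JavaScript has executed."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")  # wait for JS-driven requests
        html = page.content()                     # the DOM after rendering
        browser.close()
        return html

# Links, descriptions, and images injected by JS appear only in this
# rendered HTML, not in the raw server response.
```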


Windows: 3.2.0.0 | macOS: 0.2.0.0

Quick search


Quick search allows you to find a specific URL or parameter value in the program's tables.


Windows: 3.0.0.0 | macOS: 0.1.0.0

Data filtering


You can set any filtering rules and combine them using the ‘AND’ and ‘OR’ operators. By default, all reports in the program are built on filtering (the ‘Dashboard’ and other reports in the sidebar), so don't hesitate to use filters and create segments based on them for further data analysis.
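
As a rough sketch of how ‘AND’ / ‘OR’ rules combine (the field names below are hypothetical, for illustration only):

```python
# Combining filter rules: 'AND' means both conditions must hold,
# 'OR' means either one is enough. Fields are hypothetical examples.
pages = [
    {"url": "/a", "status_code": 404, "title_length": 0},
    {"url": "/b", "status_code": 200, "title_length": 12},
    {"url": "/c", "status_code": 200, "title_length": 0},
]

broken_or_untitled = [
    p for p in pages
    if p["status_code"] == 404 or p["title_length"] == 0
]
ok_but_untitled = [
    p for p in pages
    if p["status_code"] == 200 and p["title_length"] == 0
]
print(broken_or_untitled)  # /a and /c
print(ok_but_untitled)     # only /c
```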


Windows: 3.0.0.0 | macOS: 0.1.0.0

Data segmentation


When crawling is finished, you can change the data view in the program, limiting it to a particular segment (same as segmentation in Google Analytics). Set a segment manually or use any available filter as a segment (use the corresponding button above the main table).


Segmentation is a powerful feature that has many uses in the program and helps you analyze data from a completely different angle. Keep in mind that using segments affects all reports in the program, including ‘Issues’, ‘Overview’, ‘Site structure’, ‘Dashboard’, etc.


Windows: 3.0.0.0 | macOS: 0.1.0.0

Data saving and copying


The program backs up the collected data automatically. This is useful when there is a risk of a sudden computer shutdown and data loss. Additionally, after crawling is over, you can save the project to have quick access to the crawled data, share the file with colleagues, and even open the list of URLs in Netpeak Checker (yup, our tools are cross-integrated).


Use the good ol' Ctrl+C hotkey, or the 'Copy' option in the context menu. Oh, and make sure to try the extended copy button in the sidebar: in just one click you can copy the contents of the 'Issues', 'Overview', 'Site structure', and 'Scraping' tabs to the clipboard.


Windows: 2.0.0.0 | macOS: 0.1.0.0

Multi-window mode


This is an advanced feature for professionals with numerous projects: using it, you can open the program in additional windows and work in each of them simultaneously.


Windows: 2.0.0.0 | macOS: planned

Custom website scraping and data extraction


Custom search of source code / text using one of four search types: ‘Contains’, ‘RegExp’, ‘CSS Selector’, or ‘XPath’.


It allows you to scrape websites' source code and extract data from web pages.
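
For illustration, here is roughly what the four search types correspond to in code, using Python's re and lxml (our library choices, not the program's internals):

```python
# The four search types, sketched against a tiny HTML sample.
import re
from lxml import html

source = '<html><body><div class="price" itemprop="price">19.99</div></body></html>'
tree = html.fromstring(source)

contains = "itemprop" in source                               # 'Contains'
regexp   = re.findall(r'itemprop="price">([\d.]+)<', source)  # 'RegExp'
css      = tree.cssselect("div.price")                        # 'CSS Selector' (needs the cssselect package)
xpath    = tree.xpath('//div[@itemprop="price"]/text()')      # 'XPath'

print(contains, regexp, [e.text for e in css], xpath)
# True ['19.99'] ['19.99'] ['19.99']
```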


Windows: 2.1.1.4 | macOS: 0.1.0.0

Copy / paste scraping parameters


The 'Copy parameters' and 'Paste parameters' buttons on the 'Scraping' tab in the crawling settings allow you to export or import scraping settings in JSON format to / from the clipboard.
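
The actual JSON schema is internal to Netpeak Spider, but the round trip looks conceptually like this (the structure below is purely hypothetical):

```python
# Hypothetical illustration of copying scraping settings as JSON;
# the real field names and schema are internal to the program.
import json

scraping_settings = {  # hypothetical structure, not the actual format
    "name": "Price",
    "search_type": "XPath",
    "expression": '//div[@itemprop="price"]/text()',
}
exported = json.dumps(scraping_settings)   # 'Copy parameters'
imported = json.loads(exported)            # 'Paste parameters'
assert imported == scraping_settings
```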


Windows: 3.9.0.0 | macOS: 0.1.0.0

Internal PageRank calculation


The internal PageRank calculation tool will help you get deep insights into your project:


  • how link weight is distributed across the site and where it concentrates;
  • which unimportant pages get extra link weight;
  • which pages are ‘dead ends’ that simply ‘burn’ incoming link weight.

While creating this tool, we used official Google documentation and patents.
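
For reference, the classic iteration from Google's PageRank paper looks roughly like this (a simplified sketch; the in-program calculation may differ in details such as dangling-node handling):

```python
# A simplified sketch of the classic PageRank iteration.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:          # a 'dead end' simply burns its weight
                continue
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

site = {"/": ["/about", "/blog"], "/about": ["/"], "/blog": []}
print(pagerank(site))   # '/blog' collects weight but passes none on
```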


Windows: 2.1.1.2 | macOS: planned

Source code and HTTP headers analysis


You don’t have to open a page in a browser to quickly analyze its source code – just use the built-in tool in Netpeak Spider. It allows you to check HTTP request and response headers, data on redirects, GET parameters in the URL, and all extracted text without the HTML code, which is especially useful for text analysis.
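
The same kinds of checks can be sketched with the Python requests library (our choice for illustration):

```python
# Inspecting request/response headers, the redirect chain, and extracted text.
import requests
from lxml import html

r = requests.get("https://example.com/", allow_redirects=True)

print(r.request.headers)                 # HTTP request headers sent
print(r.headers)                         # HTTP response headers received
print([resp.url for resp in r.history])  # redirect chain, if any
print(html.fromstring(r.text).text_content()[:200])  # text without HTML code
```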


Windows: 2.1.1.0 | macOS: planned

Sitemap generation (XML, Image, HTML)


Windows: 2.1.1.3 | macOS: planned

XML Sitemap validation


The validator in Netpeak Spider finds more than 30 issues in Sitemap files and Sitemap index files. The tool automatically detects Sitemap index files, unzips .gz archives, and helps you use sitemaps for your projects to the fullest extent.
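
As a taste of what such validation involves, here is a toy check for two basic rules from the sitemaps.org protocol (a real validator, like the built-in one, covers far more):

```python
# Two basic Sitemap rules: at most 50,000 URLs per file,
# and every <loc> must be an absolute http(s) URL.
from lxml import etree

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def basic_sitemap_checks(path: str) -> list[str]:
    issues = []
    tree = etree.parse(path)
    locs = tree.findall(f".//{{{SITEMAP_NS}}}loc")
    if len(locs) > 50_000:
        issues.append("More than 50,000 URLs in one Sitemap file")
    for loc in locs:
        if not (loc.text or "").startswith(("http://", "https://")):
            issues.append(f"Invalid URL: {loc.text}")
    return issues
```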


Windows: 2.1.0.7 | macOS: planned

Validation of Image Sitemap files


Windows: 2.1.1.3 | macOS: planned

SEO Audit Data Enrichment


Upload your data from Netpeak Checker or other services, then merge and compare it to carry out more complex SEO audits. You can work with imported parameters just like with the main ones: filter, segment, and analyze them.



Windows: 3.11.0.1 | macOS: 0.1.0.0

Multi-domain crawling


The ability to crawl multiple domains simultaneously. The algorithm is simple: upload a list of domains, run simultaneous crawling, and analyze parameters and issues in a single report.


Windows: 3.5.0.0 | macOS: 0.1.0.0

Custom HTTP request headers


The ability to set custom HTTP request headers on the 'HTTP headers' tab in the program settings.
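
In effect, this is what custom request headers amount to on the wire; a sketch with the Python requests library (the header names and values here are just examples):

```python
# Sending a request with custom HTTP headers.
import requests

headers = {
    "User-Agent": "Mozilla/5.0 (compatible; MyCrawler/1.0)",  # example value
    "Accept-Language": "en-US,en;q=0.9",
    "X-Crawl-Run": "audit-2024",  # any custom header a server might expect
}
r = requests.get("https://example.com/", headers=headers)
print(r.status_code)
```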


Windows: 3.6.0.0 | macOS: 0.1.0.0

Bulk export


Data export in Netpeak Spider is limited only by your imagination :) Data is exported in the same form as it's shown in the results table, including sorting, grouping, column width, and column order. You can also set the parameters for the reports.


With the special export menu, in just one click you can export all available reports, a set of specific reports, or a particular report you need for further work – to set a task for a developer or to report to your clients.


Windows: 3.0.0.0 | macOS: 0.1.0.0

Basic technical SEO audit in PDF


In just two clicks, you can export a technical SEO audit report in PDF. It shows the key information required for a site audit – simply add your own recommendations and you can send it to your client or colleagues for further implementation.


This feature gives you the best of 'two worlds' in Netpeak Spider: the in-depth analysis and customization of a desktop tool, and results visualization like in the most advanced online products.


Windows: 3.2.0.0 | macOS: 0.2.0.0

White label reports with SEO audit


The ability to remove Netpeak Software branded elements from the PDF report with the technical SEO audit and to add your own logo and contact details in just a few clicks.


White label reports will help increase brand awareness and interest potential clients at the SEO services pre-sales stage. They'll also help save time / energy and earn more for:

  • freelancers → by deepening clients’ loyalty and trust through personalized audits;
  • agency specialists → by including our reports into corresponding business processes;
  • in-house team members → as an additional value for internal reporting between departments or by hierarchy.
Windows: 3.4.0.0 | macOS: 0.2.0.0

Universal Analytics


Windows: 3.3.0.0 | macOS: 0.1.0.0

Google Analytics 4


Windows: 3.12.0.0 | macOS: 0.1.0.0

Google Search Console


Windows: 3.3.0.0 | macOS: 0.1.0.0

Import of Search Queries from Google Search Console


Windows: 3.6.0.0 | macOS: 0.1.0.0

Export of reports to Google Drive & Sheets


Windows: 3.7.0.0 | macOS: 0.1.0.0

Advanced Results Table


The table in Netpeak Spider is highly optimized for working with heavy data (100,000 pages and more).


These features make it special:

  • Data sorting and grouping;
  • Adjusting columns: switching visibility on/off, moving, pinning, and resizing (with autosaving for future sessions);
  • Highlighting issues depending on their severity;
  • An advanced context menu: besides basic features, there's fast access to the database, a quick transition to the 'Source Code and HTTP Headers Analysis' tool, and the option to open a selected URL in a variety of external services;
  • Hotkeys, including adding URLs from the clipboard to the table via Ctrl+V or drag and drop;
  • Hints when entering a URL in the address bar (like in Google search).
Windows: 3.0.0.0 | macOS: 0.1.0.0

Table with infinite scroll


Windows: 3.0.0.0 | macOS: declined

Table with pagination


Windows: declined | macOS: 0.1.0.0

Table state


Windows: 3.0.0.0 | macOS: planned

Move column


Windows: 3.0.0.0 | macOS: 0.1.0.0

Pin column


Windows: 3.0.0.0 | macOS: planned

Data grouping


Windows: 3.0.0.0 | macOS: planned

Data sorting


Windows: 3.0.0.0 | macOS: 0.1.0.0

Data sorting by multiple columns


Windows: declined | macOS: 0.1.0.0

Select cells


Windows: 3.0.0.0 | macOS: declined

Select rows


Windows: 3.0.0.0 | macOS: 0.1.0.0

Select columns


Windows: 3.0.0.0 | macOS: 0.1.0.0

‘Filter by value’ option


Windows: 3.1.0.0 | macOS: declined

Hotkeys


The ability to use hotkeys, including adding URLs from the clipboard to the table via Ctrl+V or drag and drop.


Windows: 3.0.0.0 | macOS: planned

Context menu


Windows: 3.0.0.0 | macOS: 0.1.0.0

Check for duplicates


Validate hreflangs


If your website is multilingual, check whether all hreflang attributes are set correctly.
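
One common check is hreflang reciprocity: every alternate page must link back to the referring page. A simplified sketch with illustrative data:

```python
# Checking for missing return hreflang links (data is illustrative).
hreflangs = {
    "https://example.com/":    {"en": "https://example.com/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/"},  # missing 'en'
}

for page, alternates in hreflangs.items():
    for lang, alt_url in alternates.items():
        if alt_url != page and page not in hreflangs.get(alt_url, {}).values():
            print(f"{alt_url} has no return hreflang link to {page}")
```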


Windows: 3.5.0.0 | macOS: 0.1.0.0

Validate AMP pages


The crawler flags AMP pages that do not meet the requirements of the AMP Project with a “Bad AMP HTML format” issue, and also finds indexed AMP pages.


Windows: 2.1.4.0 | macOS: 0.1.0.0

Detect Structured Data


With this feature, you can detect which pages have structured data and which ones are missing it even though it should be there. You can also check which structured data formats (JSON-LD, Microdata, RDFa) and types the pages contain.
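
A rough sketch of detecting the three formats in raw HTML (real detection parses the markup; the substring checks here are only illustrative):

```python
# Naive detection of structured data formats by their telltale markers.
import re

def detect_structured_data(page_html: str) -> dict[str, bool]:
    return {
        "JSON-LD":   'application/ld+json' in page_html,
        "Microdata": 'itemscope' in page_html,
        "RDFa":      re.search(r'\b(vocab|typeof|property)=', page_html) is not None,
    }

sample = '<script type="application/ld+json">{"@type": "Product"}</script>'
print(detect_structured_data(sample))
# {'JSON-LD': True, 'Microdata': False, 'RDFa': False}
```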


Windows: 3.7.0.0 | macOS: 0.1.0.0

Automatic data backup during/after the crawling


When you start a crawl, even if the forces of fate are against you, the next time you run the program it will open a temporary project saved during the last backup in Netpeak Spider. You can save this project to your computer or continue crawling the website.


The program automatically backs up data:

  • At the time interval defined on the 'Settings' → 'General' tab
  • When you stop (or pause) crawling
  • When crawling is complete

Windows: 3.9.0.0 | macOS: 0.1.0.0

Spell checking


You can check the spelling of the entire page and separately of the title, description, image alt attributes, and H1-H6 headings. In short, the spell checker covers all the important places on your website.



Windows: 3.9.0.0 | macOS: planned

Templates


Create templates for parameters, settings, filters, and segments.


Windows: 3.0.0.0 | macOS: 0.1.0.0
