Welcome to site redirect csv 👋

License: MIT · Twitter: @luchesigui

Scrapes Google results for site:{domain}, keeps only the links whose status code is different from 200 (toggleable via command line arguments), and writes them to a CSV. Excellent to run after a site update to remap the Google results and minimize the impact on SEO.
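
As a rough sketch of what the status-check step does (not the project's actual code; the function name, sample URLs, and output file are illustrative, and it assumes Node 18+ for the global fetch API):

```js
// Illustrative sketch only, not the project's implementation.
const fs = require("fs");

// Keep only the URLs whose response status is not 200.
async function filterNon200(urls) {
  const rows = [];
  for (const url of urls) {
    // redirect: "manual" keeps 301/302 responses instead of following them,
    // so redirected pages are reported with their redirect status.
    const res = await fetch(url, { redirect: "manual" });
    if (res.status !== 200) {
      rows.push({ url, status: res.status });
    }
  }
  return rows;
}

// Hypothetical URLs standing in for the scraped Google results.
const scraped = ["https://example.com/old-page", "https://example.com/new-page"];

filterNon200(scraped).then((rows) => {
  // Write the filtered links as a simple two-column CSV.
  const csv = ["url,status", ...rows.map((r) => `${r.url},${r.status}`)].join("\n");
  fs.writeFileSync("filtered-urls.csv", csv);
});
```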

Install

npm install

Usage

npm start

Command line flags

  • --domain (alias: -d): The domain to search for. Required.

  • --csv-name: The name of the CSV file to be created. Default: filtered-urls.csv.

  • --pages (alias: -p): Number of result pages to scrape. Default: 10.

  • --check (alias: -c): Whether the scraped links should be checked for redirects. Default: true.
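
For example, to scrape 5 pages of results for example.com and write the non-200 links to broken-links.csv (the domain and file name are placeholders; flags are forwarded to the script through npm's -- separator):

npm start -- --domain example.com --pages 5 --csv-name broken-links.csv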

Run tests

npm run test

Author

👤 Guilherme Luchesi


This README was generated with ❤️ by readme-md-generator
