notes for a Python script for the Google Indexing API (200 URLs per day):
UPDATE NOV 2024
pip3 install google-api-python-client
pip3 install openpyxl

NEW: use .xlsx instead of CSV for the URL list!
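a minimal sketch for reading the URL list from the new .xlsx format with openpyxl (the file name urls.xlsx and the one-URL-per-row-in-column-A layout are assumptions, not something these notes specify):

# sketch: read URLs from urls.xlsx (assumed: one URL per row in column A)
from openpyxl import load_workbook

wb = load_workbook("urls.xlsx")   # assumed file name
ws = wb.active
urls = [row[0] for row in ws.iter_rows(min_col=1, max_col=1, values_only=True) if row[0]]
print(len(urls), "URLs loaded")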
Set up Google Developer Console
log into Google Developer Console (gmail account): https://console.developers.google.com/
click Create Project (top dropdown), name it: R-Indexing API
APIs & Services -> Credentials
click Manage service accounts (bottom link)
Create service account, name it: r-indexing service account, click Create and Continue
Role: Owner
write down the email address: r-indexing-service-account@r-indexing-api.iam.gserviceaccount.com
account -> Actions -> Manage keys -> Add key -> Create new key
key type: JSON, click Create -> close the window (a JSON file is downloaded automatically)
top left hamburger menu -> APIs & Services -> Library
search for: Indexing API, click Web Search Indexing API -> Enable
log into Google Search Console with the original account (anotherAcc@gmail)
Settings -> Users and permissions -> Add
add the firstAcc@gmail account and the service account email, permission: Owner
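the downloaded JSON key is what the script later uses to authenticate against the Indexing API scope; a minimal sketch of that authorization step, assuming the key file is named r-indexing-api-123.json (it uses the oauth2client and httplib2 packages installed in the python setup below):

# sketch: authorize an HTTP client with the downloaded service-account key
import httplib2
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ["https://www.googleapis.com/auth/indexing"]
JSON_KEY_FILE = "r-indexing-api-123.json"   # assumed name of the downloaded key file

credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES)
http = credentials.authorize(httplib2.Http())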
python setup
in terminal:
pip3 install oauth2client httplib2
pip3 install pandas

create a folder on Desktop: indexAPI, containing:
indexing.py (a minimal sketch follows below)
urls.csv (per the Nov 2024 update: use urls.xlsx instead)
r-indexing-api-123.json
change the JSON file name in the Python script to match your downloaded key

copying URLs quickly
Firefox add-on to copy browser tabs: https://addons.mozilla.org/en-US/firefox/addon/copy-all-tab-urls-we/
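a minimal sketch of what indexing.py might look like, combining the pieces above; assumptions: the URL list is urls.xlsx with one URL per row in column A and no header, the key file is r-indexing-api-123.json, and every URL is sent as URL_UPDATED; it stops at 200 URLs to stay within the daily quota:

# indexing.py -- minimal sketch (file names and spreadsheet layout are assumptions)
import json

import httplib2
import pandas as pd
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"
JSON_KEY_FILE = "r-indexing-api-123.json"   # assumed key file name
DAILY_LIMIT = 200                           # default Indexing API quota per day

# read the URL list (needs openpyxl for .xlsx); assumed: one URL per row, no header
urls = pd.read_excel("urls.xlsx", header=None)[0].dropna().tolist()

# authorize with the service-account key
credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES)
http = credentials.authorize(httplib2.Http())

# notify Google about each URL, staying under the daily quota
for url in urls[:DAILY_LIMIT]:
    body = json.dumps({"url": url, "type": "URL_UPDATED"})
    response, content = http.request(
        ENDPOINT, method="POST", body=body,
        headers={"Content-Type": "application/json"},
    )
    print(response.status, url)

run it as shown under usage below; a 200 status printed next to a URL means the notification was accepted.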
usage
python3 indexing.py
src
https://www.hbfreelance.com/how-to-use-google-indexing-api-to-submit-urls-in-bulk-using-python/