
Support more secure ways to declare the APIKEY

Open BurnzZ opened this issue 3 years ago • 1 comments

BACKGROUND:

As of version 1.6.0, there are two (2) ways of adding the API key:

  1. via settings.py:

CRAWLERA_APIKEY = 'apikey'

  2. via a spider attribute:

class SampleSpider(scrapy.Spider):
    crawlera_apikey = 'apikey'

When using Scrapy Cloud, we could also declare it via:

  3. via Spider/Project settings


  4. via the Scrapy Cloud Crawlera add-on


PROBLEM

What actually happens in practice is that API keys get written inside the code and committed to the repo.

The best practice is to avoid coupling any sensitive keys with the code. Options #3 and #4 above already fix this problem, since they let us declare the keys only inside Scrapy Cloud.

However, this becomes a problem when trying to run the spider locally during development, as the keys might not be available there.

OBJECTIVES

This issue aims to be a discussion ground for exploring better ways to handle this.

For starters, here are a couple of ways to approach it:

  • A. Set and retrieve the keys via environment variables.

  • B. Set and retrieve the keys via a local file that is not committed to the repo, similar to how SSH keys are stored in ~/.ssh and AWS keys in ~/.aws.

Either way, it should support different API keys per spider.
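As a rough illustration of both options, here is a minimal sketch of a resolution helper. Everything in it is an assumption, not part of the library: the function name `resolve_apikey`, the per-spider variable naming scheme `CRAWLERA_APIKEY_<SPIDER>`, and the `~/.crawlera` key-file format are all hypothetical.

```python
import os
from pathlib import Path

def resolve_apikey(spider_name, key_file=None):
    """Hypothetical lookup order for an API key:

    1. a per-spider environment variable, e.g. CRAWLERA_APIKEY_SAMPLE
       (Option A, with per-spider support)
    2. the global CRAWLERA_APIKEY environment variable (Option A)
    3. an uncommitted local file with one "spider=key" pair per line
       (Option B, analogous to ~/.ssh or ~/.aws)
    """
    env_key = (os.environ.get("CRAWLERA_APIKEY_%s" % spider_name.upper())
               or os.environ.get("CRAWLERA_APIKEY"))
    if env_key:
        return env_key
    key_file = key_file or Path.home() / ".crawlera"  # assumed location
    if key_file.exists():
        for line in key_file.read_text().splitlines():
            name, _, key = line.partition("=")
            if name.strip() == spider_name:
                return key.strip()
    return None
```

A middleware or settings module could call such a helper at startup, so neither the repo nor settings.py ever contains the key itself.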

BurnzZ avatar Jul 28 '20 10:07 BurnzZ

Option A is already doable through: https://docs.scrapy.org/en/latest/topics/settings.html#command-line-options
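For instance (spider name and variable are placeholders, not from this project), the key can be exported as an environment variable locally and passed to Scrapy as a one-off setting, so it never appears in settings.py or the repo:

```shell
# Hypothetical invocation: the key lives only in the local environment
# and is injected per run via Scrapy's -s command-line setting override.
export CRAWLERA_APIKEY='apikey'
scrapy crawl sample -s CRAWLERA_APIKEY="$CRAWLERA_APIKEY"
```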

Gallaecio avatar Jul 28 '20 11:07 Gallaecio