
[Optimization] Improve the logic for Scrapy project upload and spider extraction

Open · tikazyq opened this issue 5 years ago · 1 comment

Regarding the logic for Scrapy project upload and spider extraction: normally you zip the project locally, and having to first enter the project directory just to compress it feels rather unnatural. Directory tree of the files created on the backend:

└── test2
    ├── md5.txt
    ├── scrapy.cfg
    └── test2
        ├── __init__.py
        ├── __pycache__
        │   ├── __init__.cpython-38.pyc
        │   └── settings.cpython-38.pyc
        ├── items.py
        ├── middlewares.py
        ├── pipelines.py
        ├── settings.py
        └── spiders
Directory tree after the uploaded file is automatically extracted:
└── test_spider
    ├── md5.txt
    └── test_spider
        ├── scrapy.cfg
        └── test_spider
            ├── __init__.py
            ├── __pycache__
            ├── items.py
            ├── middlewares.py
            ├── pipelines.py
            ├── settings.py
            └── spiders

An extra level of folders is created.

Originally posted by @stone0311 in https://github.com/crawlab-team/crawlab/issues/776#issuecomment-685419684

tikazyq avatar Sep 02 '20 07:09 tikazyq

Solution: when the zip archive contains only a single top-level folder, descend one level into it before extracting, so the project files (e.g. scrapy.cfg) land at the spider root.
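A minimal sketch of that idea, not taken from Crawlab's actual (Go) codebase: the helper names `single_root` and `extract_stripped` are hypothetical. It inspects the archive's entry names; if every entry lives under one top-level folder, that folder is stripped during extraction.

```python
import zipfile
from pathlib import Path, PurePosixPath


def single_root(names):
    """If every entry in the archive lives under one top-level
    folder, return that folder's name; otherwise return None."""
    tops = {PurePosixPath(n).parts[0] for n in names if n.strip("/")}
    return tops.pop() if len(tops) == 1 else None


def extract_stripped(zf, dest):
    """Extract zf into dest, dropping the redundant wrapping folder
    (if any) so files like scrapy.cfg land at the spider root."""
    root = single_root(zf.namelist())
    for info in zf.infolist():
        rel = PurePosixPath(info.filename)
        if root is not None:
            rel = PurePosixPath(*rel.parts[1:])  # descend one level
        if not rel.parts:  # skip the wrapping folder entry itself
            continue
        target = Path(dest) / rel
        if info.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        else:
            target.parent.mkdir(parents=True, exist_ok=True)
            target.write_bytes(zf.read(info))
```

With an archive laid out like the second tree above (`test_spider/scrapy.cfg`, `test_spider/test_spider/...`), `extract_stripped` produces the first, expected layout; an archive whose root already holds several entries is extracted unchanged.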

tikazyq avatar Sep 02 '20 07:09 tikazyq