
Project description

scrapy-taobao

Simulates Taobao login with Scrapy; proxy IP handling is not included. If you have a good way to handle proxies, please share it.

Make sure Scrapy is installed.
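
If it is not installed yet, both Scrapy and this package can be installed from PyPI; the package name here is inferred from the distribution files listed below.

pip install scrapy
pip install scrapy-taobao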

self.http_user = 'xxxxxxxx'   # taobao username
self.http_pass = 'xxxxxxxx'   # taobao password

Remember to change the username and password in taobao_spider.py.

Run the command:

scrapy crawl taobao

If the login requires a CAPTCHA, the CAPTCHA image link is opened automatically so the user can enter the code by hand; if the code is wrong, the image link is opened again for another attempt.
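
A minimal sketch of that prompt-and-retry flow, written as a standalone helper; the helper name prompt_for_captcha and the way the image URL is obtained are illustrative assumptions, not this package's actual API.

import webbrowser

def prompt_for_captcha(captcha_img_url):
    # Hypothetical helper: open the CAPTCHA image in the default browser
    # and read the user's answer from stdin. The spider would call this
    # again whenever the login response reports a wrong code.
    webbrowser.open(captcha_img_url)
    return input('Enter the CAPTCHA shown in your browser: ').strip()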

Message printed on successful login:

login-success, get user nick: ["user nick"]

Seeing this line means the login succeeded, and you can go on to extract whatever other data you need.
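
As a rough sketch of what that follow-up extraction could look like inside a Scrapy callback; the JSON path to the user nick and the follow-up URL are assumptions for illustration, not the spider's real code.

import json
import scrapy

class TaobaoFollowUpSketch(scrapy.Spider):
    name = 'taobao-followup-sketch'

    def parse_login(self, response):
        # Assumed layout of the login response; adjust to the real payload.
        data = json.loads(response.text)
        nick = data.get('data', {}).get('nick')
        if nick:
            self.logger.info('login-success, get user nick: [%s]', nick)
            # Login succeeded: continue with whatever extraction you need,
            # e.g. request a logged-in page and parse it in another callback.
            yield scrapy.Request('https://i.taobao.com/my_taobao.htm',
                                 callback=self.parse_my_page)

    def parse_my_page(self, response):
        # Placeholder for further data extraction.
        pass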

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

scrapy_taobao-1.1.5.tar.gz (6.6 kB)

Uploaded Source

Built Distribution

scrapy_taobao-1.1.5-py3-none-any.whl (7.4 kB)

Uploaded Python 3

File details

Details for the file scrapy_taobao-1.1.5.tar.gz.

File metadata

  • Download URL: scrapy_taobao-1.1.5.tar.gz
  • Upload date:
  • Size: 6.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.2

File hashes

Hashes for scrapy_taobao-1.1.5.tar.gz
Algorithm Hash digest
SHA256 39fdcebb580c4f89999eb2ced9f09ebb31b8e0cd8d2b37374d3c3ce088420ae0
MD5 d05c0408d0e1c5f9e0f17d1fa44109f6
BLAKE2b-256 5b5c93efef48e76b908119b138b339dcdc852ca75c61d046f1f1f4cf26881734

See more details on using hashes here.

File details

Details for the file scrapy_taobao-1.1.5-py3-none-any.whl.

File metadata

File hashes

Hashes for scrapy_taobao-1.1.5-py3-none-any.whl
Algorithm Hash digest
SHA256 6d9c5d1c6199ee8971168c063a9abf6c5d401ad0fce0deeaa00965d3b496080c
MD5 c9a0e42332b9a60b0d51915f94a8a38e
BLAKE2b-256 19f6266e4732b2f3abba733aa58ae54fcd1c9dd26a16fc2ce7984103700e6dac

See more details on using hashes here.
