
ZJU Crawler

Introduction

A crawler for Zhejiang University (ZJU) student and teacher information, fetching data from the academic administration site (教务网) and the Chalaoshi (查老师) site. It pairs well with a QQ bot or similar front end.
The package currently bundles two crawlers:

  • Student personal-information crawler

    • Weighted GPA
    • Weighted 4-point GPA (study abroad, old)
    • Weighted 4.3-point GPA (study abroad, new)
    • Weighted 100-point average score
    • Exam information (time, room, …)
  • Chalaoshi teacher-information crawler (ratings, associated courses, etc.)

A crawler for Learning at ZJU (学在浙大) will be added in the future (if the site gets fixed).
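All of the "weighted" figures above are credit-weighted means over course records. As a minimal sketch of the arithmetic only (not the package's implementation; the record format here is assumed):

```python
def weighted_average(records):
    """Credit-weighted average over (credit, score) pairs.

    `score` may be a grade point (for the GPA variants) or a
    100-point mark (for the weighted 100-point average).
    """
    total_credits = sum(credit for credit, _ in records)
    return sum(credit * score for credit, score in records) / total_credits

# A 4-credit course at 4.5 and a 2-credit course at 3.0:
print(weighted_average([(4.0, 4.5), (2.0, 3.0)]))  # → 4.0
```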

Installation

PyPI

pip install zjucrawler

Manual installation

After downloading, change into the directory containing setup.py and run pip install . there.
You can also download and install the .whl file.

Usage examples

Imports

```python
import zjucrawler
# or:
from zjucrawler import chalaoshi  # Chalaoshi website (unofficial)
from zjucrawler import zju  # fetch from the official sites
```

Teachers

```python
import asyncio
from zjucrawler import chalaoshi  # Chalaoshi website (unofficial)

async def main():
    teacher_id = input("teacher ID >>> ")
    print(await chalaoshi.get_teacher_info(int(teacher_id)))  # fetch one teacher's info
    # search_teachers: look up a list of teachers by name or initials
    # get_course_info: fetch a course's average grade point and standard deviation

asyncio.run(main())
```
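Because the chalaoshi calls are coroutines, several lookups can run concurrently with asyncio.gather. A sketch with the network call replaced by a stub (the real chalaoshi.get_teacher_info would be awaited the same way; the stub's return shape is an assumption):

```python
import asyncio

async def fake_get_teacher_info(teacher_id):
    # Stand-in for chalaoshi.get_teacher_info: sleeps instead of hitting the network.
    await asyncio.sleep(0.01)
    return {"id": teacher_id}

async def main():
    # Fire all three lookups at once; gather preserves argument order.
    infos = await asyncio.gather(*(fake_get_teacher_info(i) for i in (1, 2, 3)))
    print(infos)

asyncio.run(main())
```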

Students

```python
import asyncio
from zjucrawler import zju  # fetch from the official sites

async def main():
    username = input("username >>> ")
    pwd = input("pwd >>> ")
    fetcher = zju.Fetcher(username, pwd, simulated=False)  # simulated: whether to log in via a simulated browser
    print(await fetcher.get_GPA())  # overall GPA, excluding dropped courses
    # get_avg_score: weighted average score on the 100-point scale
    # get_abroad_GPA_old: old study-abroad GPA (4-point scale), classes entering up to 2021
    # get_abroad_GPA_new: new study-abroad GPA (4.3-point scale), classes entering from 2022
    exams = list(await fetcher.get_all_exams())  # all exam information
    print(exams)
    print(fetcher.__dict__)

asyncio.run(main())
```

Notes

Make sure the package contains security.js.
Due to certain well-known mysterious factors, Chalaoshi may be unreachable (it requires the campus intranet); please work around this yourself :)
