Extracting home information from zillow.com

Project description

Building a web crawler is straightforward and lets you take advantage of data-mining techniques. This document will guide you through the build process.

## Requirements

  1. BeautifulSoup4 is needed and can be installed with pip install beautifulsoup4.

  2. re is needed; it is part of the Python standard library, so no installation is required.

  3. matplotlib is needed to plot the scatterplot and boxplot.

  4. pandas is needed to load the CSV data into a DataFrame (a sketch of how these modules fit together follows this list).

  5. Web-Crawler itself, set up with one of the following configurations:
     * macOS: use Web-Crawler for Mac, or see the installation instructions.
     * Linux: install Web-Crawler according to the [instructions] for your OS.
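
The sketch below is a minimal illustration (not the packaged implementation) of how these modules could work together: fetch one listing page, parse it with BeautifulSoup4, pull the prices out with re, write them to a CSV file, then reload the CSV with pandas and draw the scatterplot and boxplot with matplotlib. The URL, request header, CSS class name, and column name are assumptions made for the example; Zillow's markup changes often, so they would need to be checked against the live page.

```python
import re
import urllib.request

import matplotlib.pyplot as plt
import pandas as pd
from bs4 import BeautifulSoup

URL = "https://www.zillow.com/homes/for_sale/"   # hypothetical listing page
HEADERS = {"User-Agent": "Mozilla/5.0"}          # browser-like agent; the default is often blocked


def scrape_prices(url: str) -> pd.DataFrame:
    """Download one results page and return a DataFrame of listing prices."""
    request = urllib.request.Request(url, headers=HEADERS)
    html = urllib.request.urlopen(request).read()
    soup = BeautifulSoup(html, "html.parser")

    prices = []
    # "list-card-price" is a placeholder class name; inspect the live page
    # to find the element that actually carries the price text.
    for tag in soup.find_all("span", class_="list-card-price"):
        match = re.search(r"\$([\d,]+)", tag.get_text())
        if match:
            prices.append(int(match.group(1).replace(",", "")))
    return pd.DataFrame({"price": prices})


if __name__ == "__main__":
    scrape_prices(URL).to_csv("homes.csv", index=False)

    # Reload the CSV and draw the two plots mentioned in the requirements.
    data = pd.read_csv("homes.csv")
    plt.scatter(data.index, data["price"])
    plt.title("Listing prices (scatterplot)")
    plt.show()

    data.boxplot(column="price")
    plt.title("Listing prices (boxplot)")
    plt.show()
```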

## Overview

While it is possible to build the web crawler with any local Python installation, the project provides a build process that runs in a local environment. This simplifies initial setup and provides a consistent build and test environment.

## Key scripts

The following scripts drive the build process. Note that all scripts must be run from the Web-Crawler root directory; a hedged example of doing so follows the list.

  1. src/webcrawler/move_csv.sh
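
As a sketch, the snippet below shows one way to satisfy the run-from-the-root requirement by setting the working directory explicitly before invoking the script. The checkout path is an assumption, and the script's own contents are not reproduced here.

```python
import subprocess
from pathlib import Path

# Hypothetical location of the Web-Crawler checkout; adjust to your machine.
ROOT = Path("~/Web-Crawler").expanduser()

# cwd=ROOT makes the script run with the repository root as its working
# directory, which is what the note above requires.
subprocess.run(["bash", "src/webcrawler/move_csv.sh"], cwd=ROOT, check=True)
```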

Release history

This version

1.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Final_Project-1.0.tar.gz (254.7 kB)


File details

Details for the file Final_Project-1.0.tar.gz.

File metadata

  • Download URL: Final_Project-1.0.tar.gz
  • Upload date:
  • Size: 254.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Python-urllib/3.6

File hashes

Hashes for Final_Project-1.0.tar.gz:

  • SHA256: 610081d8eb2de7afa9cb8ea1c004801d4ebdbe3ee0ad9721f913d01f265a26d2
  • MD5: cc8cbb759428482b676f9497b1a6226f
  • BLAKE2b-256: c8427a532b8bec5bdb1e0784d1c117c21f675693ddf71936ec1a6e6b9541fb9b

