# Extracting home information from zillow.com
## Building the Web-Crawler

Building the web-crawler is easy and helps you take advantage of data-mining software. This document guides you through the build process. You will need:

1. Python, with the following modules:
- The `BeautifulSoup4` module is needed and can be installed with `pip install beautifulsoup4`.
- The `re` module is needed; it is part of the Python standard library, so no separate installation is required.
- The `matplotlib` module is needed to plot the scatterplot and boxplot; it can be installed with `pip install matplotlib`.
- The `pandas` module is needed to load the CSV data into a DataFrame; it can be installed with `pip install pandas`.
2. The Web-Crawler itself, set up with one of the following configurations:
   * macOS: use Web-Crawler for Mac, or see the installation instructions.
   * Linux: install Web-Crawler according to the instructions for your OS.
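As a minimal sketch of how `BeautifulSoup4` and `re` work together, the snippet below parses a listing page (an inline HTML fragment here; the class names are hypothetical stand-ins, not Zillow's actual markup) and extracts the numeric price and the detail fields:

```python
import re

from bs4 import BeautifulSoup

# Stand-in for a downloaded listing page; real Zillow markup differs.
html = """
<div class="list-card-info">
  <div class="list-card-price">$450,000</div>
  <ul class="list-card-details"><li>3 bds</li><li>2 ba</li><li>1,850 sqft</li></ul>
</div>
"""

soup = BeautifulSoup(html, "html.parser")

# BeautifulSoup locates the elements; re strips the text down to digits.
price_text = soup.find("div", class_="list-card-price").get_text()
price = int(re.sub(r"[^\d]", "", price_text))

details = [li.get_text() for li in soup.select("ul.list-card-details li")]

print(price)    # numeric price parsed out of "$450,000"
print(details)  # list of the detail strings
```

The same pattern scales to a full results page: find every listing card, then apply the regular-expression cleanup to each field before writing it to the CSV.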
While it is possible to build the web-crawler against any local Python installation, the provided build process runs in its own local environment. This simplifies initial setup and provides a consistent build and test environment.
## Key scripts
The following scripts are found in the build/ directory. Note that all scripts must be run from the Web-Crawler root directory.

1. `src/webcrawler/move_csv.sh`
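Once a script such as `move_csv.sh` has put the crawled CSV in place, `pandas` and `matplotlib` take over. The sketch below writes a tiny sample CSV first so it is self-contained; the `homes.csv` path and the `price`/`sqft` column names are assumptions for illustration, not taken from the project:

```python
import matplotlib
matplotlib.use("Agg")  # render to image files; no display needed

import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical CSV as the crawler might produce it.
with open("homes.csv", "w") as f:
    f.write("price,sqft\n450000,1850\n325000,1400\n610000,2300\n")

# Load the CSV data into a DataFrame.
df = pd.read_csv("homes.csv")

# Scatterplot: price against square footage.
ax_scatter = df.plot.scatter(x="sqft", y="price")
ax_scatter.figure.savefig("scatter.png")

# Boxplot: distribution of prices.
ax_box = df.boxplot(column="price")
ax_box.figure.savefig("boxplot.png")
```

With the real crawler output, only the `read_csv` path and the column names should need changing.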
Download the file for your platform.
| Filename, size | File type | Python version | Upload date | Hashes |
| --- | --- | --- | --- | --- |
| Final_Project-1.0.tar.gz (254.7 kB) | Source | None | | |