Google BigQuery API client library

Project description

Querying massive datasets can be time-consuming and expensive without the right hardware and infrastructure. Google BigQuery solves this problem by enabling super-fast SQL queries against append-only tables, using the processing power of Google’s infrastructure.

Quick Start

In order to use this library, you first need to go through the following steps:

  1. Select or create a Cloud Platform project.

  2. Enable billing for your project.

  3. Enable the Google Cloud BigQuery API.

  4. Set up Authentication (see the sketch below this list).
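
For step 4, a minimal sketch of constructing an authenticated client from a service-account key file (the key path and project ID below are placeholders):

from google.cloud import bigquery

# Either rely on Application Default Credentials, e.g. after running
#     export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"
client = bigquery.Client(project='your-project-id')

# ...or point the client at the key file explicitly.
client = bigquery.Client.from_service_account_json(
    '/path/to/keyfile.json', project='your-project-id')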

Installation

Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.

With virtualenv, it’s possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.

Mac/Linux

pip install virtualenv
virtualenv <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install google-cloud-bigquery

Windows

pip install virtualenv
virtualenv <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install google-cloud-bigquery

Example Usage

Create a dataset

from google.cloud import bigquery
from google.cloud.bigquery import Dataset

client = bigquery.Client()

dataset_ref = client.dataset('dataset_name')
dataset = Dataset(dataset_ref)
dataset.description = 'my dataset'
dataset = client.create_dataset(dataset)  # API request
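
The created dataset can be read back using the same reference; a short sketch continuing from the snippet above:

fetched = client.get_dataset(dataset_ref)  # API request
print(fetched.dataset_id, fetched.description)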

Load data from CSV

from google.cloud import bigquery
from google.cloud.bigquery import LoadJobConfig
from google.cloud.bigquery import SchemaField

client = bigquery.Client()

SCHEMA = [
    SchemaField('full_name', 'STRING', mode='required'),
    SchemaField('age', 'INTEGER', mode='required'),
]
table_ref = client.dataset('dataset_name').table('table_name')

load_config = LoadJobConfig()
load_config.skip_leading_rows = 1
load_config.schema = SCHEMA

# Contents of csv_file.csv:
#     Name,Age
#     Tim,99
with open('csv_file.csv', 'rb') as readable:
    load_job = client.load_table_from_file(
        readable, table_ref, job_config=load_config)  # API request
load_job.result()  # Waits for the load job to complete
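
load_table_from_file starts an asynchronous load job; a minimal sketch of checking the result once the job above has finished, continuing from the same snippet:

destination = client.get_table(table_ref)  # API request
print('Loaded {} rows into {}.'.format(
    destination.num_rows, destination.table_id))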

Perform a query

from google.cloud import bigquery

client = bigquery.Client()

QUERY = (
    'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` '
    'WHERE state = "TX" '
    'LIMIT 100')
query_job = client.query(QUERY)  # API request
rows = query_job.result()  # Waits for query to finish

for row in rows:
    print(row.name)
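
Literal values can also be passed as query parameters rather than embedded in the SQL string; a minimal sketch against the same public dataset, using a standard SQL named parameter (continuing from the snippet above):

job_config = bigquery.QueryJobConfig()
job_config.query_parameters = [
    bigquery.ScalarQueryParameter('state', 'STRING', 'TX'),
]
query_job = client.query(
    'SELECT name '
    'FROM `bigquery-public-data.usa_names.usa_1910_2013` '
    'WHERE state = @state '
    'LIMIT 100',
    job_config=job_config)  # API request

for row in query_job.result():
    print(row.name)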

Download files

Download the file for your platform.

Source Distribution

google-cloud-bigquery-1.6.2.tar.gz (153.9 kB)

Built Distribution

google_cloud_bigquery-1.6.2-py2.py3-none-any.whl (125.7 kB)

File details

Details for the file google-cloud-bigquery-1.6.2.tar.gz.

File metadata

  • Download URL: google-cloud-bigquery-1.6.2.tar.gz
  • Upload date:
  • Size: 153.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.2

File hashes

Hashes for google-cloud-bigquery-1.6.2.tar.gz

  • SHA256: fa5febc108bddb3f5d8421c85121a0da2a8fd48cd3919645bae2dcc0e6fbe14b
  • MD5: 83c2fca095107173085dfd5d66827da8
  • BLAKE2b-256: 522ea1cdacbe6dbffd424b8971369f97f513e52221d73ac31b3b736f9f92f61b

File details

Details for the file google_cloud_bigquery-1.6.2-py2.py3-none-any.whl.

File metadata

  • Download URL: google_cloud_bigquery-1.6.2-py2.py3-none-any.whl
  • Upload date:
  • Size: 125.7 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.2

File hashes

Hashes for google_cloud_bigquery-1.6.2-py2.py3-none-any.whl

  • SHA256: 4d969239bda46fd9b90972dd810e3c26bf61f540e6626a0b1ddbb1a265f8731a
  • MD5: 78ebf9a8be17dd1537cafbd83c802e8a
  • BLAKE2b-256: 0afe023fbc351245ac1246a53f23570ff03f644d5b3d81d3d6f77be059e898ed
