
Getting UniProt Data from a UniProt Accession ID through the UniProt REST API

Project description

UniProt Database Web Parser Project

TLDR: This parser can be used to parse UniProt accession IDs and obtain related data from the UniProt web database.
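The package can be installed from PyPI (the distribution files listed below are published under the name uniprotparser):

```shell
pip install uniprotparser
```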

With version 1.0.5, support for asyncio through aiohttp has been added to betaparser. Usage is as follows:

from uniprotparser.betaparser import UniprotParser
from io import StringIO
import asyncio
import pandas as pd

async def main():
    example_acc_list = ["Q99490", "Q8NEJ0", "Q13322", "P05019", "P35568", "Q15323"]
    parser = UniprotParser()
    df = []
    #Results are yielded for up to 500 accession ids at a time
    async for r in parser.parse_async(ids=example_acc_list):
        df.append(pd.read_csv(StringIO(r), sep="\t"))

    #Consolidate the chunked results into a single dataframe
    if len(df) > 1:
        df = pd.concat(df, ignore_index=True)
    elif len(df) == 1:
        df = df[0]

asyncio.run(main())
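Under the hood, `parse_async` is an asynchronous generator, which is why it is consumed with `async for` inside a coroutine. The pattern can be sketched with a toy generator (the names below are illustrative stand-ins, not part of the library):

```python
import asyncio

async def fake_parse_async(ids, chunk_size=2):
    # Yield results one chunk at a time, mimicking parse_async
    for start in range(0, len(ids), chunk_size):
        await asyncio.sleep(0)  # hand control back to the event loop
        yield ids[start:start + chunk_size]

async def main():
    chunks = []
    async for chunk in fake_parse_async(["Q99490", "Q8NEJ0", "Q13322"]):
        chunks.append(chunk)
    return chunks

chunks = asyncio.run(main())
print(chunks)  # [['Q99490', 'Q8NEJ0'], ['Q13322']]
```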

With version 1.0.2, support for the new UniProt REST API has been added under the betaparser module of the package.

To use this new module, you can follow the example below:

from uniprotparser.betaparser import UniprotParser
from io import StringIO

import pandas as pd
example_acc_list = ["Q99490", "Q8NEJ0", "Q13322", "P05019", "P35568", "Q15323"]
parser = UniprotParser()
df = []
#Results are yielded for up to 500 accession ids at a time
for r in parser.parse(ids=example_acc_list):
    df.append(pd.read_csv(StringIO(r), sep="\t"))

#Consolidate the chunked results into a single dataframe
if len(df) > 1:
    df = pd.concat(df, ignore_index=True)
elif len(df) == 1:
    df = df[0]
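Once consolidated, the result is an ordinary pandas DataFrame and can be written out with standard pandas I/O. A minimal sketch with dummy tab-separated chunks standing in for parser output (the column names and values here are illustrative):

```python
import pandas as pd
from io import StringIO

# Two dummy tab-separated chunks standing in for parser output
chunk1 = "Entry\tGene Names\nQ99490\tAGAP2\n"
chunk2 = "Entry\tGene Names\nP05019\tIGF1\n"
frames = [pd.read_csv(StringIO(c), sep="\t") for c in (chunk1, chunk2)]
df = pd.concat(frames, ignore_index=True)

# Persist the combined table as a TSV file
df.to_csv("uniprot_results.tsv", sep="\t", index=False)
print(len(df))  # 2
```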

To parse a UniProt accession with the legacy API:

from uniprotparser.parser import UniprotSequence

protein_id = "seq|P06493|swiss"

acc_id = UniprotSequence(protein_id, parse_acc=True)

#Access the accession id
acc_id.accession

#Access the isoform id
acc_id.isoform
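What such parsing amounts to is extracting the accession from a pipe-delimited header. A standalone sketch using the accession pattern published in the UniProt documentation (this is independent of the library, not its actual implementation):

```python
import re

# UniProt accession pattern from the UniProt help pages
ACC_RE = re.compile(
    r"[OPQ][0-9][A-Z0-9]{3}[0-9]|[A-NR-Z][0-9]([A-Z][A-Z0-9]{2}[0-9]){1,2}"
)

def extract_accession(header):
    # Return the first accession found in a header string, or None
    m = ACC_RE.search(header)
    return m.group(0) if m else None

print(extract_accession("seq|P06493|swiss"))  # P06493
```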

To get additional data from the UniProt online database:

from uniprotparser.parser import UniprotParser
from io import StringIO
#Install pandas first to handle tabulated data
import pandas as pd

protein_accession = "P06493"

parser = UniprotParser([protein_accession])

#To get tabulated data
result = []
for i in parser.parse("tab"):
    tab_data = pd.read_csv(i, sep="\t")
    last_column_name = tab_data.columns[-1]
    tab_data.rename(columns={last_column_name: "query"}, inplace=True)
    result.append(tab_data)
fin = pd.concat(result, ignore_index=True)

#To get fasta sequence
with open("fasta_output.fasta", "wt") as fasta_output:
    for i in parser.parse():
        fasta_output.write(i)
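The FASTA text written above can later be split back into records with a few lines of standard Python (a hypothetical helper, not part of the package; the sequence letters below are dummy data):

```python
def split_fasta(text):
    # Split concatenated FASTA text into (header, sequence) pairs
    records = []
    for block in text.split(">"):
        if not block.strip():
            continue
        header, _, seq = block.partition("\n")
        records.append((header.strip(), seq.replace("\n", "")))
    return records

fasta = ">sp|P06493|CDK1_HUMAN\nMEDYTKIEK\nILGEGTYG\n"
print(split_fasta(fasta))  # [('sp|P06493|CDK1_HUMAN', 'MEDYTKIEKILGEGTYG')]
```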

Project details


Download files

Download the file for your platform.

Source Distribution

uniprotparser-1.0.8.tar.gz (6.6 kB, Source)

Built Distribution

uniprotparser-1.0.8-py3-none-any.whl (6.9 kB, Python 3)

File details

Details for the file uniprotparser-1.0.8.tar.gz.

File metadata

  • Download URL: uniprotparser-1.0.8.tar.gz
  • Upload date:
  • Size: 6.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.13 CPython/3.10.5 Windows/10

File hashes

Hashes for uniprotparser-1.0.8.tar.gz
  • SHA256: 56ba3b363c00f7bcf7d1ddae26832b4645ef035e1e47c41a674566d1b169ee9d
  • MD5: c806fab77f2e9d71eb95df5663333348
  • BLAKE2b-256: 27c24db934d1daa304b9fc3f834345c7fb97a82acf34d2fe1e391f2684925610


File details

Details for the file uniprotparser-1.0.8-py3-none-any.whl.

File metadata

  • Download URL: uniprotparser-1.0.8-py3-none-any.whl
  • Upload date:
  • Size: 6.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.13 CPython/3.10.5 Windows/10

File hashes

Hashes for uniprotparser-1.0.8-py3-none-any.whl
  • SHA256: 45358a2037f0ca454680aa65ca40acc41ff0531aaaf9275a4309ba80f2b2d232
  • MD5: 5930e8c8d60f5e40affd5b27f6ac7ea2
  • BLAKE2b-256: 1c9885a16cacf091982605065be7ab8301bfc0cad81026bdcf4a5c008909e048

