
A minimal implementation of the KaRR knowledge assessment method for Large Language Models (LLMs)

Project description

Statistical Knowledge Assessment for Large Language Models

A minimal implementation of the KaRR knowledge assessment method from the following paper:

Statistical Knowledge Assessment for Large Language Models,
Qingxiu Dong, Jingjing Xu, Lingpeng Kong, Zhifang Sui, Lei Li
arXiv preprint (arxiv_version)

This is a fork of the official implementation released by the authors.

How to use?

First, create a new virtual environment and install PyTorch with CUDA support, then install minkarr with:

pip install minkarr

Here is a simple example of how to quantify an LLM's knowledge of a fact using KaRR:

from minkarr import KaRR
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
device = "cuda"
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_name)

karr = KaRR(model, tokenizer, device)

# Testing the fact: (France, capital, Paris)
# You can find other facts by looking into Wikidata
fact = ("Q142", "P36", "Q90")

# Name the score `score` rather than `karr`, to avoid shadowing the KaRR instance
score, does_know = karr.compute(fact)
print("Fact %s" % str(fact))
print("KaRR = %s" % score)
ans = "Yes" if does_know else "No"
print("According to KaRR, does the model know this fact? Answer: %s" % ans)
# Output:
# KaRR = 3.338972442145268
# According to KaRR, does the model know this fact? Answer: No

Differences from the original repo

  • Easy to use
  • Cleaner code
  • Minimalistic implementation: only the code needed to compute KaRR is kept; the rest was removed
  • This implementation can compute KaRR for a single fact (the original implementation iterated over all facts)

Change storage location

Set the environment variable STORAGE_FOLDER to choose where MinKaRR stores the data it downloads.
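For example, the variable can be set from Python before importing minkarr (the path below is a hypothetical example, not a default):

```python
import os

# Hypothetical example path; any writable directory works
os.environ["STORAGE_FOLDER"] = "/data/minkarr_cache"

# Import minkarr only after setting the variable so downloads land there
# import minkarr
```

Setting it in the shell (e.g. `export STORAGE_FOLDER=...`) before launching Python works equally well.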

Citation

Cite the original authors using:

@inproceedings{dong2023statistical,
      title={Statistical Knowledge Assessment for Large Language Models},
      author={Qingxiu Dong and Jingjing Xu and Lingpeng Kong and Zhifang Sui and Lei Li},
      year={2023},
      booktitle={Proceedings of NeurIPS},
}

