<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>PyPI recent updates for TruthTorchLM</title>
    <link>https://pypi.org/project/truthtorchlm/</link>
    <description>Recent updates to the Python Package Index for TruthTorchLM</description>
    <language>en</language>
    <item>
      <title>0.1.19</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.19/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Fri, 20 Feb 2026 22:06:02 GMT</pubDate>
    </item>
    <item>
      <title>0.1.18</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.18/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Fri, 20 Feb 2026 21:28:40 GMT</pubDate>
    </item>
    <item>
      <title>0.1.17</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.17/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Fri, 14 Mar 2025 03:49:32 GMT</pubDate>
    </item>
    <item>
      <title>0.1.14</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.14/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Wed, 05 Feb 2025 05:22:53 GMT</pubDate>
    </item>
    <item>
      <title>0.1.13</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.13/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Tue, 04 Feb 2025 03:14:05 GMT</pubDate>
    </item>
    <item>
      <title>0.1.12</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.12/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Tue, 04 Feb 2025 03:00:26 GMT</pubDate>
    </item>
    <item>
      <title>0.1.11</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.11/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Tue, 04 Feb 2025 01:13:51 GMT</pubDate>
    </item>
    <item>
      <title>0.1.9</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.9/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Mon, 03 Feb 2025 00:32:29 GMT</pubDate>
    </item>
    <item>
      <title>0.1.8</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.8/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Mon, 27 Jan 2025 04:29:23 GMT</pubDate>
    </item>
    <item>
      <title>0.1.7</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.7/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Sat, 25 Jan 2025 18:58:50 GMT</pubDate>
    </item>
    <item>
      <title>0.1.6</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.6/</link>
      <description>TruthTorchLM is an open-source library designed to assess truthfulness in language models&#39; outputs. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Thu, 16 Jan 2025 04:36:11 GMT</pubDate>
    </item>
    <item>
      <title>0.1.5</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.5/</link>
      <description>TruthTorchLM is an open-source library designed to detect and mitigate hallucinations in text generation models. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Sun, 12 Jan 2025 02:55:29 GMT</pubDate>
    </item>
    <item>
      <title>0.1.4</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.4/</link>
      <description>TruthTorchLM is an open-source library designed to detect and mitigate hallucinations in text generation models. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Wed, 08 Jan 2025 18:47:16 GMT</pubDate>
    </item>
    <item>
      <title>0.1.3</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.3/</link>
      <description>TruthTorchLM is an open-source library designed to detect and mitigate hallucinations in text generation models. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Wed, 08 Jan 2025 11:40:07 GMT</pubDate>
    </item>
    <item>
      <title>0.1.2</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.2/</link>
      <description>TruthTorchLM is an open-source library designed to detect and mitigate hallucinations in text generation models. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Tue, 07 Jan 2025 13:53:51 GMT</pubDate>
    </item>
    <item>
      <title>0.1.1</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.1/</link>
      <description>TruthTorchLM is an open-source library designed to detect and mitigate hallucinations in text generation models. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Tue, 07 Jan 2025 13:18:45 GMT</pubDate>
    </item>
    <item>
      <title>0.0.0</title>
      <link>https://pypi.org/project/truthtorchlm/0.0.0/</link>
      <description>TruthTorchLM is an open-source library designed to detect and mitigate hallucinations in text generation models. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Tue, 07 Jan 2025 12:57:11 GMT</pubDate>
    </item>
    <item>
      <title>0.1.0</title>
      <link>https://pypi.org/project/truthtorchlm/0.1.0/</link>
      <description>TruthTorchLM is an open-source library designed to detect and mitigate hallucinations in text generation models. The library integrates state-of-the-art methods, offers comprehensive benchmarking tools across various tasks, and enables seamless integration with popular frameworks like Huggingface and LiteLLM.</description>
      <author>ybakman@usc.edu</author>
      <pubDate>Tue, 07 Jan 2025 12:50:30 GMT</pubDate>
    </item>
  </channel>
</rss>