Deploying to gh-pages from @ microsoft/graphrag@7c28c70d5c 🚀

AlonsoGuevara 2025-08-14 01:01:05 +00:00
parent da635f389f
commit 397358ad3a
11 changed files with 256 additions and 257 deletions

View File

@@ -1545,22 +1545,26 @@
<td>The library is Python-based.</td>
</tr>
<tr>
<td>Poetry</td>
<td><a href="https://python-poetry.org/docs/#installation">Instructions</a></td>
<td>Poetry is used for package management and virtualenv management in Python codebases</td>
<td>uv</td>
<td><a href="https://docs.astral.sh/uv/">Instructions</a></td>
<td>uv is used for package management and virtualenv management in Python codebases</td>
</tr>
</tbody>
</table>
<h1 id="getting-started">Getting Started</h1>
<h2 id="install-dependencies">Install Dependencies</h2>
<div class="highlight"><pre><span></span><code><a id="__codelineno-0-1" name="__codelineno-0-1" href="#__codelineno-0-1"></a><span class="c1"># Install Python dependencies.</span>
<a id="__codelineno-0-2" name="__codelineno-0-2" href="#__codelineno-0-2"></a>poetry<span class="w"> </span>install
<div class="highlight"><pre><span></span><code><a id="__codelineno-0-1" name="__codelineno-0-1" href="#__codelineno-0-1"></a><span class="c1"># (optional) create virtual environment</span>
<a id="__codelineno-0-2" name="__codelineno-0-2" href="#__codelineno-0-2"></a>uv<span class="w"> </span>venv<span class="w"> </span>--python<span class="w"> </span><span class="m">3</span>.10
<a id="__codelineno-0-3" name="__codelineno-0-3" href="#__codelineno-0-3"></a><span class="nb">source</span><span class="w"> </span>.venv/bin/activate
<a id="__codelineno-0-4" name="__codelineno-0-4" href="#__codelineno-0-4"></a>
<a id="__codelineno-0-5" name="__codelineno-0-5" href="#__codelineno-0-5"></a><span class="c1"># install python dependencies</span>
<a id="__codelineno-0-6" name="__codelineno-0-6" href="#__codelineno-0-6"></a>uv<span class="w"> </span>sync<span class="w"> </span>--extra<span class="w"> </span>dev
</code></pre></div>
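<p>As an optional sanity check after syncing (these are generic uv and poethepoet commands, not project-specific scripts), you can confirm the environment resolved correctly:</p>
<div class="highlight"><pre><code># confirm which interpreter the project environment uses
uv run python --version

# list the poe tasks configured for this repo
uv run poe --help
</code></pre></div>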
<h2 id="execute-the-indexing-engine">Execute the Indexing Engine</h2>
<div class="highlight"><pre><span></span><code><a id="__codelineno-1-1" name="__codelineno-1-1" href="#__codelineno-1-1"></a>poetry<span class="w"> </span>run<span class="w"> </span>poe<span class="w"> </span>index<span class="w"> </span>&lt;...args&gt;
<div class="highlight"><pre><span></span><code><a id="__codelineno-1-1" name="__codelineno-1-1" href="#__codelineno-1-1"></a>uv<span class="w"> </span>run<span class="w"> </span>poe<span class="w"> </span>index<span class="w"> </span>&lt;...args&gt;
</code></pre></div>
<h2 id="executing-queries">Executing Queries</h2>
<div class="highlight"><pre><span></span><code><a id="__codelineno-2-1" name="__codelineno-2-1" href="#__codelineno-2-1"></a>poetry<span class="w"> </span>run<span class="w"> </span>poe<span class="w"> </span>query<span class="w"> </span>&lt;...args&gt;
<div class="highlight"><pre><span></span><code><a id="__codelineno-2-1" name="__codelineno-2-1" href="#__codelineno-2-1"></a>uv<span class="w"> </span>run<span class="w"> </span>poe<span class="w"> </span>query<span class="w"> </span>&lt;...args&gt;
</code></pre></div>
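<p>Arguments after <code>query</code> are forwarded to the Query CLI. As an illustrative sketch only (the flag names below are assumptions; consult the CLI help for the exact options), a global search might look like:</p>
<div class="highlight"><pre><code># hypothetical invocation; check the CLI help for the real flag names
uv run poe query --method global --query "What are the top themes in this dataset?"
</code></pre></div>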
<h1 id="azurite">Azurite</h1>
<p>Some unit and smoke tests use Azurite to emulate Azure resources. This can be started by running:</p>
@@ -1568,36 +1572,33 @@
</code></pre></div>
<p>or by simply running <code>azurite</code> in the terminal if already installed globally. See the <a href="https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azurite">Azurite documentation</a> for more information about how to install and use Azurite.</p>
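<p>If you would rather not install Azurite globally, it can also be run as a container. This is a general Azurite usage sketch based on its public Docker image and default ports, not a command taken from this repository:</p>
<div class="highlight"><pre><code># start Azurite in Docker with the blob, queue, and table endpoints exposed
docker run -p 10000:10000 -p 10001:10001 -p 10002:10002 mcr.microsoft.com/azure-storage/azurite
</code></pre></div>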
<h1 id="lifecycle-scripts">Lifecycle Scripts</h1>
<p>Our Python package utilizes Poetry to manage dependencies and <a href="https://pypi.org/project/poethepoet/">poethepoet</a> to manage build scripts.</p>
<p>Our Python package utilizes uv to manage dependencies and <a href="https://pypi.org/project/poethepoet/">poethepoet</a> to manage build scripts.</p>
<p>Available scripts are:</p>
<ul>
<li><code>poetry run poe index</code> - Run the Indexing CLI</li>
<li><code>poetry run poe query</code> - Run the Query CLI</li>
<li><code>poetry build</code> - This invokes <code>poetry build</code>, which will build a wheel file and other distributable artifacts.</li>
<li><code>poetry run poe test</code> - This will execute all tests.</li>
<li><code>poetry run poe test_unit</code> - This will execute unit tests.</li>
<li><code>poetry run poe test_integration</code> - This will execute integration tests.</li>
<li><code>poetry run poe test_smoke</code> - This will execute smoke tests.</li>
<li><code>poetry run poe test_verbs</code> - This will execute tests of the basic workflows.</li>
<li><code>poetry run poe check</code> - This will perform a suite of static checks across the package, including:</li>
<li><code>uv run poe index</code> - Run the Indexing CLI</li>
<li><code>uv run poe query</code> - Run the Query CLI</li>
<li><code>uv build</code> - This will build a wheel file and other distributable artifacts.</li>
<li><code>uv run poe test</code> - This will execute all tests.</li>
<li><code>uv run poe test_unit</code> - This will execute unit tests.</li>
<li><code>uv run poe test_integration</code> - This will execute integration tests.</li>
<li><code>uv run poe test_smoke</code> - This will execute smoke tests.</li>
<li><code>uv run poe test_verbs</code> - This will execute tests of the basic workflows.</li>
<li><code>uv run poe check</code> - This will perform a suite of static checks across the package, including:</li>
<li>formatting</li>
<li>documentation formatting</li>
<li>linting</li>
<li>security patterns</li>
<li>type-checking</li>
<li><code>poetry run poe fix</code> - This will apply any available auto-fixes to the package. Usually this is just formatting fixes.</li>
<li><code>poetry run poe fix_unsafe</code> - This will apply any available auto-fixes to the package, including those that may be unsafe.</li>
<li><code>poetry run poe format</code> - Explicitly run the formatter across the package.</li>
<li><code>uv run poe fix</code> - This will apply any available auto-fixes to the package. Usually this is just formatting fixes.</li>
<li><code>uv run poe fix_unsafe</code> - This will apply any available auto-fixes to the package, including those that may be unsafe.</li>
<li><code>uv run poe format</code> - Explicitly run the formatter across the package.</li>
</ul>
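<p>As a typical local workflow before opening a PR (an assumed sequence using the scripts above, not a documented requirement), the fixers and checks can be chained:</p>
<div class="highlight"><pre><code># apply safe auto-fixes, then run the static checks and the unit tests
uv run poe fix
uv run poe check
uv run poe test_unit
</code></pre></div>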
<h2 id="troubleshooting">Troubleshooting</h2>
<h3 id="runtimeerror-llvm-config-failed-executing-please-point-llvm_config-to-the-path-for-llvm-config-when-running-poetry-install">"RuntimeError: llvm-config failed executing, please point LLVM_CONFIG to the path for llvm-config" when running poetry install</h3>
<h3 id="runtimeerror-llvm-config-failed-executing-please-point-llvm_config-to-the-path-for-llvm-config-when-running-uv-install">"RuntimeError: llvm-config failed executing, please point LLVM_CONFIG to the path for llvm-config" when running uv install</h3>
<p>Make sure llvm-9 and llvm-9-dev are installed:</p>
<p><code>sudo apt-get install llvm-9 llvm-9-dev</code></p>
<p>and then in your bashrc, add</p>
<p><code>export LLVM_CONFIG=/usr/bin/llvm-config-9</code></p>
<h3 id="numba_pymoduleh610-fatal-error-pythonh-no-such-file-or-directory-when-running-poetry-install">"numba/_pymodule.h:6:10: fatal error: Python.h: No such file or directory" when running poetry install</h3>
<p>Make sure you have python3.10-dev installed, or more generally <code>python&lt;version&gt;-dev</code>.</p>
<p><code>sudo apt-get install python3.10-dev</code></p>
<h3 id="llm-call-constantly-exceeds-tpm-rpm-or-time-limits">LLM call constantly exceeds TPM, RPM or time limits</h3>
<p><code>GRAPHRAG_LLM_THREAD_COUNT</code> and <code>GRAPHRAG_EMBEDDING_THREAD_COUNT</code> are both set to 50 by default. You can lower these values
to reduce concurrency. Please refer to the <a href="../config/overview/">Configuration Documents</a> for details.</p>
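<p>For example, assuming these settings are exposed as environment variables (as the <code>GRAPHRAG_</code> prefix suggests; the Configuration Documents are authoritative), concurrency can be capped with illustrative values such as:</p>
<div class="highlight"><pre><code># example values only; tune to your provider's TPM/RPM limits
export GRAPHRAG_LLM_THREAD_COUNT=4
export GRAPHRAG_EMBEDDING_THREAD_COUNT=4
</code></pre></div>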

View File

@@ -2375,7 +2375,7 @@ response, context = await api.global_search(
<span class="ansi-green-fg"> 4</span> <span class="ansi-yellow-fg">f</span><span class="ansi-yellow-fg">"</span><span class="ansi-bold" style="color: rgb(175,95,135)">{</span>PROJECT_DIRECTORY<span class="ansi-bold" style="color: rgb(175,95,135)">}</span><span class="ansi-yellow-fg">/output/community_reports.parquet</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 5</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-green-fg"> 666</span> use_nullable_dtypes = <span class="ansi-bold" style="color: rgb(0,135,0)">False</span>
<span class="ansi-green-fg"> 667</span> check_dtype_backend(dtype_backend)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">669</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-yellow-bg">impl</span><span class="ansi-yellow-bg">.</span><span class="ansi-yellow-bg">read</span><span class="ansi-yellow-bg">(</span>
@@ -2389,7 +2389,7 @@ response, context = await api.global_search(
<span class="ansi-green-fg"> 677</span> <span class="ansi-yellow-bg"> </span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">kwargs</span><span class="ansi-yellow-bg">,</span>
<span class="ansi-green-fg"> 678</span> <span class="ansi-yellow-bg">)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-green-fg"> 256</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> manager == <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">array</span><span class="ansi-yellow-fg">"</span>:
<span class="ansi-green-fg"> 257</span> to_pandas_kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">split_blocks</span><span class="ansi-yellow-fg">"</span>] = <span class="ansi-bold" style="color: rgb(0,135,0)">True</span>
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">258</span> path_or_handle, handles, filesystem = <span class="ansi-yellow-bg">_get_path_or_handle</span><span class="ansi-yellow-bg">(</span>
@@ -2405,7 +2405,7 @@ response, context = await api.global_search(
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 270</span> **kwargs,
<span class="ansi-green-fg"> 271</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-green-fg"> 131</span> handles = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 132</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> (
<span class="ansi-green-fg"> 133</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> fs
@@ -2418,7 +2418,7 @@ response, context = await api.global_search(
<span class="ansi-green-fg"> 144</span> fs = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 145</span> path_or_handle = handles.handle
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-green-fg"> 873</span> handle = <span style="color: rgb(0,135,0)">open</span>(
<span class="ansi-green-fg"> 874</span> handle,
<span class="ansi-green-fg"> 875</span> ioargs.mode,

View File

@@ -2569,38 +2569,38 @@ search = DRIFTSearch(
<span class="ansi-green-fg"> 83</span> <span class="ansi-bold" style="color: rgb(0,135,0)">else</span>:
<span class="ansi-green-fg"> 84</span> response = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.model(prompt, history=history, **kwargs)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_chat_llm.py:94</span>, in <span class="ansi-cyan-fg">OpenAIChatLLMImpl.__call__</span><span class="ansi-blue-fg">(self, prompt, stream, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_chat_llm.py:94</span>, in <span class="ansi-cyan-fg">OpenAIChatLLMImpl.__call__</span><span class="ansi-blue-fg">(self, prompt, stream, **kwargs)</span>
<span class="ansi-green-fg"> 91</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> stream:
<span class="ansi-green-fg"> 92</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._streaming_chat_llm(prompt, **kwargs)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">94</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._text_chat_llm(prompt, **kwargs)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/services/openai_tools_parsing.py:130</span>, in <span class="ansi-cyan-fg">OpenAIParseToolsLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/services/openai_tools_parsing.py:130</span>, in <span class="ansi-cyan-fg">OpenAIParseToolsLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 127</span> tools = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">tools</span><span class="ansi-yellow-fg">"</span>, [])
<span class="ansi-green-fg"> 129</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> tools:
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">130</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._delegate(prompt, **kwargs)
<span class="ansi-green-fg"> 132</span> completion_parameters = <span style="color: rgb(0,135,0)">self</span>._add_tools_to_parameters(kwargs, tools)
<span class="ansi-green-fg"> 134</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._delegate(prompt, **completion_parameters)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 142</span> <span class="ansi-bold" style="color: rgb(0,135,0)">try</span>:
<span class="ansi-green-fg"> 143</span> prompt, kwargs = <span style="color: rgb(0,135,0)">self</span>._rewrite_input(prompt, kwargs)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">144</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._decorated_target(prompt, **kwargs)
<span class="ansi-green-fg"> 145</span> <span class="ansi-bold" style="color: rgb(0,135,0)">except</span> <span class="ansi-bold" style="color: rgb(215,95,95)">BaseException</span> <span class="ansi-bold" style="color: rgb(0,135,0)">as</span> e:
<span class="ansi-green-fg"> 146</span> stack_trace = traceback.format_exc()
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/json.py:78</span>, in <span class="ansi-cyan-fg">JsonReceiver.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/json.py:78</span>, in <span class="ansi-cyan-fg">JsonReceiver.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **kwargs)</span>
<span class="ansi-green-fg"> 76</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">json_model</span><span class="ansi-yellow-fg">"</span>) <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> <span class="ansi-bold" style="color: rgb(175,0,255)">or</span> kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">json</span><span class="ansi-yellow-fg">"</span>):
<span class="ansi-green-fg"> 77</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> this.invoke_json(delegate, prompt, kwargs)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">78</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **kwargs)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-green-fg"> 73</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">with</span> <span style="color: rgb(0,135,0)">self</span>._limiter.use(manifest):
<span class="ansi-green-fg"> 74</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_acquired(manifest)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">75</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **args)
<span class="ansi-green-fg"> 76</span> <span class="ansi-bold" style="color: rgb(0,135,0)">finally</span>:
<span class="ansi-green-fg"> 77</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_released(manifest)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 121</span> <span class="ansi-yellow-fg">"""Target for the decorator chain.</span>
<span class="ansi-green-fg"> 122</span>
<span class="ansi-green-fg"> 123</span> <span class="ansi-yellow-fg">Leave signature alone as prompt, kwargs.</span>
@@ -2610,22 +2610,22 @@ search = DRIFTSearch(
<span class="ansi-green-fg"> 127</span> result = LLMOutput(output=output)
<span class="ansi-green-fg"> 128</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._inject_usage(result)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_text_chat_llm.py:166</span>, in <span class="ansi-cyan-fg">OpenAITextChatLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-green-fg"> 163</span> local_model_parameters = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model_parameters</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg"> 164</span> parameters = <span style="color: rgb(0,135,0)">self</span>._build_completion_parameters(local_model_parameters)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">166</span> raw_response = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._client.chat.completions.with_raw_response.create(
<span class="ansi-green-fg"> 167</span> messages=cast(Iterator[ChatCompletionMessageParam], messages),
<span class="ansi-green-fg"> 168</span> **parameters,
<span class="ansi-green-fg"> 169</span> )
<span class="ansi-green-fg"> 170</span> completion = raw_response.parse()
<span class="ansi-green-fg"> 171</span> headers = raw_response.headers
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_text_chat_llm.py:173</span>, in <span class="ansi-cyan-fg">OpenAITextChatLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-green-fg"> 170</span> local_model_parameters = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model_parameters</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg"> 171</span> parameters = <span style="color: rgb(0,135,0)">self</span>._build_completion_parameters(local_model_parameters)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">173</span> raw_response = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._client.chat.completions.with_raw_response.create(
<span class="ansi-green-fg"> 174</span> messages=cast(Iterator[ChatCompletionMessageParam], messages),
<span class="ansi-green-fg"> 175</span> **parameters,
<span class="ansi-green-fg"> 176</span> )
<span class="ansi-green-fg"> 177</span> completion = raw_response.parse()
<span class="ansi-green-fg"> 178</span> headers = raw_response.headers
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-green-fg"> 377</span> extra_headers[RAW_RESPONSE_HEADER] = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">true</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 379</span> kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">extra_headers</span><span class="ansi-yellow-fg">"</span>] = extra_headers
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">381</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> cast(LegacyAPIResponse[R], <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> func(*args, **kwargs))
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py:2454</span>, in <span class="ansi-cyan-fg">AsyncCompletions.create</span><span class="ansi-blue-fg">(self, messages, model, audio, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, metadata, modalities, n, parallel_tool_calls, prediction, presence_penalty, reasoning_effort, response_format, seed, service_tier, stop, store, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, web_search_options, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py:2454</span>, in <span class="ansi-cyan-fg">AsyncCompletions.create</span><span class="ansi-blue-fg">(self, messages, model, audio, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, metadata, modalities, n, parallel_tool_calls, prediction, presence_penalty, reasoning_effort, response_format, seed, service_tier, stop, store, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, web_search_options, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 2411</span> <span style="color: rgb(175,0,255)">@required_args</span>([<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">messages</span><span class="ansi-yellow-fg">"</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model</span><span class="ansi-yellow-fg">"</span>], [<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">messages</span><span class="ansi-yellow-fg">"</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model</span><span class="ansi-yellow-fg">"</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">stream</span><span class="ansi-yellow-fg">"</span>])
<span class="ansi-green-fg"> 2412</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">create</span>(
<span class="ansi-green-fg"> 2413</span> <span style="color: rgb(0,135,0)">self</span>,
@@ -2680,23 +2680,23 @@ search = DRIFTSearch(
<span class="ansi-green-fg"> 2499</span> stream_cls=AsyncStream[ChatCompletionChunk],
<span class="ansi-green-fg"> 2500</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1784</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1770</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1771</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1772</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1779</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1780</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1781</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1782</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1783</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1784</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1791</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1777</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1778</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1779</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1786</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1787</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1788</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1789</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1790</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1791</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1584</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1581</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1583</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1584</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1586</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1591</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1590</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1591</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1593</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1595</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-red-fg">AuthenticationError</span>: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-proj-********************************************************************************************************************************************************zWYA. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}</pre>
</div>

View File

@@ -2706,7 +2706,7 @@ print(result.response)</div>
<div class="jp-OutputArea-child">
<div class="jp-OutputPrompt jp-OutputArea-prompt"></div>
<div class="jp-RenderedText jp-OutputArea-output" data-mime-type="text/plain" tabindex="0">
<pre>2025-08-08 23:01:05.0881 - ERROR - graphrag.query.structured_search.global_search.search - Exception in _map_response_single_batch
<pre>2025-08-14 00:58:35.0530 - ERROR - graphrag.query.structured_search.global_search.search - Exception in _map_response_single_batch
Traceback (most recent call last):
File "/home/runner/work/graphrag/graphrag/graphrag/query/structured_search/global_search/search.py", line 227, in _map_response_single_batch
model_response = await self.model.achat(
@@ -2714,43 +2714,43 @@ Traceback (most recent call last):
File "/home/runner/work/graphrag/graphrag/graphrag/language_model/providers/fnllm/models.py", line 84, in achat
response = await self.model(prompt, history=history, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_chat_llm.py", line 94, in __call__
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_chat_llm.py", line 94, in __call__
return await self._text_chat_llm(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/services/openai_tools_parsing.py", line 130, in __call__
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/services/openai_tools_parsing.py", line 130, in __call__
return await self._delegate(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py", line 144, in __call__
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py", line 144, in __call__
return await self._decorated_target(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/json.py", line 77, in invoke
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/json.py", line 77, in invoke
return await this.invoke_json(delegate, prompt, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/json.py", line 96, in invoke_json
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/json.py", line 96, in invoke_json
return await self.try_receive_json(delegate, prompt, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/json.py", line 162, in try_receive_json
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/json.py", line 162, in try_receive_json
result = await delegate(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py", line 75, in invoke
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py", line 75, in invoke
result = await delegate(prompt, **args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py", line 126, in _decorator_target
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py", line 126, in _decorator_target
output = await self._execute_llm(prompt, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_text_chat_llm.py", line 166, in _execute_llm
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_text_chat_llm.py", line 173, in _execute_llm
raw_response = await self._client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_legacy_response.py", line 381, in wrapped
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_legacy_response.py", line 381, in wrapped
return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 2454, in create
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 2454, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 1784, in post
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1791, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 1584, in request
File "/home/runner/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1591, in request
raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-proj-********************************************************************************************************************************************************zWYA. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
</pre>
@@ -2759,7 +2759,7 @@ openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect A
<div class="jp-OutputArea-child">
<div class="jp-OutputPrompt jp-OutputArea-prompt"></div>
<div class="jp-RenderedText jp-OutputArea-output" data-mime-type="text/plain" tabindex="0">
<pre>2025-08-08 23:01:05.0885 - WARNING - graphrag.query.structured_search.global_search.search - Warning: All map responses have score 0 (i.e., no relevant information found from the dataset), returning a canned 'I do not know' answer. You can try enabling `allow_general_knowledge` to encourage the LLM to incorporate relevant general knowledge, at the risk of increasing hallucinations.
<pre>2025-08-14 00:58:35.0534 - WARNING - graphrag.query.structured_search.global_search.search - Warning: All map responses have score 0 (i.e., no relevant information found from the dataset), returning a canned 'I do not know' answer. You can try enabling `allow_general_knowledge` to encourage the LLM to incorporate relevant general knowledge, at the risk of increasing hallucinations.
</pre>
</div>
</div>

View File

@@ -2694,26 +2694,26 @@ print(result.response)</div>
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 94</span> metrics=response.metrics,
<span class="ansi-green-fg"> 95</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_chat_llm.py:94</span>, in <span class="ansi-cyan-fg">OpenAIChatLLMImpl.__call__</span><span class="ansi-blue-fg">(self, prompt, stream, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_chat_llm.py:94</span>, in <span class="ansi-cyan-fg">OpenAIChatLLMImpl.__call__</span><span class="ansi-blue-fg">(self, prompt, stream, **kwargs)</span>
<span class="ansi-green-fg"> 91</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> stream:
<span class="ansi-green-fg"> 92</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._streaming_chat_llm(prompt, **kwargs)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">94</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._text_chat_llm(prompt, **kwargs)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/services/openai_tools_parsing.py:130</span>, in <span class="ansi-cyan-fg">OpenAIParseToolsLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/services/openai_tools_parsing.py:130</span>, in <span class="ansi-cyan-fg">OpenAIParseToolsLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 127</span> tools = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">tools</span><span class="ansi-yellow-fg">"</span>, [])
<span class="ansi-green-fg"> 129</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> tools:
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">130</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._delegate(prompt, **kwargs)
<span class="ansi-green-fg"> 132</span> completion_parameters = <span style="color: rgb(0,135,0)">self</span>._add_tools_to_parameters(kwargs, tools)
<span class="ansi-green-fg"> 134</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._delegate(prompt, **completion_parameters)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 142</span> <span class="ansi-bold" style="color: rgb(0,135,0)">try</span>:
<span class="ansi-green-fg"> 143</span> prompt, kwargs = <span style="color: rgb(0,135,0)">self</span>._rewrite_input(prompt, kwargs)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">144</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._decorated_target(prompt, **kwargs)
<span class="ansi-green-fg"> 145</span> <span class="ansi-bold" style="color: rgb(0,135,0)">except</span> <span class="ansi-bold" style="color: rgb(215,95,95)">BaseException</span> <span class="ansi-bold" style="color: rgb(0,135,0)">as</span> e:
<span class="ansi-green-fg"> 146</span> stack_trace = traceback.format_exc()
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/json.py:77</span>, in <span class="ansi-cyan-fg">JsonReceiver.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/json.py:77</span>, in <span class="ansi-cyan-fg">JsonReceiver.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **kwargs)</span>
<span class="ansi-green-fg"> 72</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">invoke</span>(
<span class="ansi-green-fg"> 73</span> prompt: TInput,
<span class="ansi-green-fg"> 74</span> **kwargs: Unpack[LLMInput[TJsonModel, THistoryEntry, TModelParameters]],
@@ -2722,28 +2722,28 @@ print(result.response)</div>
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">77</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> this.invoke_json(delegate, prompt, kwargs)
<span class="ansi-green-fg"> 78</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **kwargs)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/json.py:96</span>, in <span class="ansi-cyan-fg">JsonReceiver.invoke_json</span><span class="ansi-blue-fg">(self, delegate, prompt, kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/json.py:96</span>, in <span class="ansi-cyan-fg">JsonReceiver.invoke_json</span><span class="ansi-blue-fg">(self, delegate, prompt, kwargs)</span>
<span class="ansi-green-fg"> 94</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> attempt &gt; <span class="ansi-green-fg">0</span>:
<span class="ansi-green-fg"> 95</span> kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">bust_cache</span><span class="ansi-yellow-fg">"</span>] = <span class="ansi-bold" style="color: rgb(0,135,0)">True</span>
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">96</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.try_receive_json(delegate, prompt, kwargs)
<span class="ansi-green-fg"> 97</span> <span class="ansi-bold" style="color: rgb(0,135,0)">except</span> FailedToGenerateValidJsonError <span class="ansi-bold" style="color: rgb(0,135,0)">as</span> e:
<span class="ansi-green-fg"> 98</span> error = e
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/json.py:162</span>, in <span class="ansi-cyan-fg">LooseModeJsonReceiver.try_receive_json</span><span class="ansi-blue-fg">(self, delegate, prompt, kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/json.py:162</span>, in <span class="ansi-cyan-fg">LooseModeJsonReceiver.try_receive_json</span><span class="ansi-blue-fg">(self, delegate, prompt, kwargs)</span>
<span class="ansi-green-fg"> 159</span> <span class="ansi-yellow-fg">"""Invoke the JSON decorator."""</span>
<span class="ansi-green-fg"> 160</span> json_model = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">json_model</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">162</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **kwargs)
<span class="ansi-green-fg"> 163</span> json_string = <span style="color: rgb(0,135,0)">self</span>._marshaler.extract_json_string(result)
<span class="ansi-green-fg"> 164</span> <span class="ansi-bold" style="color: rgb(0,135,0)">try</span>:
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-green-fg"> 73</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">with</span> <span style="color: rgb(0,135,0)">self</span>._limiter.use(manifest):
<span class="ansi-green-fg"> 74</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_acquired(manifest)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">75</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **args)
<span class="ansi-green-fg"> 76</span> <span class="ansi-bold" style="color: rgb(0,135,0)">finally</span>:
<span class="ansi-green-fg"> 77</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_released(manifest)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 121</span> <span class="ansi-yellow-fg">"""Target for the decorator chain.</span>
<span class="ansi-green-fg"> 122</span>
<span class="ansi-green-fg"> 123</span> <span class="ansi-yellow-fg">Leave signature alone as prompt, kwargs.</span>
@@ -2753,22 +2753,22 @@ print(result.response)</div>
<span class="ansi-green-fg"> 127</span> result = LLMOutput(output=output)
<span class="ansi-green-fg"> 128</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._inject_usage(result)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_text_chat_llm.py:166</span>, in <span class="ansi-cyan-fg">OpenAITextChatLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-green-fg"> 163</span> local_model_parameters = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model_parameters</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg"> 164</span> parameters = <span style="color: rgb(0,135,0)">self</span>._build_completion_parameters(local_model_parameters)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">166</span> raw_response = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._client.chat.completions.with_raw_response.create(
<span class="ansi-green-fg"> 167</span> messages=cast(Iterator[ChatCompletionMessageParam], messages),
<span class="ansi-green-fg"> 168</span> **parameters,
<span class="ansi-green-fg"> 169</span> )
<span class="ansi-green-fg"> 170</span> completion = raw_response.parse()
<span class="ansi-green-fg"> 171</span> headers = raw_response.headers
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_text_chat_llm.py:173</span>, in <span class="ansi-cyan-fg">OpenAITextChatLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-green-fg"> 170</span> local_model_parameters = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model_parameters</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg"> 171</span> parameters = <span style="color: rgb(0,135,0)">self</span>._build_completion_parameters(local_model_parameters)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">173</span> raw_response = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._client.chat.completions.with_raw_response.create(
<span class="ansi-green-fg"> 174</span> messages=cast(Iterator[ChatCompletionMessageParam], messages),
<span class="ansi-green-fg"> 175</span> **parameters,
<span class="ansi-green-fg"> 176</span> )
<span class="ansi-green-fg"> 177</span> completion = raw_response.parse()
<span class="ansi-green-fg"> 178</span> headers = raw_response.headers
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-green-fg"> 377</span> extra_headers[RAW_RESPONSE_HEADER] = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">true</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 379</span> kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">extra_headers</span><span class="ansi-yellow-fg">"</span>] = extra_headers
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">381</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> cast(LegacyAPIResponse[R], <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> func(*args, **kwargs))
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py:2454</span>, in <span class="ansi-cyan-fg">AsyncCompletions.create</span><span class="ansi-blue-fg">(self, messages, model, audio, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, metadata, modalities, n, parallel_tool_calls, prediction, presence_penalty, reasoning_effort, response_format, seed, service_tier, stop, store, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, web_search_options, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py:2454</span>, in <span class="ansi-cyan-fg">AsyncCompletions.create</span><span class="ansi-blue-fg">(self, messages, model, audio, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, metadata, modalities, n, parallel_tool_calls, prediction, presence_penalty, reasoning_effort, response_format, seed, service_tier, stop, store, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, web_search_options, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 2411</span> <span style="color: rgb(175,0,255)">@required_args</span>([<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">messages</span><span class="ansi-yellow-fg">"</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model</span><span class="ansi-yellow-fg">"</span>], [<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">messages</span><span class="ansi-yellow-fg">"</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model</span><span class="ansi-yellow-fg">"</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">stream</span><span class="ansi-yellow-fg">"</span>])
<span class="ansi-green-fg"> 2412</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">create</span>(
<span class="ansi-green-fg"> 2413</span> <span style="color: rgb(0,135,0)">self</span>,
@ -2823,23 +2823,23 @@ print(result.response)</div>
<span class="ansi-green-fg"> 2499</span> stream_cls=AsyncStream[ChatCompletionChunk],
<span class="ansi-green-fg"> 2500</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1784</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1770</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1771</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1772</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1779</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1780</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1781</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1782</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1783</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1784</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1791</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1777</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1778</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1779</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1786</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1787</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1788</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1789</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1790</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1791</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1584</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1581</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1583</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1584</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1586</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1591</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1590</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1591</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1593</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1595</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-red-fg">AuthenticationError</span>: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-proj-********************************************************************************************************************************************************zWYA. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}</pre>
</div>
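<p>The 401 <code>AuthenticationError</code> captured in the rendered output above comes from the notebook being executed with an invalid OpenAI API key (the message itself reports "Incorrect API key provided" with the key redacted). As a minimal sketch, assuming the key is supplied through an environment variable such as <code>GRAPHRAG_API_KEY</code> or <code>OPENAI_API_KEY</code> rather than hard-coded in the notebook, a cell can fail fast with a clear message before any API call is made:</p>
<div class="highlight"><pre><code># Minimal sketch: fail fast when no OpenAI-compatible API key is configured.
# Assumes the key is provided via GRAPHRAG_API_KEY or OPENAI_API_KEY; adjust
# this to match your own settings.yaml / .env setup.
import os

api_key = os.environ.get("GRAPHRAG_API_KEY") or os.environ.get("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError(
        "No API key found; the cells below would fail with the 401 "
        "AuthenticationError shown above."
    )
</code></pre></div>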


@ -2485,10 +2485,11 @@ await write_table_to_storage(
</div>
</clipboard-copy>
</div>
<div class="highlight-ipynb hl-python"><pre><span></span><span class="kn">from</span><span class="w"> </span><span class="nn">graphrag.cache.factory</span><span class="w"> </span><span class="kn">import</span> <span class="n">CacheFactory</span>
<div class="highlight-ipynb hl-python"><pre><span></span><span class="kn">from</span><span class="w"> </span><span class="nn">graphrag.index.flows.generate_text_embeddings</span><span class="w"> </span><span class="kn">import</span> <span class="n">generate_text_embeddings</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">graphrag.cache.factory</span><span class="w"> </span><span class="kn">import</span> <span class="n">CacheFactory</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">graphrag.callbacks.noop_workflow_callbacks</span><span class="w"> </span><span class="kn">import</span> <span class="n">NoopWorkflowCallbacks</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">graphrag.config.embeddings</span><span class="w"> </span><span class="kn">import</span> <span class="n">get_embedded_fields</span><span class="p">,</span> <span class="n">get_embedding_settings</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">graphrag.index.flows.generate_text_embeddings</span><span class="w"> </span><span class="kn">import</span> <span class="n">generate_text_embeddings</span>
<span class="c1"># We only need to re-run the embeddings workflow, to ensure that embeddings for all required search fields are in place</span>
<span class="c1"># We'll construct the context and run this function flow directly to avoid everything else</span>
@ -2518,10 +2519,11 @@ await write_table_to_storage(
<span class="n">snapshot_embeddings_enabled</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span>
<span class="p">)</span>
</pre></div>
<div class="clipboard-copy-txt" id="cell-7">from graphrag.cache.factory import CacheFactory
<div class="clipboard-copy-txt" id="cell-7">from graphrag.index.flows.generate_text_embeddings import generate_text_embeddings
from graphrag.cache.factory import CacheFactory
from graphrag.callbacks.noop_workflow_callbacks import NoopWorkflowCallbacks
from graphrag.config.embeddings import get_embedded_fields, get_embedding_settings
from graphrag.index.flows.generate_text_embeddings import generate_text_embeddings
# We only need to re-run the embeddings workflow, to ensure that embeddings for all required search fields are in place
# We'll construct the context and run this function flow directly to avoid everything else
@ -2563,16 +2565,13 @@ await generate_text_embeddings(
<div class="jp-RenderedText jp-OutputArea-output" data-mime-type="application/vnd.jupyter.stderr" tabindex="0">
<pre>
<span class="ansi-red-fg">---------------------------------------------------------------------------</span>
<span class="ansi-red-fg">ImportError</span> Traceback (most recent call last)
<span class="ansi-cyan-fg">Cell</span><span class="ansi-cyan-fg"> </span><span class="ansi-green-fg">In[7]</span><span class="ansi-green-fg">, line 3</span>
<span class="ansi-green-fg"> 1</span> <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-intense-fg ansi-bold">graphrag</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">cache</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">factory</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">import</span> CacheFactory
<span class="ansi-green-fg"> 2</span> <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-intense-fg ansi-bold">graphrag</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">callbacks</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">noop_workflow_callbacks</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">import</span> NoopWorkflowCallbacks
<span class="ansi-green-fg">----&gt; </span><span class="ansi-green-fg">3</span> <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-intense-fg ansi-bold">graphrag</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">config</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">embeddings</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">import</span> get_embedded_fields, get_embedding_settings
<span class="ansi-green-fg"> 4</span> <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-intense-fg ansi-bold">graphrag</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">index</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">flows</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">generate_text_embeddings</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">import</span> generate_text_embeddings
<span class="ansi-green-fg"> 6</span> <span style="color: rgb(95,135,135)"># We only need to re-run the embeddings workflow, to ensure that embeddings for all required search fields are in place</span>
<span class="ansi-green-fg"> 7</span> <span style="color: rgb(95,135,135)"># We'll construct the context and run this function flow directly to avoid everything else</span>
<span class="ansi-red-fg">ModuleNotFoundError</span> Traceback (most recent call last)
<span class="ansi-cyan-fg">Cell</span><span class="ansi-cyan-fg"> </span><span class="ansi-green-fg">In[7]</span><span class="ansi-green-fg">, line 1</span>
<span class="ansi-green-fg">----&gt; </span><span class="ansi-green-fg">1</span> <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-intense-fg ansi-bold">graphrag</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">index</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">flows</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">generate_text_embeddings</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">import</span> generate_text_embeddings
<span class="ansi-green-fg"> 3</span> <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-intense-fg ansi-bold">graphrag</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">cache</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">factory</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">import</span> CacheFactory
<span class="ansi-green-fg"> 4</span> <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-intense-fg ansi-bold">graphrag</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">callbacks</span><span class="ansi-blue-intense-fg ansi-bold">.</span><span class="ansi-blue-intense-fg ansi-bold">noop_workflow_callbacks</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">import</span> NoopWorkflowCallbacks
<span class="ansi-red-fg">ImportError</span>: cannot import name 'get_embedded_fields' from 'graphrag.config.embeddings' (/home/runner/work/graphrag/graphrag/graphrag/config/embeddings.py)</pre>
<span class="ansi-red-fg">ModuleNotFoundError</span>: No module named 'graphrag.index.flows'</pre>
</div>
</div>
</div>
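<p>Both sides of this hunk show the same cell failing at import time: the previous build could not import <code>get_embedded_fields</code> from <code>graphrag.config.embeddings</code>, while the updated build stops one line earlier because <code>graphrag.index.flows</code> no longer resolves. A small self-contained probe, sketched below with the module names copied verbatim from the cell (it makes no claim about which layout the installed graphrag wheel actually provides), surfaces such mismatches before the workflow is run:</p>
<div class="highlight"><pre><code># Minimal sketch: check that the modules referenced by the cell above resolve
# in the current environment, so a stale module layout shows up as a clear
# report instead of a mid-notebook traceback.
import importlib.util

for name in (
    "graphrag.index.flows.generate_text_embeddings",
    "graphrag.config.embeddings",
    "graphrag.cache.factory",
    "graphrag.callbacks.noop_workflow_callbacks",
):
    try:
        found = importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in the dotted path is missing.
        found = False
    print(f"{name}: {'ok' if found else 'MISSING'}")
</code></pre></div>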


@ -3567,21 +3567,21 @@ print(result.response)</div>
<span class="ansi-green-fg"> 208</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> response.output.embeddings <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>:
<span class="ansi-green-fg"> 209</span> msg = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">No embeddings found in response</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 142</span> <span class="ansi-bold" style="color: rgb(0,135,0)">try</span>:
<span class="ansi-green-fg"> 143</span> prompt, kwargs = <span style="color: rgb(0,135,0)">self</span>._rewrite_input(prompt, kwargs)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">144</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._decorated_target(prompt, **kwargs)
<span class="ansi-green-fg"> 145</span> <span class="ansi-bold" style="color: rgb(0,135,0)">except</span> <span class="ansi-bold" style="color: rgb(215,95,95)">BaseException</span> <span class="ansi-bold" style="color: rgb(0,135,0)">as</span> e:
<span class="ansi-green-fg"> 146</span> stack_trace = traceback.format_exc()
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-green-fg"> 73</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">with</span> <span style="color: rgb(0,135,0)">self</span>._limiter.use(manifest):
<span class="ansi-green-fg"> 74</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_acquired(manifest)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">75</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **args)
<span class="ansi-green-fg"> 76</span> <span class="ansi-bold" style="color: rgb(0,135,0)">finally</span>:
<span class="ansi-green-fg"> 77</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_released(manifest)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 121</span> <span class="ansi-yellow-fg">"""Target for the decorator chain.</span>
<span class="ansi-green-fg"> 122</span>
<span class="ansi-green-fg"> 123</span> <span class="ansi-yellow-fg">Leave signature alone as prompt, kwargs.</span>
@ -3591,7 +3591,7 @@ print(result.response)</div>
<span class="ansi-green-fg"> 127</span> result = LLMOutput(output=output)
<span class="ansi-green-fg"> 128</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._inject_usage(result)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_embeddings_llm.py:126</span>, in <span class="ansi-cyan-fg">OpenAIEmbeddingsLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_embeddings_llm.py:126</span>, in <span class="ansi-cyan-fg">OpenAIEmbeddingsLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-green-fg"> 121</span> local_model_parameters = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model_parameters</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg"> 122</span> embeddings_parameters = <span style="color: rgb(0,135,0)">self</span>._build_embeddings_parameters(
<span class="ansi-green-fg"> 123</span> local_model_parameters
@ -3603,46 +3603,46 @@ print(result.response)</div>
<span class="ansi-green-fg"> 130</span> result = result_raw.parse()
<span class="ansi-green-fg"> 131</span> headers = result_raw.headers
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-green-fg"> 377</span> extra_headers[RAW_RESPONSE_HEADER] = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">true</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 379</span> kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">extra_headers</span><span class="ansi-yellow-fg">"</span>] = extra_headers
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">381</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> cast(LegacyAPIResponse[R], <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> func(*args, **kwargs))
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/resources/embeddings.py:245</span>, in <span class="ansi-cyan-fg">AsyncEmbeddings.create</span><span class="ansi-blue-fg">(self, input, model, dimensions, encoding_format, user, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 239</span> embedding.embedding = np.frombuffer( <span style="color: rgb(95,135,135)"># type: ignore[no-untyped-call]</span>
<span class="ansi-green-fg"> 240</span> base64.b64decode(data), dtype=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">float32</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 241</span> ).tolist()
<span class="ansi-green-fg"> 243</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> obj
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">245</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._post(
<span class="ansi-green-fg"> 246</span> <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">/embeddings</span><span class="ansi-yellow-fg">"</span>,
<span class="ansi-green-fg"> 247</span> body=maybe_transform(params, embedding_create_params.EmbeddingCreateParams),
<span class="ansi-green-fg"> 248</span> options=make_request_options(
<span class="ansi-green-fg"> 249</span> extra_headers=extra_headers,
<span class="ansi-green-fg"> 250</span> extra_query=extra_query,
<span class="ansi-green-fg"> 251</span> extra_body=extra_body,
<span class="ansi-green-fg"> 252</span> timeout=timeout,
<span class="ansi-green-fg"> 253</span> post_parser=parser,
<span class="ansi-green-fg"> 254</span> ),
<span class="ansi-green-fg"> 255</span> cast_to=CreateEmbeddingResponse,
<span class="ansi-green-fg"> 256</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/resources/embeddings.py:251</span>, in <span class="ansi-cyan-fg">AsyncEmbeddings.create</span><span class="ansi-blue-fg">(self, input, model, dimensions, encoding_format, user, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 245</span> embedding.embedding = np.frombuffer( <span style="color: rgb(95,135,135)"># type: ignore[no-untyped-call]</span>
<span class="ansi-green-fg"> 246</span> base64.b64decode(data), dtype=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">float32</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 247</span> ).tolist()
<span class="ansi-green-fg"> 249</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> obj
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">251</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._post(
<span class="ansi-green-fg"> 252</span> <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">/embeddings</span><span class="ansi-yellow-fg">"</span>,
<span class="ansi-green-fg"> 253</span> body=maybe_transform(params, embedding_create_params.EmbeddingCreateParams),
<span class="ansi-green-fg"> 254</span> options=make_request_options(
<span class="ansi-green-fg"> 255</span> extra_headers=extra_headers,
<span class="ansi-green-fg"> 256</span> extra_query=extra_query,
<span class="ansi-green-fg"> 257</span> extra_body=extra_body,
<span class="ansi-green-fg"> 258</span> timeout=timeout,
<span class="ansi-green-fg"> 259</span> post_parser=parser,
<span class="ansi-green-fg"> 260</span> ),
<span class="ansi-green-fg"> 261</span> cast_to=CreateEmbeddingResponse,
<span class="ansi-green-fg"> 262</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1784</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1770</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1771</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1772</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1779</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1780</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1781</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1782</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1783</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1784</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1791</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1777</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1778</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1779</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1786</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1787</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1788</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1789</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1790</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1791</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1584</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1581</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1583</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1584</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1586</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1591</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1590</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1591</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1593</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1595</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-red-fg">AuthenticationError</span>: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-proj-********************************************************************************************************************************************************zWYA. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}</pre>
</div>
@ -3805,21 +3805,21 @@ print(result.response)</div>
<span class="ansi-green-fg"> 208</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> response.output.embeddings <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>:
<span class="ansi-green-fg"> 209</span> msg = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">No embeddings found in response</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 142</span> <span class="ansi-bold" style="color: rgb(0,135,0)">try</span>:
<span class="ansi-green-fg"> 143</span> prompt, kwargs = <span style="color: rgb(0,135,0)">self</span>._rewrite_input(prompt, kwargs)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">144</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._decorated_target(prompt, **kwargs)
<span class="ansi-green-fg"> 145</span> <span class="ansi-bold" style="color: rgb(0,135,0)">except</span> <span class="ansi-bold" style="color: rgb(215,95,95)">BaseException</span> <span class="ansi-bold" style="color: rgb(0,135,0)">as</span> e:
<span class="ansi-green-fg"> 146</span> stack_trace = traceback.format_exc()
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-green-fg"> 73</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">with</span> <span style="color: rgb(0,135,0)">self</span>._limiter.use(manifest):
<span class="ansi-green-fg"> 74</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_acquired(manifest)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">75</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **args)
<span class="ansi-green-fg"> 76</span> <span class="ansi-bold" style="color: rgb(0,135,0)">finally</span>:
<span class="ansi-green-fg"> 77</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_released(manifest)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 121</span> <span class="ansi-yellow-fg">"""Target for the decorator chain.</span>
<span class="ansi-green-fg"> 122</span>
<span class="ansi-green-fg"> 123</span> <span class="ansi-yellow-fg">Leave signature alone as prompt, kwargs.</span>
@ -3829,7 +3829,7 @@ print(result.response)</div>
<span class="ansi-green-fg"> 127</span> result = LLMOutput(output=output)
<span class="ansi-green-fg"> 128</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._inject_usage(result)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_embeddings_llm.py:126</span>, in <span class="ansi-cyan-fg">OpenAIEmbeddingsLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_embeddings_llm.py:126</span>, in <span class="ansi-cyan-fg">OpenAIEmbeddingsLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-green-fg"> 121</span> local_model_parameters = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model_parameters</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg"> 122</span> embeddings_parameters = <span style="color: rgb(0,135,0)">self</span>._build_embeddings_parameters(
<span class="ansi-green-fg"> 123</span> local_model_parameters
@ -3841,46 +3841,46 @@ print(result.response)</div>
<span class="ansi-green-fg"> 130</span> result = result_raw.parse()
<span class="ansi-green-fg"> 131</span> headers = result_raw.headers
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-green-fg"> 377</span> extra_headers[RAW_RESPONSE_HEADER] = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">true</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 379</span> kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">extra_headers</span><span class="ansi-yellow-fg">"</span>] = extra_headers
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">381</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> cast(LegacyAPIResponse[R], <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> func(*args, **kwargs))
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/resources/embeddings.py:245</span>, in <span class="ansi-cyan-fg">AsyncEmbeddings.create</span><span class="ansi-blue-fg">(self, input, model, dimensions, encoding_format, user, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 239</span> embedding.embedding = np.frombuffer( <span style="color: rgb(95,135,135)"># type: ignore[no-untyped-call]</span>
<span class="ansi-green-fg"> 240</span> base64.b64decode(data), dtype=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">float32</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 241</span> ).tolist()
<span class="ansi-green-fg"> 243</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> obj
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">245</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._post(
<span class="ansi-green-fg"> 246</span> <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">/embeddings</span><span class="ansi-yellow-fg">"</span>,
<span class="ansi-green-fg"> 247</span> body=maybe_transform(params, embedding_create_params.EmbeddingCreateParams),
<span class="ansi-green-fg"> 248</span> options=make_request_options(
<span class="ansi-green-fg"> 249</span> extra_headers=extra_headers,
<span class="ansi-green-fg"> 250</span> extra_query=extra_query,
<span class="ansi-green-fg"> 251</span> extra_body=extra_body,
<span class="ansi-green-fg"> 252</span> timeout=timeout,
<span class="ansi-green-fg"> 253</span> post_parser=parser,
<span class="ansi-green-fg"> 254</span> ),
<span class="ansi-green-fg"> 255</span> cast_to=CreateEmbeddingResponse,
<span class="ansi-green-fg"> 256</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/resources/embeddings.py:251</span>, in <span class="ansi-cyan-fg">AsyncEmbeddings.create</span><span class="ansi-blue-fg">(self, input, model, dimensions, encoding_format, user, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 245</span> embedding.embedding = np.frombuffer( <span style="color: rgb(95,135,135)"># type: ignore[no-untyped-call]</span>
<span class="ansi-green-fg"> 246</span> base64.b64decode(data), dtype=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">float32</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 247</span> ).tolist()
<span class="ansi-green-fg"> 249</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> obj
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">251</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._post(
<span class="ansi-green-fg"> 252</span> <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">/embeddings</span><span class="ansi-yellow-fg">"</span>,
<span class="ansi-green-fg"> 253</span> body=maybe_transform(params, embedding_create_params.EmbeddingCreateParams),
<span class="ansi-green-fg"> 254</span> options=make_request_options(
<span class="ansi-green-fg"> 255</span> extra_headers=extra_headers,
<span class="ansi-green-fg"> 256</span> extra_query=extra_query,
<span class="ansi-green-fg"> 257</span> extra_body=extra_body,
<span class="ansi-green-fg"> 258</span> timeout=timeout,
<span class="ansi-green-fg"> 259</span> post_parser=parser,
<span class="ansi-green-fg"> 260</span> ),
<span class="ansi-green-fg"> 261</span> cast_to=CreateEmbeddingResponse,
<span class="ansi-green-fg"> 262</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1784</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1770</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1771</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1772</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1779</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1780</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1781</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1782</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1783</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1784</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1791</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1777</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1778</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1779</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1786</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1787</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1788</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1789</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1790</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1791</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1584</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1581</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1583</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1584</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1586</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1591</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1590</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1591</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1593</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1595</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-red-fg">AuthenticationError</span>: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-proj-********************************************************************************************************************************************************zWYA. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}</pre>
</div>
@ -4365,21 +4365,21 @@ print(candidate_questions.response)</div>
<span class="ansi-green-fg"> 208</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> response.output.embeddings <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>:
<span class="ansi-green-fg"> 209</span> msg = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">No embeddings found in response</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:144</span>, in <span class="ansi-cyan-fg">BaseLLM.__call__</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 142</span> <span class="ansi-bold" style="color: rgb(0,135,0)">try</span>:
<span class="ansi-green-fg"> 143</span> prompt, kwargs = <span style="color: rgb(0,135,0)">self</span>._rewrite_input(prompt, kwargs)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">144</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._decorated_target(prompt, **kwargs)
<span class="ansi-green-fg"> 145</span> <span class="ansi-bold" style="color: rgb(0,135,0)">except</span> <span class="ansi-bold" style="color: rgb(215,95,95)">BaseException</span> <span class="ansi-bold" style="color: rgb(0,135,0)">as</span> e:
<span class="ansi-green-fg"> 146</span> stack_trace = traceback.format_exc()
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/services/rate_limiter.py:75</span>, in <span class="ansi-cyan-fg">RateLimiter.decorate.&lt;locals&gt;.invoke</span><span class="ansi-blue-fg">(prompt, **args)</span>
<span class="ansi-green-fg"> 73</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">with</span> <span style="color: rgb(0,135,0)">self</span>._limiter.use(manifest):
<span class="ansi-green-fg"> 74</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_acquired(manifest)
<span class="ansi-green-fg">---&gt; </span><span class="ansi-green-fg">75</span> result = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> delegate(prompt, **args)
<span class="ansi-green-fg"> 76</span> <span class="ansi-bold" style="color: rgb(0,135,0)">finally</span>:
<span class="ansi-green-fg"> 77</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._events.on_limit_released(manifest)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/base/base_llm.py:126</span>, in <span class="ansi-cyan-fg">BaseLLM._decorator_target</span><span class="ansi-blue-fg">(self, prompt, **kwargs)</span>
<span class="ansi-green-fg"> 121</span> <span class="ansi-yellow-fg">"""Target for the decorator chain.</span>
<span class="ansi-green-fg"> 122</span>
<span class="ansi-green-fg"> 123</span> <span class="ansi-yellow-fg">Leave signature alone as prompt, kwargs.</span>
@ -4389,7 +4389,7 @@ print(candidate_questions.response)</div>
<span class="ansi-green-fg"> 127</span> result = LLMOutput(output=output)
<span class="ansi-green-fg"> 128</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._inject_usage(result)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/fnllm/openai/llm/openai_embeddings_llm.py:126</span>, in <span class="ansi-cyan-fg">OpenAIEmbeddingsLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/fnllm/openai/llm/openai_embeddings_llm.py:126</span>, in <span class="ansi-cyan-fg">OpenAIEmbeddingsLLMImpl._execute_llm</span><span class="ansi-blue-fg">(self, prompt, kwargs)</span>
<span class="ansi-green-fg"> 121</span> local_model_parameters = kwargs.get(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">model_parameters</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg"> 122</span> embeddings_parameters = <span style="color: rgb(0,135,0)">self</span>._build_embeddings_parameters(
<span class="ansi-green-fg"> 123</span> local_model_parameters
@ -4401,46 +4401,46 @@ print(candidate_questions.response)</div>
<span class="ansi-green-fg"> 130</span> result = result_raw.parse()
<span class="ansi-green-fg"> 131</span> headers = result_raw.headers
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_legacy_response.py:381</span>, in <span class="ansi-cyan-fg">async_to_raw_response_wrapper.&lt;locals&gt;.wrapped</span><span class="ansi-blue-fg">(*args, **kwargs)</span>
<span class="ansi-green-fg"> 377</span> extra_headers[RAW_RESPONSE_HEADER] = <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">true</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 379</span> kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">extra_headers</span><span class="ansi-yellow-fg">"</span>] = extra_headers
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">381</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> cast(LegacyAPIResponse[R], <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> func(*args, **kwargs))
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/resources/embeddings.py:245</span>, in <span class="ansi-cyan-fg">AsyncEmbeddings.create</span><span class="ansi-blue-fg">(self, input, model, dimensions, encoding_format, user, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 239</span> embedding.embedding = np.frombuffer( <span style="color: rgb(95,135,135)"># type: ignore[no-untyped-call]</span>
<span class="ansi-green-fg"> 240</span> base64.b64decode(data), dtype=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">float32</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 241</span> ).tolist()
<span class="ansi-green-fg"> 243</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> obj
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">245</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._post(
<span class="ansi-green-fg"> 246</span> <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">/embeddings</span><span class="ansi-yellow-fg">"</span>,
<span class="ansi-green-fg"> 247</span> body=maybe_transform(params, embedding_create_params.EmbeddingCreateParams),
<span class="ansi-green-fg"> 248</span> options=make_request_options(
<span class="ansi-green-fg"> 249</span> extra_headers=extra_headers,
<span class="ansi-green-fg"> 250</span> extra_query=extra_query,
<span class="ansi-green-fg"> 251</span> extra_body=extra_body,
<span class="ansi-green-fg"> 252</span> timeout=timeout,
<span class="ansi-green-fg"> 253</span> post_parser=parser,
<span class="ansi-green-fg"> 254</span> ),
<span class="ansi-green-fg"> 255</span> cast_to=CreateEmbeddingResponse,
<span class="ansi-green-fg"> 256</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/resources/embeddings.py:251</span>, in <span class="ansi-cyan-fg">AsyncEmbeddings.create</span><span class="ansi-blue-fg">(self, input, model, dimensions, encoding_format, user, extra_headers, extra_query, extra_body, timeout)</span>
<span class="ansi-green-fg"> 245</span> embedding.embedding = np.frombuffer( <span style="color: rgb(95,135,135)"># type: ignore[no-untyped-call]</span>
<span class="ansi-green-fg"> 246</span> base64.b64decode(data), dtype=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">float32</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-green-fg"> 247</span> ).tolist()
<span class="ansi-green-fg"> 249</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> obj
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">251</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>._post(
<span class="ansi-green-fg"> 252</span> <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">/embeddings</span><span class="ansi-yellow-fg">"</span>,
<span class="ansi-green-fg"> 253</span> body=maybe_transform(params, embedding_create_params.EmbeddingCreateParams),
<span class="ansi-green-fg"> 254</span> options=make_request_options(
<span class="ansi-green-fg"> 255</span> extra_headers=extra_headers,
<span class="ansi-green-fg"> 256</span> extra_query=extra_query,
<span class="ansi-green-fg"> 257</span> extra_body=extra_body,
<span class="ansi-green-fg"> 258</span> timeout=timeout,
<span class="ansi-green-fg"> 259</span> post_parser=parser,
<span class="ansi-green-fg"> 260</span> ),
<span class="ansi-green-fg"> 261</span> cast_to=CreateEmbeddingResponse,
<span class="ansi-green-fg"> 262</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1784</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1770</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1771</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1772</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1779</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1780</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1781</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1782</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1783</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1784</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1791</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.post</span><span class="ansi-blue-fg">(self, path, cast_to, body, files, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1777</span> <span class="ansi-bold" style="color: rgb(0,135,0)">async</span> <span class="ansi-bold" style="color: rgb(0,135,0)">def</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-blue-fg">post</span>(
<span class="ansi-green-fg"> 1778</span> <span style="color: rgb(0,135,0)">self</span>,
<span class="ansi-green-fg"> 1779</span> path: <span style="color: rgb(0,135,0)">str</span>,
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 1786</span> stream_cls: <span style="color: rgb(0,135,0)">type</span>[_AsyncStreamT] | <span class="ansi-bold" style="color: rgb(0,135,0)">None</span> = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>,
<span class="ansi-green-fg"> 1787</span> ) -&gt; ResponseT | _AsyncStreamT:
<span class="ansi-green-fg"> 1788</span> opts = FinalRequestOptions.construct(
<span class="ansi-green-fg"> 1789</span> method=<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">post</span><span class="ansi-yellow-fg">"</span>, url=path, json_data=body, files=<span class="ansi-bold" style="color: rgb(0,135,0)">await</span> async_to_httpx_files(files), **options
<span class="ansi-green-fg"> 1790</span> )
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1791</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> <span style="color: rgb(0,135,0)">self</span>.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/openai/_base_client.py:1584</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1581</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1583</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1584</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1586</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/openai/_base_client.py:1591</span>, in <span class="ansi-cyan-fg">AsyncAPIClient.request</span><span class="ansi-blue-fg">(self, cast_to, options, stream, stream_cls)</span>
<span class="ansi-green-fg"> 1588</span> <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> err.response.aread()
<span class="ansi-green-fg"> 1590</span> log.debug(<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">Re-raising status error</span><span class="ansi-yellow-fg">"</span>)
<span class="ansi-green-fg">-&gt; </span><span class="ansi-green-fg">1591</span> <span class="ansi-bold" style="color: rgb(0,135,0)">raise</span> <span style="color: rgb(0,135,0)">self</span>._make_status_error_from_response(err.response) <span class="ansi-bold" style="color: rgb(0,135,0)">from</span><span style="color: rgb(188,188,188)"> </span><span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 1593</span> <span class="ansi-bold" style="color: rgb(0,135,0)">break</span>
<span class="ansi-green-fg"> 1595</span> <span class="ansi-bold" style="color: rgb(0,135,0)">assert</span> response <span class="ansi-bold" style="color: rgb(175,0,255)">is</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>, <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">could not resolve response (should never happen)</span><span class="ansi-yellow-fg">"</span>
<span class="ansi-red-fg">AuthenticationError</span>: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-proj-********************************************************************************************************************************************************zWYA. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}</pre>
</div>
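<p>The 401 responses in the traceback above come from the OpenAI embeddings endpoint rejecting the key the notebooks were built with. As a minimal, hedged sketch (the environment variable names and the embedding model are illustrative assumptions, not taken from this commit), a fail-fast key check with the official <code>openai</code> Python client could look like this:</p>
<div class="highlight"><pre><code># Minimal sketch: verify the API key works before running the notebooks.
# Assumes the official `openai` package; env var names and model are illustrative.
import asyncio
import os

from openai import AsyncOpenAI


async def check_key() -> None:
    # Fail fast if no usable key is configured in the environment.
    api_key = os.environ.get("GRAPHRAG_API_KEY") or os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("Set GRAPHRAG_API_KEY or OPENAI_API_KEY before running the notebooks.")

    client = AsyncOpenAI(api_key=api_key)
    # A tiny embeddings request; an invalid key fails here with the same 401 shown above.
    response = await client.embeddings.create(model="text-embedding-3-small", input=["ping"])
    print(f"key accepted, embedding length: {len(response.data[0].embedding)}")


asyncio.run(check_key())
</code></pre></div>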


@ -2432,7 +2432,7 @@ results = await task</div>
<span class="ansi-green-fg"> 6</span> pd.read_parquet(<span class="ansi-yellow-fg">f</span><span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">inputs/</span><span class="ansi-bold" style="color: rgb(175,95,135)">{</span>index<span class="ansi-bold" style="color: rgb(175,95,135)">}</span><span class="ansi-yellow-fg">/community_reports.parquet</span><span class="ansi-yellow-fg">"</span>) <span class="ansi-bold" style="color: rgb(0,135,0)">for</span> index <span class="ansi-bold" style="color: rgb(175,0,255)">in</span> indexes
<span class="ansi-green-fg"> 7</span> ]
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-green-fg"> 666</span> use_nullable_dtypes = <span class="ansi-bold" style="color: rgb(0,135,0)">False</span>
<span class="ansi-green-fg"> 667</span> check_dtype_backend(dtype_backend)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">669</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-yellow-bg">impl</span><span class="ansi-yellow-bg">.</span><span class="ansi-yellow-bg">read</span><span class="ansi-yellow-bg">(</span>
@ -2446,7 +2446,7 @@ results = await task</div>
<span class="ansi-green-fg"> 677</span> <span class="ansi-yellow-bg"> </span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">kwargs</span><span class="ansi-yellow-bg">,</span>
<span class="ansi-green-fg"> 678</span> <span class="ansi-yellow-bg">)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-green-fg"> 256</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> manager == <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">array</span><span class="ansi-yellow-fg">"</span>:
<span class="ansi-green-fg"> 257</span> to_pandas_kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">split_blocks</span><span class="ansi-yellow-fg">"</span>] = <span class="ansi-bold" style="color: rgb(0,135,0)">True</span>
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">258</span> path_or_handle, handles, filesystem = <span class="ansi-yellow-bg">_get_path_or_handle</span><span class="ansi-yellow-bg">(</span>
@ -2462,7 +2462,7 @@ results = await task</div>
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 270</span> **kwargs,
<span class="ansi-green-fg"> 271</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-green-fg"> 131</span> handles = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 132</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> (
<span class="ansi-green-fg"> 133</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> fs
@ -2475,7 +2475,7 @@ results = await task</div>
<span class="ansi-green-fg"> 144</span> fs = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 145</span> path_or_handle = handles.handle
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-green-fg"> 873</span> handle = <span style="color: rgb(0,135,0)">open</span>(
<span class="ansi-green-fg"> 874</span> handle,
<span class="ansi-green-fg"> 875</span> ioargs.mode,
@ -2775,7 +2775,7 @@ results = await task</div>
<span class="ansi-green-fg"> 6</span> pd.read_parquet(<span class="ansi-yellow-fg">f</span><span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">inputs/</span><span class="ansi-bold" style="color: rgb(175,95,135)">{</span>index<span class="ansi-bold" style="color: rgb(175,95,135)">}</span><span class="ansi-yellow-fg">/community_reports.parquet</span><span class="ansi-yellow-fg">"</span>) <span class="ansi-bold" style="color: rgb(0,135,0)">for</span> index <span class="ansi-bold" style="color: rgb(175,0,255)">in</span> indexes
<span class="ansi-green-fg"> 7</span> ]
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-green-fg"> 666</span> use_nullable_dtypes = <span class="ansi-bold" style="color: rgb(0,135,0)">False</span>
<span class="ansi-green-fg"> 667</span> check_dtype_backend(dtype_backend)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">669</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-yellow-bg">impl</span><span class="ansi-yellow-bg">.</span><span class="ansi-yellow-bg">read</span><span class="ansi-yellow-bg">(</span>
@ -2789,7 +2789,7 @@ results = await task</div>
<span class="ansi-green-fg"> 677</span> <span class="ansi-yellow-bg"> </span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">kwargs</span><span class="ansi-yellow-bg">,</span>
<span class="ansi-green-fg"> 678</span> <span class="ansi-yellow-bg">)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-green-fg"> 256</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> manager == <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">array</span><span class="ansi-yellow-fg">"</span>:
<span class="ansi-green-fg"> 257</span> to_pandas_kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">split_blocks</span><span class="ansi-yellow-fg">"</span>] = <span class="ansi-bold" style="color: rgb(0,135,0)">True</span>
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">258</span> path_or_handle, handles, filesystem = <span class="ansi-yellow-bg">_get_path_or_handle</span><span class="ansi-yellow-bg">(</span>
@ -2805,7 +2805,7 @@ results = await task</div>
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 270</span> **kwargs,
<span class="ansi-green-fg"> 271</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-green-fg"> 131</span> handles = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 132</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> (
<span class="ansi-green-fg"> 133</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> fs
@ -2818,7 +2818,7 @@ results = await task</div>
<span class="ansi-green-fg"> 144</span> fs = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 145</span> path_or_handle = handles.handle
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-green-fg"> 873</span> handle = <span style="color: rgb(0,135,0)">open</span>(
<span class="ansi-green-fg"> 874</span> handle,
<span class="ansi-green-fg"> 875</span> ioargs.mode,
@ -3232,7 +3232,7 @@ results = await task</div>
<span class="ansi-green-fg"> 6</span> pd.read_parquet(<span class="ansi-yellow-fg">f</span><span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">inputs/</span><span class="ansi-bold" style="color: rgb(175,95,135)">{</span>index<span class="ansi-bold" style="color: rgb(175,95,135)">}</span><span class="ansi-yellow-fg">/community_reports.parquet</span><span class="ansi-yellow-fg">"</span>) <span class="ansi-bold" style="color: rgb(0,135,0)">for</span> index <span class="ansi-bold" style="color: rgb(175,0,255)">in</span> indexes
<span class="ansi-green-fg"> 7</span> ]
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-green-fg"> 666</span> use_nullable_dtypes = <span class="ansi-bold" style="color: rgb(0,135,0)">False</span>
<span class="ansi-green-fg"> 667</span> check_dtype_backend(dtype_backend)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">669</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-yellow-bg">impl</span><span class="ansi-yellow-bg">.</span><span class="ansi-yellow-bg">read</span><span class="ansi-yellow-bg">(</span>
@ -3246,7 +3246,7 @@ results = await task</div>
<span class="ansi-green-fg"> 677</span> <span class="ansi-yellow-bg"> </span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">kwargs</span><span class="ansi-yellow-bg">,</span>
<span class="ansi-green-fg"> 678</span> <span class="ansi-yellow-bg">)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-green-fg"> 256</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> manager == <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">array</span><span class="ansi-yellow-fg">"</span>:
<span class="ansi-green-fg"> 257</span> to_pandas_kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">split_blocks</span><span class="ansi-yellow-fg">"</span>] = <span class="ansi-bold" style="color: rgb(0,135,0)">True</span>
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">258</span> path_or_handle, handles, filesystem = <span class="ansi-yellow-bg">_get_path_or_handle</span><span class="ansi-yellow-bg">(</span>
@ -3262,7 +3262,7 @@ results = await task</div>
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 270</span> **kwargs,
<span class="ansi-green-fg"> 271</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-green-fg"> 131</span> handles = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 132</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> (
<span class="ansi-green-fg"> 133</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> fs
@ -3275,7 +3275,7 @@ results = await task</div>
<span class="ansi-green-fg"> 144</span> fs = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 145</span> path_or_handle = handles.handle
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-green-fg"> 873</span> handle = <span style="color: rgb(0,135,0)">open</span>(
<span class="ansi-green-fg"> 874</span> handle,
<span class="ansi-green-fg"> 875</span> ioargs.mode,
@ -3585,7 +3585,7 @@ results = await task</div>
<span class="ansi-green-fg"> 9</span> )
<span class="ansi-green-fg"> 10</span> results = <span class="ansi-bold" style="color: rgb(0,135,0)">await</span> task
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:669</span>, in <span class="ansi-cyan-fg">read_parquet</span><span class="ansi-blue-fg">(path, engine, columns, storage_options, use_nullable_dtypes, dtype_backend, filesystem, filters, **kwargs)</span>
<span class="ansi-green-fg"> 666</span> use_nullable_dtypes = <span class="ansi-bold" style="color: rgb(0,135,0)">False</span>
<span class="ansi-green-fg"> 667</span> check_dtype_backend(dtype_backend)
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">669</span> <span class="ansi-bold" style="color: rgb(0,135,0)">return</span> <span class="ansi-yellow-bg">impl</span><span class="ansi-yellow-bg">.</span><span class="ansi-yellow-bg">read</span><span class="ansi-yellow-bg">(</span>
@ -3599,7 +3599,7 @@ results = await task</div>
<span class="ansi-green-fg"> 677</span> <span class="ansi-yellow-bg"> </span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">*</span><span class="ansi-yellow-bg">kwargs</span><span class="ansi-yellow-bg">,</span>
<span class="ansi-green-fg"> 678</span> <span class="ansi-yellow-bg">)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:258</span>, in <span class="ansi-cyan-fg">PyArrowImpl.read</span><span class="ansi-blue-fg">(self, path, columns, filters, use_nullable_dtypes, dtype_backend, storage_options, filesystem, **kwargs)</span>
<span class="ansi-green-fg"> 256</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> manager == <span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">array</span><span class="ansi-yellow-fg">"</span>:
<span class="ansi-green-fg"> 257</span> to_pandas_kwargs[<span class="ansi-yellow-fg">"</span><span class="ansi-yellow-fg">split_blocks</span><span class="ansi-yellow-fg">"</span>] = <span class="ansi-bold" style="color: rgb(0,135,0)">True</span>
<span class="ansi-green-fg">--&gt; </span><span class="ansi-green-fg">258</span> path_or_handle, handles, filesystem = <span class="ansi-yellow-bg">_get_path_or_handle</span><span class="ansi-yellow-bg">(</span>
@ -3615,7 +3615,7 @@ results = await task</div>
<span class="ansi-green-fg"> (...)</span><span class="ansi-green-fg"> 270</span> **kwargs,
<span class="ansi-green-fg"> 271</span> )
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/parquet.py:141</span>, in <span class="ansi-cyan-fg">_get_path_or_handle</span><span class="ansi-blue-fg">(path, fs, storage_options, mode, is_dir)</span>
<span class="ansi-green-fg"> 131</span> handles = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 132</span> <span class="ansi-bold" style="color: rgb(0,135,0)">if</span> (
<span class="ansi-green-fg"> 133</span> <span class="ansi-bold" style="color: rgb(175,0,255)">not</span> fs
@ -3628,7 +3628,7 @@ results = await task</div>
<span class="ansi-green-fg"> 144</span> fs = <span class="ansi-bold" style="color: rgb(0,135,0)">None</span>
<span class="ansi-green-fg"> 145</span> path_or_handle = handles.handle
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/.cache/pypoetry/virtualenvs/graphrag-F2jvqev7-py3.11/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-cyan-fg">File </span><span class="ansi-green-fg">~/work/graphrag/graphrag/.venv/lib/python3.11/site-packages/pandas/io/common.py:882</span>, in <span class="ansi-cyan-fg">get_handle</span><span class="ansi-blue-fg">(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)</span>
<span class="ansi-green-fg"> 873</span> handle = <span style="color: rgb(0,135,0)">open</span>(
<span class="ansi-green-fg"> 874</span> handle,
<span class="ansi-green-fg"> 875</span> ioargs.mode,


@ -1718,8 +1718,7 @@
After you have a config file, you can run the pipeline using the CLI or the Python API.</p>
<h2 id="usage">Usage</h2>
<h3 id="cli">CLI</h3>
<div class="highlight"><pre><span></span><code><a id="__codelineno-0-1" name="__codelineno-0-1" href="#__codelineno-0-1"></a><span class="c1"># Via Poetry</span>
<a id="__codelineno-0-2" name="__codelineno-0-2" href="#__codelineno-0-2"></a>poetry<span class="w"> </span>run<span class="w"> </span>poe<span class="w"> </span>index<span class="w"> </span>--root<span class="w"> </span>&lt;data_root&gt;<span class="w"> </span><span class="c1"># default config mode</span>
<div class="highlight"><pre><span></span><code><a id="__codelineno-0-1" name="__codelineno-0-1" href="#__codelineno-0-1"></a>uv<span class="w"> </span>run<span class="w"> </span>poe<span class="w"> </span>index<span class="w"> </span>--root<span class="w"> </span>&lt;data_root&gt;<span class="w"> </span><span class="c1"># default config mode</span>
</code></pre></div>
<h3 id="python-api">Python API</h3>
<p>Please see the indexing API <a href="https://github.com/microsoft/graphrag/blob/main/graphrag/api/index.py">python file</a> for the recommended method to call directly from Python code.</p>
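<p>For a concrete starting point, a minimal sketch of driving the indexer from Python is below. It assumes that <code>graphrag.config.load_config</code> and <code>graphrag.api.build_index</code> are available in the installed version; the linked <code>index.py</code> remains the authoritative reference for the current signatures.</p>
<div class="highlight"><pre><code># Minimal sketch of calling the indexing API from Python.
# Assumes graphrag.config.load_config and graphrag.api.build_index exist in the
# installed version; check the linked index.py for the authoritative signatures.
import asyncio
from pathlib import Path

import graphrag.api as api
from graphrag.config.load_config import load_config


async def main() -> None:
    # Project root containing settings.yaml (path is illustrative).
    config = load_config(Path("./my_project"))
    results = await api.build_index(config=config)
    for result in results:
        status = "error" if result.errors else "ok"
        print(f"workflow {result.workflow}: {status}")


asyncio.run(main())
</code></pre></div>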

File diff suppressed because one or more lines are too long

Binary file not shown.