@dependabot (dependabot bot) commented on behalf of GitHub on Feb 4, 2026:

Bumps the gha group with 13 updates in the /packages/opentelemetry-instrumentation-llamaindex directory:

| Package | From | To |
| --- | --- | --- |
| [llama-index](https://github.com/run-llama/llama_index) | `0.14.12` | `0.14.13` |
| [ruff](https://github.com/astral-sh/ruff) | `0.14.11` | `0.15.0` |
| [chromadb](https://github.com/chroma-core/chroma) | `0.5.23` | `1.4.1` |
| llama-index-llms-cohere | `0.6.1` | `0.7.1` |
| llama-index-llms-openai | `0.6.13` | `0.6.17` |
| llama-index-postprocessor-cohere-rerank | `0.5.1` | `0.6.0` |
| [onnxruntime](https://github.com/microsoft/onnxruntime) | `1.19.2` | `1.23.2` |
| [openai](https://github.com/openai/openai-python) | `1.109.1` | `2.16.0` |
| [opentelemetry-instrumentation-chromadb](https://github.com/traceloop/openllmetry) | `0.50.1` | `0.52.1` |
| [opentelemetry-instrumentation-cohere](https://github.com/traceloop/openllmetry) | `0.50.1` | `0.52.1` |
| [opentelemetry-instrumentation-openai](https://github.com/traceloop/openllmetry) | `0.50.1` | `0.52.1` |
| [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) | `0.23.8` | `1.3.0` |
| [sqlalchemy](https://github.com/sqlalchemy/sqlalchemy) | `2.0.45` | `2.0.46` |

Updates llama-index from 0.14.12 to 0.14.13

Release notes

Sourced from llama-index's releases.

v0.14.13

Release Notes

[2026-01-21]

llama-index-core [0.14.13]

  • feat: add early_stopping_method parameter to agent workflows (#20389)
  • feat: Add token-based code splitting support to CodeSplitter (#20438)
  • Add RayIngestionPipeline integration for distributed data ingestion (#20443)
  • Added the multi-modal version of the Condensed Conversation & Context… (#20446)
  • Replace ChatMemoryBuffer with Memory (#20458)
  • fix(bug): raise a value error when input is an empty list in mean_agg instead of returning a float (#20466)
  • fix: The classmethod of ReActChatFormatter should use cls instead of the class name (#20475)
  • feat: add configurable empty response message to synthesizers (#20503)

llama-index-embeddings-bedrock [0.7.3]

  • Enable use of ARNs for Bedrock Embedding Models (#20435)

llama-index-embeddings-ollama [0.8.6]

  • Improved Ollama batch embedding (#20447)

llama-index-embeddings-voyageai [0.5.3]

  • Adding voyage-4 models (#20497)

llama-index-ingestion-ray [0.1.0]

  • Add RayIngestionPipeline integration for distributed data ingestion (#20443)

llama-index-llms-anthropic [0.10.6]

  • feat: enhance structured predict methods for anthropic (#20440)
  • fix: preserve input_tokens in Anthropic stream_chat responses (#20512)

llama-index-llms-apertis [0.1.0]

  • Add Apertis LLM integration with example notebook (#20436)

llama-index-llms-bedrock-converse [0.12.4]

  • chore(bedrock-converse): Remove extraneous thinking_delta kwarg from ChatMessage (#20455)

llama-index-llms-gemini [0.6.2]

  • chore: deprecate llama-index-llms-gemini (#20511)

llama-index-llms-openai [0.6.13]

... (truncated)

Changelog

Sourced from llama-index's changelog.

llama-index-core [0.14.13]

  • feat: add early_stopping_method parameter to agent workflows (#20389)
  • feat: Add token-based code splitting support to CodeSplitter (#20438)
  • Add RayIngestionPipeline integration for distributed data ingestion (#20443)
  • Added the multi-modal version of the Condensed Conversation & Context… (#20446)
  • Replace ChatMemoryBuffer with Memory (#20458)
  • fix(bug): raise a value error when input is an empty list in mean_agg instead of returning a float (#20466)
  • fix: The classmethod of ReActChatFormatter should use cls instead of the class name (#20475)
  • feat: add configurable empty response message to synthesizers (#20503)

llama-index-embeddings-bedrock [0.7.3]

  • Enable use of ARNs for Bedrock Embedding Models (#20435)

llama-index-embeddings-ollama [0.8.6]

  • Improved Ollama batch embedding (#20447)

llama-index-embeddings-voyageai [0.5.3]

  • Adding voyage-4 models (#20497)

llama-index-ingestion-ray [0.1.0]

  • Add RayIngestionPipeline integration for distributed data ingestion (#20443)

llama-index-llms-anthropic [0.10.6]

  • feat: enhance structured predict methods for anthropic (#20440)
  • fix: preserve input_tokens in Anthropic stream_chat responses (#20512)

llama-index-llms-apertis [0.1.0]

  • Add Apertis LLM integration with example notebook (#20436)

llama-index-llms-bedrock-converse [0.12.4]

  • chore(bedrock-converse): Remove extraneous thinking_delta kwarg from ChatMessage (#20455)

llama-index-llms-gemini [0.6.2]

  • chore: deprecate llama-index-llms-gemini (#20511)

llama-index-llms-openai [0.6.13]

  • Sanitize OpenAI structured output JSON schema name for generic Pydantic models (#20452)
  • chore: vbump openai (#20482)

llama-index-llms-openrouter [0.4.3]

... (truncated)

Commits

Updates ruff from 0.14.11 to 0.15.0

Release notes

Sourced from ruff's releases.

0.15.0

Release Notes

Released on 2026-02-03.

Check out the blog post for a migration guide and overview of the changes!

Breaking changes

  • Ruff now formats your code according to the 2026 style guide. See the formatter section below or in the blog post for a detailed list of changes.

  • The linter now supports block suppression comments. For example, to suppress N803 for all parameters in this function:

    # ruff: disable[N803]
    def foo(
        legacyArg1,
        legacyArg2,
        legacyArg3,
        legacyArg4,
    ): ...
    # ruff: enable[N803]

    See the documentation for more details.

  • The ruff:alpine Docker image is now based on Alpine 3.23 (up from 3.21).

  • The ruff:debian and ruff:debian-slim Docker images are now based on Debian 13 "Trixie" instead of Debian 12 "Bookworm."

  • Binaries for the ppc64 (64-bit big-endian PowerPC) architecture are no longer included in our releases. It should still be possible to build Ruff manually for this platform, if needed.

  • Ruff now resolves all extended configuration files before falling back on a default Python version.
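
For contrast, a minimal sketch of the long-standing per-line suppression that the new block form generalizes (consult the ruff documentation for the authoritative `noqa` syntax):

```python
# Per-line suppression: N803 is ignored only on this single line,
# whereas the disable/enable block form above covers every line between the markers.
def foo(legacyArg1, legacyArg2):  # noqa: N803
    return (legacyArg1, legacyArg2)
```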

Stabilization

The following rules have been stabilized and are no longer in preview:

... (truncated)

Changelog

Sourced from ruff's changelog.

0.15.0

Released on 2026-02-03.

Check out the blog post for a migration guide and overview of the changes!

Breaking changes

  • Ruff now formats your code according to the 2026 style guide. See the formatter section below or in the blog post for a detailed list of changes.

  • The linter now supports block suppression comments. For example, to suppress N803 for all parameters in this function:

    # ruff: disable[N803]
    def foo(
        legacyArg1,
        legacyArg2,
        legacyArg3,
        legacyArg4,
    ): ...
    # ruff: enable[N803]

    See the documentation for more details.

  • The ruff:alpine Docker image is now based on Alpine 3.23 (up from 3.21).

  • The ruff:debian and ruff:debian-slim Docker images are now based on Debian 13 "Trixie" instead of Debian 12 "Bookworm."

  • Binaries for the ppc64 (64-bit big-endian PowerPC) architecture are no longer included in our releases. It should still be possible to build Ruff manually for this platform, if needed.

  • Ruff now resolves all extended configuration files before falling back on a default Python version.

Stabilization

The following rules have been stabilized and are no longer in preview:

... (truncated)

Commits

Updates chromadb from 0.5.23 to 1.4.1

Release notes

Sourced from chromadb's releases.

1.4.1

Version: 1.4.1 · Git ref: refs/tags/1.4.1 · Build date: 2026-01-14T19:19 · PIP package: chroma-1.4.1.tar.gz · GitHub Container Registry image: :1.4.1 · DockerHub image: :1.4.1

What's Changed

... (truncated)

Commits

Updates llama-index-llms-cohere from 0.6.1 to 0.7.1

Updates llama-index-llms-openai from 0.6.13 to 0.6.17

Updates llama-index-postprocessor-cohere-rerank from 0.5.1 to 0.6.0

Updates onnxruntime from 1.19.2 to 1.23.2

Release notes

Sourced from onnxruntime's releases.

ONNX Runtime v1.23.2

No release notes provided.

ONNX Runtime v1.23.1

What's Changed

  • Fix Attention GQA implementation on CPU (#25966)
  • Address edge GetMemInfo edge cases (#26021)
  • Implement new Python APIs (#25999)
  • MemcpyFromHost and MemcpyToHost support for plugin EPs (#26088)
  • [TRT RTX EP] Fix bug for generating the correct subgraph in GetCapability (#26132)
  • add session_id_ to LogEvaluationStart/Stop, LogSessionCreationStart (#25590)
  • [build] fix WebAssembly build on macOS/arm64 (#25653)
  • [CPU] MoE Kernel (#25958)
  • [CPU] Block-wise QMoE kernel for CPU (#26009)
  • [C#] Implement missing APIs (#26101)
  • Regenerate test model with ONNX IR < 12 (#26149)
  • [CPU] Fix compilation errors because of unused variables (#26147)
  • [EP ABI] Check if nodes specified in GetCapability() have already been assigned (#26156)
  • [QNN EP] Add dynamic option to set HTP performance mode (#26135)

Full Changelog: microsoft/onnxruntime@v1.23.0...v1.23.1

ONNX Runtime v1.23.0

Announcements

  • This release introduces the Execution Provider (EP) Plugin API, a new infrastructure for building plugin-based EPs. (#24887, #25137, #25124, #25147, #25127, #25159, #25191, #2524)

  • This release introduces the ability to dynamically download and install execution providers. The feature is exclusively available in the WinML build and requires Windows 11 version 25H2 or later. To use it, C/C++/C# users should use the builds distributed through the Windows App SDK, and Python users should install the onnxruntime-winml package (to be published soon). Users who can upgrade to the latest Windows 11 are encouraged to use the WinML build to take advantage of this enhancement.

Upcoming Changes

  • The next release will stop providing x86_64 binaries for macOS and iOS operating systems.
  • The next release will increase the minimum supported macOS version from 13.4 to 14.0.
  • The next release will stop providing python 3.10 wheels.

Execution & Core Optimizations

Shutdown logic on Windows is simplified

On Windows, some global objects are now intentionally not destroyed when ONNX Runtime detects that the process is shutting down (#24891). This does not leak memory, since all memory is returned to the operating system when the process ends, and it reduces the chance of crashes on process exit.

AutoEP/Device Management

ONNX Runtime can now automatically discover computing devices and select the best EPs to download and register. The EP downloading feature currently works only on Windows 11 version 25H2 or later.

Execution Provider (EP) Updates

The ROCm EP was removed from the source tree; users are advised to use AMD's MIGraphX or Vitis AI EPs instead. A new EP, Nvidia TensorRT RTX, was added.

... (truncated)

Commits

Updates openai from 1.109.1 to 2.16.0

Release notes

Sourced from openai's releases.

v2.16.0

2.16.0 (2026-01-27)

Full Changelog: v2.15.0...v2.16.0

Features

  • api: api update (b97f9f2)
  • api: api updates (9debcc0)
  • client: add support for binary request streaming (49561d8)

Bug Fixes

  • api: mark assistants as deprecated (0419cbc)

Chores

  • ci: upgrade actions/github-script (5139f13)
  • internal: update actions/checkout version (f276714)

Documentation

  • examples: update Azure Realtime sample to use v1 API (#2829) (3b31981)

v2.15.0

2.15.0 (2026-01-09)

Full Changelog: v2.14.0...v2.15.0

Features

  • api: add new Response completed_at prop (f077752)

Chores

  • internal: codegen related update (e7daba6)

v2.14.0

2.14.0 (2025-12-19)

Full Changelog: v2.13.0...v2.14.0

Features

  • api: slugs for new audio models; make all model params accept strings (e517792)

... (truncated)

Changelog

Sourced from openai's changelog.

2.16.0 (2026-01-27)

Full Changelog: v2.15.0...v2.16.0

Features

  • api: api update (b97f9f2)
  • api: api updates (9debcc0)
  • client: add support for binary request streaming (49561d8)

Bug Fixes

  • api: mark assistants as deprecated (0419cbc)

Chores

  • ci: upgrade actions/github-script (5139f13)
  • internal: update actions/checkout version (f276714)

Documentation

  • examples: update Azure Realtime sample to use v1 API (#2829) (3b31981)

2.15.0 (2026-01-09)

Full Changelog: v2.14.0...v2.15.0

Features

  • api: add new Response completed_at prop (f077752)

Chores

  • internal: codegen related update (e7daba6)

2.14.0 (2025-12-19)

Full Changelog: v2.13.0...v2.14.0

Features

  • api: slugs for new audio models; make all model params accept strings (e517792)

Bug Fixes

... (truncated)

Commits

Updates opentelemetry-instrumentation-chromadb from 0.50.1 to 0.52.1

Release notes

Sourced from opentelemetry-instrumentation-chromadb's releases.

0.52.1

v0.52.1 (2026-02-02)

Fix

  • voyageai: add to commitizen to bump on release (#3660)

[main c8205f5] bump: version 0.52.0 → 0.52.1 64 files changed, 69 insertions(+), 63 deletions(-)

0.52.0

v0.52.0 (2026-02-02)

Feat

  • voyage-ai: add voyage-ai instrumentation (#3653)

Fix

  • openai-agents: apply content tracing flag to content (#3487)
  • traceloop-sdk: Align evals output schema (#3643)

[main 5f597c4] bump: version 0.51.1 → 0.52.0 62 files changed, 72 insertions(+), 61 deletions(-)

0.51.1

v0.51.1 (2026-01-26)

Fix

  • openai-agents: add support for realtime (#3533)

[main ad330e3] bump: version 0.51.0 → 0.51.1 62 files changed, 67 insertions(+), 61 deletions(-)

0.51.0

v0.51.0 (2026-01-20)

Feat

  • google-generativeai: Add metrics support (#3506)

Fix

  • traceloop-sdk: Add csv and json support to experiment (#3537)
  • evals: evals API supports input + config, generate mbt functions (#3534)
  • langchain: correct unknown role in completion spans (#3532)
  • evals: auto generate evals (#3529)
  • tracing: Add association property (#3524)
  • openai-agents: optional import of optional deps (#3488)

... (truncated)

Changelog

Sourced from opentelemetry-instrumentation-chromadb's changelog.

v0.52.1 (2026-02-02)

Fix

  • voyageai: add to commitizen to bump on release (#3660)

v0.52.0 (2026-02-02)

Feat

  • voyage-ai: add voyage-ai instrumentation (#3653)

Fix

  • openai-agents: apply content tracing flag to content (#3487)
  • traceloop-sdk: Align evals output schema (#3643)

v0.51.1 (2026-01-26)

Fix

  • openai-agents: add support for realtime (#3533)

v0.51.0 (2026-01-20)

Feat

  • google-generativeai: Add metrics support (#3506)

Fix

  • traceloop-sdk: Add csv and json support to experiment (#3537)
  • evals: evals API supports input + config, generate mbt functions (#3534)
  • langchain: correct unknown role in completion spans (#3532)
  • evals: auto generate evals (#3529)
  • tracing: Add association property (#3524)
  • openai-agents: optional import of optional deps (#3488)

Commits

  • c8205f5 bump: version 0.52.0 → 0.52.1
  • 3e82f07 fix(voyageai): add to commitizen to bump on release (#3660)
  • 5f597c4 bump: version 0.51.1 → 0.52.0
  • 7ac8c48 feat(voyage-ai): add voyage-ai instrumentation (#3653)
  • 1b75912 chore(dependencies): bump mcp, agno, python-multipart, protobuf, pypdf (#3651)
  • f941d0c fix(openai-agents): apply content tracing flag to content (#3487)
  • 19d204f fix(traceloop-sdk): Align evals output schema (#3643)
  • ad330e3 bump: version 0.51.0 → 0.51.1
  • 92a4383 fix(openai-agents): add support for realtime (#3533)
  • 73913ea bump: version 0.50.1 → 0.51.0
  • Additional commits viewable in compare view

Updates opentelemetry-instrumentation-cohere from 0.50.1 to 0.52.1

Release notes

Sourced from opentelemetry-instrumentation-cohere's releases.

0.52.1

v0.52.1 (2026-02-02)

Fix

  • voyageai: add to commitizen to bump on release (#3660)

[main c8205f5] bump: version 0.52.0 → 0.52.1 64 files changed, 69 insertions(+), 63 deletions(-)

0.52.0

v0.52.0 (2026-02-02)

Feat

  • voyage-ai: add voyage-ai instrumentation (#3653)

Fix

  • openai-agents: apply content tracing flag to content (#3487)
  • traceloop-sdk: Align evals output schema (#3643)

[main 5f597c4] bump: version 0.51.1 → 0.52.0 62 files changed, 72 insertions(+), 61 deletions(-)

0.51.1

v0.51.1 (2026-01-26)

Fix

  • openai-agents: add support for realtime (#3533)

[main ad330e3] bump: version 0.51.0 → 0.51.1 62 files changed, 67 insertions(+), 61 deletions(-)

0.51.0

v0.51.0 (2026-01-20)

Feat

  • google-generativeai: Add metrics support (#3506)

Fix

  • traceloop-sdk: Add csv and json support to experiment (#3537)
  • evals: evals API supports input + config, generate mbt functions (#3534)
  • langchain: correct unknown role in completion spans (#3532)
  • evals: auto generate evals (#3529)
  • tracing: Add association property (#3524)
  • openai-agents: optional import of optional deps (#3488)

... (truncated)

Changelog

Sourced from opentelemetry-instrumentation-cohere's changelog.

v0.52.1 (2026-02-02)

Fix

  • voyageai: add to commitizen to bump on release (#3660)

v0.52.0 (2026-02-02)

Feat

  • voyage-ai: add voyage-ai instrumentation (#3653)

Fix

  • openai-agents: apply content tracing flag to content (#3487)
  • traceloop-sdk: Align evals output schema (#3643)

v0.51.1 (2026-01-26)

Fix

  • openai-agents: add support for realtime (#3533)

v0.51.0 (2026-01-20)

Feat

  • google-generativeai: Add metrics support (#3506)

Fix

  • traceloop-sdk: Add csv and json support to experiment (#3537)
  • evals: evals API supports input + config, generate mbt functions (#3534)
  • langchain: correct unknown role in completion spans (#3532)
  • evals: auto generate evals (#3529)
  • tracing: Add association property (#3524)
  • openai-agents: optional import of optional deps (#3488)

Commits

  • c8205f5 bump: version 0.52.0 → 0.52.1
  • 3e82f07 fix(voyageai): add to commitizen to bump on release (#3660)
  • 5f597c4 bump: version 0.51.1 → 0.52.0
  • 7ac8c48 feat(voyage-ai): add voyage-ai instrumentation (#3653)
  • 1b75912 chore(dependencies): bump mcp, agno, python-multipart, protobuf, pypdf (#3651)
  • f941d0c fix(openai-agents): apply content tracing flag to content (#3487)
  • 19d204f fix(traceloop-sdk): Align evals output schema (#3643)
  • ad330e3 bump: version 0.51.0 → 0.51.1
  • 92a4383 fix(openai-agents): add support for realtime (#3533)
  • 73913ea bump: version 0.50.1 → 0.51.0
  • Additional commits viewable in compare view

Updates opentelemetry-instrumentation-openai from 0.50.1 to 0.52.1

Release notes

Sourced from opentelemetry-instrumentation-openai's releases.

0.52.1

v0.52.1 (2026-02-02)

Fix

  • voyageai: add to commitizen to bump on release (#3660)

[main c8205f5] bump: version 0.52.0 → 0.52.1 64 files changed, 69 insertions(+), 63 deletions(-)

0.52.0

v0.52.0 (2026-02-02)

Feat

  • voyage-ai: add voyage-ai instrumentation (#3653)

Fix

  • openai-agents: apply content tracing flag to content (#3487)
  • traceloop-sdk: Align evals output schema (#3643)

[main 5f597c4] bum...

Updates `llama-index` from 0.14.12 to 0.14.13
- [Release notes](https://github.com/run-llama/llama_index/releases)
- [Changelog](https://github.com/run-llama/llama_index/blob/main/CHANGELOG.md)
- [Commits](run-llama/llama_index@v0.14.12...v0.14.13)

Updates `ruff` from 0.14.11 to 0.15.0
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](astral-sh/ruff@0.14.11...0.15.0)

Updates `chromadb` from 0.5.23 to 1.4.1
- [Release notes](https://github.com/chroma-core/chroma/releases)
- [Changelog](https://github.com/chroma-core/chroma/blob/main/RELEASE_PROCESS.md)
- [Commits](chroma-core/chroma@0.5.23...1.4.1)

Updates `llama-index-llms-cohere` from 0.6.1 to 0.7.1

Updates `llama-index-llms-openai` from 0.6.13 to 0.6.17

Updates `llama-index-postprocessor-cohere-rerank` from 0.5.1 to 0.6.0

Updates `onnxruntime` from 1.19.2 to 1.23.2
- [Release notes](https://github.com/microsoft/onnxruntime/releases)
- [Changelog](https://github.com/microsoft/onnxruntime/blob/main/docs/ReleaseManagement.md)
- [Commits](microsoft/onnxruntime@v1.19.2...v1.23.2)

Updates `openai` from 1.109.1 to 2.16.0
- [Release notes](https://github.com/openai/openai-python/releases)
- [Changelog](https://github.com/openai/openai-python/blob/main/CHANGELOG.md)
- [Commits](openai/openai-python@v1.109.1...v2.16.0)
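
Since openai 1.109.1 → 2.16.0 crosses a major version boundary, downstream code may want to fail fast on a version mismatch at startup. A stdlib-only sketch (the `require_major` helper is hypothetical, not part of the openai package):

```python
from importlib.metadata import PackageNotFoundError, version

def require_major(dist: str, minimum: int) -> None:
    """Raise RuntimeError unless the installed distribution's major version >= minimum."""
    try:
        installed = version(dist)
    except PackageNotFoundError:
        raise RuntimeError(f"{dist} is not installed") from None
    if int(installed.split(".")[0]) < minimum:
        raise RuntimeError(f"{dist} {installed} found; this code expects >= {minimum}.0")

# e.g. require_major("openai", 2) before importing modules that rely on 2.x behavior
```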

Updates `opentelemetry-instrumentation-chromadb` from 0.50.1 to 0.52.1
- [Release notes](https://github.com/traceloop/openllmetry/releases)
- [Changelog](https://github.com/traceloop/openllmetry/blob/main/CHANGELOG.md)
- [Commits](0.50.1...0.52.1)

Updates `opentelemetry-instrumentation-cohere` from 0.50.1 to 0.52.1
- [Release notes](https://github.com/traceloop/openllmetry/releases)
- [Changelog](https://github.com/traceloop/openllmetry/blob/main/CHANGELOG.md)
- [Commits](0.50.1...0.52.1)

Updates `opentelemetry-instrumentation-openai` from 0.50.1 to 0.52.1
- [Release notes](https://github.com/traceloop/openllmetry/releases)
- [Changelog](https://github.com/traceloop/openllmetry/blob/main/CHANGELOG.md)
- [Commits](0.50.1...0.52.1)

Updates `pytest-asyncio` from 0.23.8 to 1.3.0
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](pytest-dev/pytest-asyncio@v0.23.8...v1.3.0)
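
pytest-asyncio 0.23.8 → 1.3.0 crosses the 1.0 major release, which removed previously deprecated behavior, so it is worth pinning the mode explicitly. A minimal configuration sketch, assuming pyproject-based pytest settings; option names should be re-checked against the pytest-asyncio 1.x docs:

```toml
[tool.pytest.ini_options]
asyncio_mode = "strict"                          # require explicit @pytest.mark.asyncio markers
asyncio_default_fixture_loop_scope = "function"  # silence the default-loop-scope warning
```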

Updates `sqlalchemy` from 2.0.45 to 2.0.46
- [Release notes](https://github.com/sqlalchemy/sqlalchemy/releases)
- [Changelog](https://github.com/sqlalchemy/sqlalchemy/blob/main/CHANGES.rst)
- [Commits](https://github.com/sqlalchemy/sqlalchemy/commits)

---
updated-dependencies:
- dependency-name: llama-index
  dependency-version: 0.14.13
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: ruff
  dependency-version: 0.15.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: chromadb
  dependency-version: 1.4.1
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: gha
- dependency-name: llama-index-llms-cohere
  dependency-version: 0.7.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: llama-index-llms-openai
  dependency-version: 0.6.17
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: llama-index-postprocessor-cohere-rerank
  dependency-version: 0.6.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: onnxruntime
  dependency-version: 1.23.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: openai
  dependency-version: 2.16.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: gha
- dependency-name: opentelemetry-instrumentation-chromadb
  dependency-version: 0.52.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: opentelemetry-instrumentation-cohere
  dependency-version: 0.52.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: opentelemetry-instrumentation-openai
  dependency-version: 0.52.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: pytest-asyncio
  dependency-version: 1.3.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: gha
- dependency-name: sqlalchemy
  dependency-version: 2.0.46
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
...

Signed-off-by: dependabot[bot] <support@github.com>
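
The `update-type` values in the YAML trailer follow a plain semver comparison of the from/to versions. A hypothetical helper (illustrative only, not Dependabot's implementation) that reproduces the classification for numeric `x.y.z` versions:

```python
def classify_update(old: str, new: str) -> str:
    """Return a Dependabot-style update-type for a numeric x.y.z version bump."""
    o, n = ([int(part) for part in v.split(".")] for v in (old, new))
    if n[0] != o[0]:
        return "version-update:semver-major"
    if n[1] != o[1]:
        return "version-update:semver-minor"
    return "version-update:semver-patch"

# Matches the trailer above:
print(classify_update("0.5.23", "1.4.1"))    # chromadb   -> semver-major
print(classify_update("0.14.11", "0.15.0"))  # ruff       -> semver-minor
print(classify_update("2.0.45", "2.0.46"))   # sqlalchemy -> semver-patch
```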
@dependabot (dependabot bot) added the `dependencies` label (Pull requests that update a dependency file) on Feb 4, 2026.
coderabbitai bot commented Feb 4, 2026

Important: Review skipped (bot user detected).

To trigger a single review, invoke the `@coderabbitai review` command.

You can disable this status message by setting `reviews.review_status` to `false` in the CodeRabbit configuration file.

Comment @coderabbitai help to get the list of available commands and usage tips.
