4 changes: 3 additions & 1 deletion .github/workflows/ci.yml
@@ -36,7 +36,7 @@ jobs:
run: ./scripts/lint

build:
if: github.repository == 'stainless-sdks/openai-python' && (github.event_name == 'push' || github.event.pull_request.head.repo.fork)
if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
timeout-minutes: 10
name: build
permissions:
@@ -61,12 +61,14 @@
run: rye build

- name: Get GitHub OIDC Token
if: github.repository == 'stainless-sdks/openai-python'
id: github-oidc
uses: actions/github-script@v6
with:
script: core.setOutput('github_token', await core.getIDToken());

- name: Upload tarball
if: github.repository == 'stainless-sdks/openai-python'
env:
URL: https://pkg.stainless.com/s
AUTH: ${{ steps.github-oidc.outputs.github_token }}
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "1.100.3"
".": "1.101.0"
}
8 changes: 4 additions & 4 deletions .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 111
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-7ef7a457c3bf05364e66e48c9ca34f31bfef1f6c9b7c15b1812346105e0abb16.yml
openapi_spec_hash: a2b1f5d8fbb62175c93b0ebea9f10063
config_hash: 4870312b04f48fd717ea4151053e7fb9
configured_endpoints: 119
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-ddbdf9343316047e8a773c54fb24e4a8d225955e202a1888fde6f9c8898ebf98.yml
openapi_spec_hash: 9802f6dd381558466c897f6e387e06ca
config_hash: fe0ea26680ac2075a6cd66416aefe7db
14 changes: 14 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,19 @@
# Changelog

## 1.101.0 (2025-08-21)

Full Changelog: [v1.100.3...v1.101.0](https://github.com/openai/openai-python/compare/v1.100.3...v1.101.0)

### Features

* **api:** Add connectors support for MCP tool ([a47f962](https://github.com/openai/openai-python/commit/a47f962daf579c142b8af5579be732772b688a29))
* **api:** adding support for /v1/conversations to the API ([e30bcbc](https://github.com/openai/openai-python/commit/e30bcbc0cb7c827af779bee6971f976261abfb67))


### Chores

* update github action ([7333b28](https://github.com/openai/openai-python/commit/7333b282718a5f6977f30e1a2548207b3a089bd4))

## 1.100.3 (2025-08-20)

Full Changelog: [v1.100.2...v1.100.3](https://github.com/openai/openai-python/compare/v1.100.2...v1.100.3)
49 changes: 49 additions & 0 deletions api.md
@@ -751,6 +751,7 @@ from openai.types.responses import (
ResponseContent,
ResponseContentPartAddedEvent,
ResponseContentPartDoneEvent,
ResponseConversationParam,
ResponseCreatedEvent,
ResponseCustomToolCall,
ResponseCustomToolCallInputDeltaEvent,
@@ -854,6 +855,54 @@ Methods:

- <code title="get /responses/{response_id}/input_items">client.responses.input_items.<a href="./src/openai/resources/responses/input_items.py">list</a>(response_id, \*\*<a href="src/openai/types/responses/input_item_list_params.py">params</a>) -> <a href="./src/openai/types/responses/response_item.py">SyncCursorPage[ResponseItem]</a></code>

# Conversations

Types:

```python
from openai.types.conversations import (
ComputerScreenshotContent,
ContainerFileCitationBody,
Conversation,
ConversationDeleted,
ConversationDeletedResource,
FileCitationBody,
InputFileContent,
InputImageContent,
InputTextContent,
LobProb,
Message,
OutputTextContent,
RefusalContent,
SummaryTextContent,
TextContent,
TopLogProb,
URLCitationBody,
)
```

Methods:

- <code title="post /conversations">client.conversations.<a href="./src/openai/resources/conversations/conversations.py">create</a>(\*\*<a href="src/openai/types/conversations/conversation_create_params.py">params</a>) -> <a href="./src/openai/types/conversations/conversation.py">Conversation</a></code>
- <code title="get /conversations/{conversation_id}">client.conversations.<a href="./src/openai/resources/conversations/conversations.py">retrieve</a>(conversation_id) -> <a href="./src/openai/types/conversations/conversation.py">Conversation</a></code>
- <code title="post /conversations/{conversation_id}">client.conversations.<a href="./src/openai/resources/conversations/conversations.py">update</a>(conversation_id, \*\*<a href="src/openai/types/conversations/conversation_update_params.py">params</a>) -> <a href="./src/openai/types/conversations/conversation.py">Conversation</a></code>
- <code title="delete /conversations/{conversation_id}">client.conversations.<a href="./src/openai/resources/conversations/conversations.py">delete</a>(conversation_id) -> <a href="./src/openai/types/conversations/conversation_deleted_resource.py">ConversationDeletedResource</a></code>

## Items

Types:

```python
from openai.types.conversations import ConversationItem, ConversationItemList
```

Methods:

- <code title="post /conversations/{conversation_id}/items">client.conversations.items.<a href="./src/openai/resources/conversations/items.py">create</a>(conversation_id, \*\*<a href="src/openai/types/conversations/item_create_params.py">params</a>) -> <a href="./src/openai/types/conversations/conversation_item_list.py">ConversationItemList</a></code>
- <code title="get /conversations/{conversation_id}/items/{item_id}">client.conversations.items.<a href="./src/openai/resources/conversations/items.py">retrieve</a>(item_id, \*, conversation_id, \*\*<a href="src/openai/types/conversations/item_retrieve_params.py">params</a>) -> <a href="./src/openai/types/conversations/conversation_item.py">ConversationItem</a></code>
- <code title="get /conversations/{conversation_id}/items">client.conversations.items.<a href="./src/openai/resources/conversations/items.py">list</a>(conversation_id, \*\*<a href="src/openai/types/conversations/item_list_params.py">params</a>) -> <a href="./src/openai/types/conversations/conversation_item.py">SyncConversationCursorPage[ConversationItem]</a></code>
- <code title="delete /conversations/{conversation_id}/items/{item_id}">client.conversations.items.<a href="./src/openai/resources/conversations/items.py">delete</a>(item_id, \*, conversation_id) -> <a href="./src/openai/types/conversations/conversation.py">Conversation</a></code>

# Evals

Types:
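
> The `api.md` additions above document the generated surface for `/v1/conversations`. A minimal usage sketch against those methods, assuming `OPENAI_API_KEY` is set in the environment; the item payload shape is illustrative of the referenced params modules rather than anything shown verbatim here:

```python
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

# Create a conversation (metadata shown as an assumed, optional param).
conversation = client.conversations.create(metadata={"topic": "demo"})

# Add an item; this payload shape is illustrative of the documented
# item_create_params module, not copied from it.
client.conversations.items.create(
    conversation_id=conversation.id,
    items=[{"type": "message", "role": "user", "content": "Hello!"}],
)

# Listing items returns the new SyncConversationCursorPage, which is
# iterable like the SDK's other cursor pages.
for item in client.conversations.items.list(conversation_id=conversation.id):
    print(item.id)

# Delete the conversation when finished.
client.conversations.delete(conversation.id)
```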
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "openai"
version = "1.100.3"
version = "1.101.0"
description = "The official Python library for the openai API"
dynamic = ["readme"]
license = "Apache-2.0"
1 change: 1 addition & 0 deletions src/openai/__init__.py
@@ -386,5 +386,6 @@ def _reset_client() -> None: # type: ignore[reportUnusedFunction]
completions as completions,
fine_tuning as fine_tuning,
moderations as moderations,
conversations as conversations,
vector_stores as vector_stores,
)
38 changes: 38 additions & 0 deletions src/openai/_client.py
@@ -51,6 +51,7 @@
completions,
fine_tuning,
moderations,
conversations,
vector_stores,
)
from .resources.files import Files, AsyncFiles
@@ -69,6 +70,7 @@
from .resources.responses.responses import Responses, AsyncResponses
from .resources.containers.containers import Containers, AsyncContainers
from .resources.fine_tuning.fine_tuning import FineTuning, AsyncFineTuning
from .resources.conversations.conversations import Conversations, AsyncConversations
from .resources.vector_stores.vector_stores import VectorStores, AsyncVectorStores

__all__ = ["Timeout", "Transport", "ProxiesTypes", "RequestOptions", "OpenAI", "AsyncOpenAI", "Client", "AsyncClient"]
@@ -254,6 +256,12 @@ def responses(self) -> Responses:

return Responses(self)

@cached_property
def conversations(self) -> Conversations:
from .resources.conversations import Conversations

return Conversations(self)

@cached_property
def evals(self) -> Evals:
from .resources.evals import Evals
@@ -573,6 +581,12 @@ def responses(self) -> AsyncResponses:

return AsyncResponses(self)

@cached_property
def conversations(self) -> AsyncConversations:
from .resources.conversations import AsyncConversations

return AsyncConversations(self)

@cached_property
def evals(self) -> AsyncEvals:
from .resources.evals import AsyncEvals
@@ -802,6 +816,12 @@ def responses(self) -> responses.ResponsesWithRawResponse:

return ResponsesWithRawResponse(self._client.responses)

@cached_property
def conversations(self) -> conversations.ConversationsWithRawResponse:
from .resources.conversations import ConversationsWithRawResponse

return ConversationsWithRawResponse(self._client.conversations)

@cached_property
def evals(self) -> evals.EvalsWithRawResponse:
from .resources.evals import EvalsWithRawResponse
@@ -905,6 +925,12 @@ def responses(self) -> responses.AsyncResponsesWithRawResponse:

return AsyncResponsesWithRawResponse(self._client.responses)

@cached_property
def conversations(self) -> conversations.AsyncConversationsWithRawResponse:
from .resources.conversations import AsyncConversationsWithRawResponse

return AsyncConversationsWithRawResponse(self._client.conversations)

@cached_property
def evals(self) -> evals.AsyncEvalsWithRawResponse:
from .resources.evals import AsyncEvalsWithRawResponse
@@ -1008,6 +1034,12 @@ def responses(self) -> responses.ResponsesWithStreamingResponse:

return ResponsesWithStreamingResponse(self._client.responses)

@cached_property
def conversations(self) -> conversations.ConversationsWithStreamingResponse:
from .resources.conversations import ConversationsWithStreamingResponse

return ConversationsWithStreamingResponse(self._client.conversations)

@cached_property
def evals(self) -> evals.EvalsWithStreamingResponse:
from .resources.evals import EvalsWithStreamingResponse
@@ -1111,6 +1143,12 @@ def responses(self) -> responses.AsyncResponsesWithStreamingResponse:

return AsyncResponsesWithStreamingResponse(self._client.responses)

@cached_property
def conversations(self) -> conversations.AsyncConversationsWithStreamingResponse:
from .resources.conversations import AsyncConversationsWithStreamingResponse

return AsyncConversationsWithStreamingResponse(self._client.conversations)

@cached_property
def evals(self) -> evals.AsyncEvalsWithStreamingResponse:
from .resources.evals import AsyncEvalsWithStreamingResponse
8 changes: 8 additions & 0 deletions src/openai/_module_client.py
@@ -22,6 +22,7 @@
from .resources.responses.responses import Responses
from .resources.containers.containers import Containers
from .resources.fine_tuning.fine_tuning import FineTuning
from .resources.conversations.conversations import Conversations
from .resources.vector_stores.vector_stores import VectorStores

from . import _load_client
@@ -130,6 +131,12 @@ def __load__(self) -> VectorStores:
return _load_client().vector_stores


class ConversationsProxy(LazyProxy["Conversations"]):
@override
def __load__(self) -> Conversations:
return _load_client().conversations


chat: Chat = ChatProxy().__as_proxied__()
beta: Beta = BetaProxy().__as_proxied__()
files: Files = FilesProxy().__as_proxied__()
@@ -147,3 +154,4 @@ def __load__(self) -> VectorStores:
moderations: Moderations = ModerationsProxy().__as_proxied__()
fine_tuning: FineTuning = FineTuningProxy().__as_proxied__()
vector_stores: VectorStores = VectorStoresProxy().__as_proxied__()
conversations: Conversations = ConversationsProxy().__as_proxied__()
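
> The `_module_client.py` change registers the new resource through the same lazy-proxy pattern the file already uses for `chat`, `files`, and the rest, so the module-level client exposes `openai.conversations` and only instantiates it on first access. A hedged sketch of what that enables (the no-argument `create()` is an assumption that the endpoint accepts an empty body):

```python
import openai

# Module-level usage goes through the proxies above; the default client is
# created lazily by _load_client() and reads OPENAI_API_KEY if api_key is unset.
openai.api_key = "sk-..."  # placeholder key for illustration

conversation = openai.conversations.create()  # assumed to accept an empty body
print(conversation.id)
```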
2 changes: 1 addition & 1 deletion src/openai/_version.py
@@ -1,4 +1,4 @@
# File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

__title__ = "openai"
__version__ = "1.100.3" # x-release-please-version
__version__ = "1.101.0" # x-release-please-version
67 changes: 66 additions & 1 deletion src/openai/pagination.py
@@ -5,7 +5,14 @@

from ._base_client import BasePage, PageInfo, BaseSyncPage, BaseAsyncPage

__all__ = ["SyncPage", "AsyncPage", "SyncCursorPage", "AsyncCursorPage"]
__all__ = [
"SyncPage",
"AsyncPage",
"SyncCursorPage",
"AsyncCursorPage",
"SyncConversationCursorPage",
"AsyncConversationCursorPage",
]

_T = TypeVar("_T")

@@ -123,3 +130,61 @@ def next_page_info(self) -> Optional[PageInfo]:
return None

return PageInfo(params={"after": item.id})


class SyncConversationCursorPage(BaseSyncPage[_T], BasePage[_T], Generic[_T]):
data: List[_T]
has_more: Optional[bool] = None
last_id: Optional[str] = None

@override
def _get_page_items(self) -> List[_T]:
data = self.data
if not data:
return []
return data

@override
def has_next_page(self) -> bool:
has_more = self.has_more
if has_more is not None and has_more is False:
return False

return super().has_next_page()

@override
def next_page_info(self) -> Optional[PageInfo]:
last_id = self.last_id
if not last_id:
return None

return PageInfo(params={"after": last_id})


class AsyncConversationCursorPage(BaseAsyncPage[_T], BasePage[_T], Generic[_T]):
data: List[_T]
has_more: Optional[bool] = None
last_id: Optional[str] = None

@override
def _get_page_items(self) -> List[_T]:
data = self.data
if not data:
return []
return data

@override
def has_next_page(self) -> bool:
has_more = self.has_more
if has_more is not None and has_more is False:
return False

return super().has_next_page()

@override
def next_page_info(self) -> Optional[PageInfo]:
last_id = self.last_id
if not last_id:
return None

return PageInfo(params={"after": last_id})
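
> Compared with `SyncCursorPage`, the conversation variants derive the next cursor from a top-level `last_id` field rather than the `id` of the last element of `data`, and feed it back as the `after` query parameter. A sketch of paging through conversation items by hand, assuming the standard base-page helper `get_next_page()` and an illustrative conversation ID and `limit` value:

```python
from openai import OpenAI

client = OpenAI()

# "conv_123" and limit=20 are illustrative values.
page = client.conversations.items.list("conv_123", limit=20)

while True:
    for item in page.data:
        print(item.id)
    if not page.has_next_page():  # False once has_more is False or last_id is empty
        break
    # next_page_info() produces {"after": page.last_id}; get_next_page()
    # (assumed base-page helper) issues the follow-up request.
    page = page.get_next_page()
```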
33 changes: 33 additions & 0 deletions src/openai/resources/conversations/__init__.py
@@ -0,0 +1,33 @@
# File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

from .items import (
Items,
AsyncItems,
ItemsWithRawResponse,
AsyncItemsWithRawResponse,
ItemsWithStreamingResponse,
AsyncItemsWithStreamingResponse,
)
from .conversations import (
Conversations,
AsyncConversations,
ConversationsWithRawResponse,
AsyncConversationsWithRawResponse,
ConversationsWithStreamingResponse,
AsyncConversationsWithStreamingResponse,
)

__all__ = [
"Items",
"AsyncItems",
"ItemsWithRawResponse",
"AsyncItemsWithRawResponse",
"ItemsWithStreamingResponse",
"AsyncItemsWithStreamingResponse",
"Conversations",
"AsyncConversations",
"ConversationsWithRawResponse",
"AsyncConversationsWithRawResponse",
"ConversationsWithStreamingResponse",
"AsyncConversationsWithStreamingResponse",
]