1 change: 1 addition & 0 deletions .github/CODEOWNERS
@@ -0,0 +1 @@
* @edgee-cloud/edgeers
26 changes: 26 additions & 0 deletions .github/workflows/check.yml
@@ -0,0 +1,26 @@
name: Check
on:
push:
branches:
- main
pull_request:

jobs:
Check:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install uv
uses: astral-sh/setup-uv@v4
with:
enable-cache: true
- name: Set up Python
run: uv python install 3.12
- name: Install dependencies
run: uv sync --all-extras
- name: Ruff format check
run: uv run ruff format --check .
- name: Ruff lint
run: uv run ruff check .
- name: Run tests
run: uv run pytest
30 changes: 30 additions & 0 deletions .github/workflows/release.yml
@@ -0,0 +1,30 @@
name: Release to PyPI

on:
push:
tags:
- "v*"

jobs:
release:
runs-on: ubuntu-latest
permissions:
id-token: write # Required for trusted publishing

steps:
- uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.12"

- name: Install build dependencies
run: pip install build

- name: Build package
run: python -m build

- name: Publish to PyPI
uses: pypa/gh-action-pypi-publish@release/v1

71 changes: 68 additions & 3 deletions README.md
@@ -24,7 +24,7 @@ response = edgee.send(
input="What is the capital of France?",
)

print(response.choices[0].message["content"])
print(response.text)
```

### Full Input with Messages
@@ -67,8 +67,40 @@ response = edgee.send(
},
)

if response.choices[0].message.get("tool_calls"):
print(response.choices[0].message["tool_calls"])
if response.tool_calls:
print(response.tool_calls)
```
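When the model requests tool calls, each entry has to be matched to a local function and executed. A minimal dispatch sketch is shown below; it assumes the entries in `response.tool_calls` follow the common OpenAI-compatible shape (a dict with a `"function"` key holding `"name"` and a JSON-encoded `"arguments"` string), and `get_weather` is a hypothetical local tool, not part of the SDK:

```python
import json

# Hypothetical local tool; illustrative only, not part of the SDK.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch_tool_calls(tool_calls: list[dict]) -> list:
    """Run each requested tool and collect its results in order."""
    results = []
    for call in tool_calls:
        fn = call["function"]
        handler = TOOLS[fn["name"]]
        # Arguments arrive as a JSON-encoded string in this shape.
        kwargs = json.loads(fn["arguments"])
        results.append(handler(**kwargs))
    return results
```

With that in place, `dispatch_tool_calls(response.tool_calls)` returns one result per requested call.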

### Streaming

Access chunk properties for streaming:

```python
for chunk in edgee.stream(model="gpt-4o", input="Tell me a story"):
if chunk.text:
print(chunk.text, end="", flush=True)
```

#### Alternative: Using send(stream=True)

```python
for chunk in edgee.send(model="gpt-4o", input="Tell me a story", stream=True):
if chunk.text:
print(chunk.text, end="", flush=True)
```

#### Accessing Full Chunk Data

When you need complete access to the streaming response:

```python
for chunk in edgee.stream(model="gpt-4o", input="Hello"):
if chunk.role:
print(f"Role: {chunk.role}")
if chunk.text:
print(chunk.text, end="", flush=True)
if chunk.finish_reason:
print(f"\nFinish: {chunk.finish_reason}")
```

## Response
@@ -79,6 +111,12 @@ class SendResponse:
choices: list[Choice]
usage: Optional[Usage]

# Convenience properties for easy access
text: str | None # Shortcut for choices[0].message["content"]
message: dict | None # Shortcut for choices[0].message
finish_reason: str | None # Shortcut for choices[0].finish_reason
tool_calls: list | None # Shortcut for choices[0].message["tool_calls"]

@dataclass
class Choice:
index: int
@@ -91,3 +129,30 @@ class Usage:
completion_tokens: int
total_tokens: int
```
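One way such shortcuts can be implemented is as read-only properties that delegate to `choices[0]`, guarding against an empty choices list. The sketch below is illustrative and not the SDK's actual source:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Choice:
    index: int
    message: dict
    finish_reason: Optional[str] = None

@dataclass
class SendResponse:
    choices: list[Choice] = field(default_factory=list)

    @property
    def text(self) -> Optional[str]:
        # Shortcut for choices[0].message["content"]
        return self.choices[0].message.get("content") if self.choices else None

    @property
    def tool_calls(self) -> Optional[list]:
        # Shortcut for choices[0].message["tool_calls"]
        return self.choices[0].message.get("tool_calls") if self.choices else None

    @property
    def finish_reason(self) -> Optional[str]:
        return self.choices[0].finish_reason if self.choices else None
```

Delegating via properties keeps the shortcuts in sync with `choices` instead of duplicating state.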

### Streaming Response

```python
@dataclass
class StreamChunk:
choices: list[StreamChoice]

# Convenience properties for easy access
text: str | None # Shortcut for choices[0].delta.content
role: str | None # Shortcut for choices[0].delta.role
finish_reason: str | None # Shortcut for choices[0].finish_reason

@dataclass
class StreamChoice:
index: int
delta: StreamDelta
finish_reason: str | None

@dataclass
class StreamDelta:
role: str | None # Only present in first chunk
content: str | None
tool_calls: list[dict] | None
```
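Since the role arrives only in the first chunk and content is spread across many deltas, reconstructing the complete assistant message means folding the deltas together. A minimal sketch of that merge, using a simplified `StreamDelta` standing in for the dataclass above:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StreamDelta:
    role: Optional[str] = None
    content: Optional[str] = None

def merge_deltas(deltas: list[StreamDelta]) -> dict:
    """Fold a sequence of stream deltas back into one message dict."""
    message = {"role": None, "content": ""}
    for d in deltas:
        if d.role:  # the role is only present in the first chunk
            message["role"] = d.role
        if d.content:
            message["content"] += d.content
    return message
```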

To learn more about this SDK, please refer to the [dedicated documentation](https://www.edgee.cloud/docs/sdk/python).