2 changes: 1 addition & 1 deletion runtime/prompty/README.md
@@ -132,7 +132,7 @@ class SimplePromptyTracer:
json.dump(trace, f, indent=4)
```

The tracing mechanism is supported for all of the prompty runtime internals and can be used to trace the execution of the prompt along with all of the paramters. There is also a `@trace` decorator that can be used to trace the execution of any function external to the runtime. This is provided as a facility to trace the execution of the prompt and whatever supporting code you have.
The tracing mechanism is supported for all of the prompty runtime internals and can be used to trace the execution of the prompt along with all of the parameters. There is also a `@trace` decorator that can be used to trace the execution of any function external to the runtime. This is provided as a facility to trace the execution of the prompt and whatever supporting code you have.

```python
import prompty
2 changes: 1 addition & 1 deletion runtime/promptycs/Prompty.Core/Prompty.cs
@@ -47,7 +47,7 @@ public static async Task<Prompty> LoadAsync(string path, string configuration =
/// Load a prompty file using the provided text content.
/// </summary>
/// <param name="text">Id of the configuration to use.</param>
/// <param name="gloablConfig">Global configuration to use.</param>
/// <param name="globalConfig">Global configuration to use.</param>
/// <param name="path">Optional: File path to the prompty file.</param>
public static Prompty Load(string text, Dictionary<string, object> globalConfig, string? path = null)
{
2 changes: 1 addition & 1 deletion runtime/promptycs/Tests/chat.prompty
@@ -1,6 +1,6 @@
---
name: Contoso Chat Prompt
description: A retail assistent for Contoso Outdoors products retailer.
description: A retail assistant for Contoso Outdoors products retailer.
authors:
- Cassie Breviu
model:
2 changes: 1 addition & 1 deletion runtime/promptyjs/tests/prompts/basic.mustache.prompty
@@ -37,7 +37,7 @@ You are helping {{firstName}} to find answers to their questions.
Use their name to address them in your responses.

# context
Use the follow contex to provide a more personalized response to {{firstName}}:
Use the following context to provide a more personalized response to {{firstName}}:
{{context}}

{{#chat_history}}
@@ -1,7 +1,7 @@
[
{
"role": "system",
"content": "You are an AI assistant who helps people find information. As the assistant, \nyou answer questions briefly, succinctly, and in a personable manner using \nmarkdown and even add some personal flair with appropriate emojis.\n\n# Customer\nYou are helping Seth to find answers to their questions.\nUse their name to address them in your responses.\n\n# context\nUse the follow contex to provide a more personalized response to Seth:\nThe Alpine Explorer Tent boasts a detachable divider for privacy, numerous mesh windows and adjustable vents for ventilation, and a waterproof design. It even has a built-in gear loft for storing your outdoor essentials. In short, it&#39;s a blend of privacy, comfort, and convenience, making it your second home in the heart of nature!"
"content": "You are an AI assistant who helps people find information. As the assistant, \nyou answer questions briefly, succinctly, and in a personable manner using \nmarkdown and even add some personal flair with appropriate emojis.\n\n# Customer\nYou are helping Seth to find answers to their questions.\nUse their name to address them in your responses.\n\n# context\nUse the following context to provide a more personalized response to Seth:\nThe Alpine Explorer Tent boasts a detachable divider for privacy, numerous mesh windows and adjustable vents for ventilation, and a waterproof design. It even has a built-in gear loft for storing your outdoor essentials. In short, it&#39;s a blend of privacy, comfort, and convenience, making it your second home in the heart of nature!"
},
{
"role": "assistant",
2 changes: 1 addition & 1 deletion runtime/promptyjs/tests/prompts/basic.prompty
@@ -31,7 +31,7 @@ You are helping {{firstName}} to find answers to their questions.
Use their name to address them in your responses.

# context
Use the follow contex to provide a more personalized response to {{firstName}}:
Use the following context to provide a more personalized response to {{firstName}}:
{{context}}

user:
2 changes: 1 addition & 1 deletion runtime/promptyjs/tests/prompts/basic.prompty.parsed.json
@@ -1,7 +1,7 @@
[
{
"role": "system",
"content": "You are an AI assistant who helps people find information. As the assistant, \nyou answer questions briefly, succinctly, and in a personable manner using \nmarkdown and even add some personal flair with appropriate emojis.\n\n# Customer\nYou are helping Seth to find answers to their questions.\nUse their name to address them in your responses.\n\n# context\nUse the follow contex to provide a more personalized response to Seth:\nThe Alpine Explorer Tent boasts a detachable divider for privacy, numerous mesh windows and adjustable vents for ventilation, and a waterproof design. It even has a built-in gear loft for storing your outdoor essentials. In short, it&#39;s a blend of privacy, comfort, and convenience, making it your second home in the heart of nature!"
"content": "You are an AI assistant who helps people find information. As the assistant, \nyou answer questions briefly, succinctly, and in a personable manner using \nmarkdown and even add some personal flair with appropriate emojis.\n\n# Customer\nYou are helping Seth to find answers to their questions.\nUse their name to address them in your responses.\n\n# context\nUse the following context to provide a more personalized response to Seth:\nThe Alpine Explorer Tent boasts a detachable divider for privacy, numerous mesh windows and adjustable vents for ventilation, and a waterproof design. It even has a built-in gear loft for storing your outdoor essentials. In short, it&#39;s a blend of privacy, comfort, and convenience, making it your second home in the heart of nature!"
},
{
"role": "user",
2 changes: 1 addition & 1 deletion web/docs/contributing/page.mdx
@@ -14,7 +14,7 @@ tags:
[Prompty](https://github.com/microsoft/prompty) is an open-source project from Microsoft that makes it easy for developers to _create, manage, debug, and evaluate_ LLM prompts for generative AI applications. We welcome contributions from the community that can help make the technology more useful, and usable, by developers from all backgrounds. Before you get started, review this page for contributor guidelines.

## Code Of Conduct
Read the project's [Code of Conduct](https://github.com/microsoft/prompty/blob/main/CODE_OF_CONDUCT.md) and adhere to it. The project is alse governed by the Microsoft Open Source Code of Conduct - [Read their FAQ](https://opensource.microsoft.com/codeofconduct/faq/) to learn why the CoC matters and how you can raise concerns or provide feedback.
Read the project's [Code of Conduct](https://github.com/microsoft/prompty/blob/main/CODE_OF_CONDUCT.md) and adhere to it. The project is also governed by the Microsoft Open Source Code of Conduct - [Read their FAQ](https://opensource.microsoft.com/codeofconduct/faq/) to learn why the CoC matters and how you can raise concerns or provide feedback.

## Providing feedback

2 changes: 1 addition & 1 deletion web/docs/getting-started/concepts/page.mdx
@@ -67,7 +67,7 @@ The [Prompty specification](https://github.com/microsoft/prompty/blob/main/Promp

### 1.2 The Prompty Tooling

The [Prompty Visual Studio Code Extension](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.prompty) helps you create, manage, and execute, your `.prompty` assets - effectively giving you a _playground_ right in your editor, to streamline your prompt engineering workflow and speed up your prototype iterations. We'll get hands-on experience with this in the [Tutorials](/docs/tutorials) section. For now, click to expand the section and get an intutive sense for how this enhances your developer experience.
The [Prompty Visual Studio Code Extension](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.prompty) helps you create, manage, and execute your `.prompty` assets - effectively giving you a _playground_ right in your editor, to streamline your prompt engineering workflow and speed up your prototype iterations. We'll get hands-on experience with this in the [Tutorials](/docs/tutorials) section. For now, click to expand the section and get an intuitive sense for how this enhances your developer experience.

<details>
<summary> **Learn More**: The Prompty Visual Studio Code Extension </summary>
2 changes: 1 addition & 1 deletion web/docs/getting-started/debugging-prompty/page.mdx
@@ -120,7 +120,7 @@ The trace output is divided into three: _load, prepare_ and _run_. Load refers t
![Trace Output](trace-output.png)


From the trace output, you can see the inputs, outputs and metrics such as time to execute the prompt and tokens. This information can be used to debug and fix any issues in your code. For example, we can see output has been truncated and the `Completion Tokens` count is less than 1000, which might not be sufficent for the prompt to generate different outputs. We can increase the `max_tokens` in our Prompty to 1000 to generate more tokens. Once done, run the code again and confirm you get 5 examples of the short message inviting friends to a Game Night.
From the trace output, you can see the inputs, outputs and metrics such as time to execute the prompt and tokens. This information can be used to debug and fix any issues in your code. For example, we can see output has been truncated and the `Completion Tokens` count is less than 1000, which might not be sufficient for the prompt to generate different outputs. We can increase the `max_tokens` in our Prompty to 1000 to generate more tokens. Once done, run the code again and confirm you get 5 examples of the short message inviting friends to a Game Night.

![updated trace output](trace-bug-fixed.png)

2 changes: 1 addition & 1 deletion web/docs/guides/prompty-observability/page.mdx
@@ -68,7 +68,7 @@ class SimplePromptyTracer:
json.dump(trace, f, indent=4)
```

The tracing mechanism is supported for all of the prompty runtime internals and can be used to trace the execution of the prompt along with all of the paramters. There is also a `@trace` decorator that can be used to trace the execution of any function external to the runtime. This is provided as a facility to trace the execution of the prompt and whatever supporting code you have.
The tracing mechanism is supported for all of the prompty runtime internals and can be used to trace the execution of the prompt along with all of the parameters. There is also a `@trace` decorator that can be used to trace the execution of any function external to the runtime. This is provided as a facility to trace the execution of the prompt and whatever supporting code you have.

```python
import prompty
2 changes: 1 addition & 1 deletion web/docs/tutorials/using-langchain/page.mdx
@@ -17,7 +17,7 @@ This guide explains how to use Prompty templates with the [LangChain framework](

## What is LangChain

[LangChain](https://python.langchain.com/docs/introduction/) is a composable framework for building context-aware, reasoning applications pwoered by large language models, and using your organization's data and APIs. It has an [extensive set of integrations](https://python.langchain.com/docs/integrations/providers/) with both plaforms ([like AzureAI](https://python.langchain.com/api_reference/azure_ai/)) and tools (like [Prompty](https://python.langchain.com/docs/integrations/providers/microsoft/)) - allowing you to work seamlessly with a diverse set of model providers and integrations.
[LangChain](https://python.langchain.com/docs/introduction/) is a composable framework for building context-aware, reasoning applications powered by large language models, and using your organization's data and APIs. It has an [extensive set of integrations](https://python.langchain.com/docs/integrations/providers/) with both platforms ([like AzureAI](https://python.langchain.com/api_reference/azure_ai/)) and tools (like [Prompty](https://python.langchain.com/docs/integrations/providers/microsoft/)) - allowing you to work seamlessly with a diverse set of model providers and integrations.

> Explore these resources · [Microsoft Integrations](https://python.langchain.com/docs/integrations/providers/microsoft/) · [`langchain-prompty` docs](https://python.langchain.com/api_reference/prompty/) · [`langchain-prompty` package](https://pypi.org/project/langchain-prompty/) · [`langchain-prompty` source](https://github.com/langchain-ai/langchain/tree/master/libs/partners/prompty)

6 changes: 3 additions & 3 deletions web/docs/tutorials/using-semantic-kernel/page.mdx
@@ -35,13 +35,13 @@ The "Advanced" code example below explains how this can be used to populate rele

## Prerequisites

1. **Install [Microsoft.SemanticKernel.Prompty (Alpha)](https://www.nuget.org/packages/Microsoft.SemanticKernel.Prompty/1.24.1-alpha) package** with this command:
1. **Install [Microsoft.SemanticKernel.Prompty (Beta)](https://www.nuget.org/packages/Microsoft.SemanticKernel.Prompty/1.47.0-beta) package** with this command:

```
dotnet add package Microsoft.SemanticKernel.Prompty --version 1.24.1-alpha
dotnet add package Microsoft.SemanticKernel.Prompty --version 1.47.0-beta
```

1. **Setup Semantic Kernel and configure it**. Follow the [Semantic Kernel](https://learn.microsoft.com/en-us/semantic-kernel/get-started/quick-start-guide?pivots=programming-language-csharp) quickstart for guidance if you are new to this framework.
1. **Set up Semantic Kernel and configure it**. Follow the [Semantic Kernel](https://learn.microsoft.com/semantic-kernel/get-started/quick-start-guide?pivots=programming-language-csharp) quickstart for guidance if you are new to this framework.

---

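The README and observability-guide hunks above describe a `@trace` decorator that records a function's inputs and outputs. As a rough illustration of the idea only — this is a simplified, self-contained stand-in, not the prompty runtime's implementation — such a decorator can be sketched as:

```python
import functools

# Simplified stand-in for a @trace-style decorator: each call's name,
# inputs, and result are appended to an in-memory trace list. The real
# prompty runtime routes this data through registered tracer backends.
TRACES = []

def trace(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        TRACES.append({
            "name": func.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "result": result,
        })
        return result
    return wrapper

@trace
def get_greeting(name: str) -> str:
    # Any function external to the runtime can be traced this way.
    return f"Hello, {name}!"

print(get_greeting("Seth"))  # prints: Hello, Seth!
```

A tracer like the `SimplePromptyTracer` shown in the hunks would then serialize entries such as these to JSON files for inspection.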