This project uses Microsoft's Semantic Kernel to interact with the OpenAI API.
- Welcome to SK Playground!
- Companion Article Series
- Project Structure
- Core Features
- Usage Examples
- Input Files
- Function and Plugin Organization
- Jupyter Notebook Environment
- Scripts Usage
- Extras
- Showroom
This repository serves as a companion to a series of articles discussing the integration and utilization of Semantic Kernel. These articles provide deeper insights into the concepts and functionalities demonstrated in this repository.
- Intro to Semantic Kernel - Part One
- Intro to Semantic Kernel - Part Two
- Intro to Semantic Kernel - Part Three
- Intro to Semantic Kernel - Part Four
- Intro to Semantic Kernel - Part Five
- Intro to Semantic Kernel - Addendum
```
.
├── SkPlayground.csproj
├── SkPlayground.sln
├── appsettings.plugins.json
│
├── config
│   └── appsettings.json.example
│
├── desc
│   ├── action_planner
│   │   ├── create_html_doc.txt
│   │   ├── download_document.txt
│   │   └── find_url.txt
│   ├── dotnet_project.txt
│   ├── keycloak_helm_chart.txt
│   ├── keycloak_prod_with_mysql.txt
│   ├── postgresl_helm_chart.txt
│   ├── sequential_planner
│   │   ├── extract_js.txt
│   │   ├── generate_secret_plan.txt
│   │   └── keycloak_plan.txt
│   └── typescript_nestjs_project.txt
│
├── notebooks
│   ├── demo1.ipynb
│   ├── demo1_v1.0.0-beta1.ipynb
│   └── demo_rag.ipynb
│
├── scripts
│   └── parse.sh
│
├── plugins
│   ├── Assistant
│   ├── DevOps
│   ├── Engineering
│   ├── Html
│   ├── Http
│   ├── KeyAndCertGenerator
│   ├── SecretYamlGenerator
│   ├── SecretYamlUpdater
│   └── TextMemoryEx
│
└── webserver
    ├── assets
    ├── binders
    ├── config
    ├── controllers
    ├── dtos
    ├── formatters
    ├── middleware
    └── responses
```
SkPlayground is built with C# and .NET 7, using Microsoft's Semantic Kernel. It is equipped with several plugins providing the following functions:

- `Chat`: A chat functionality capable of interfacing with memory sourced externally from vector or SQL databases.
- `Kubernetes`: Generates YAML files based on user descriptions to complete specific tasks.
- `Helm`: Creates Helm v3 charts as per user specifications.
- `TypeScript`: Generates a README.md with a detailed description and source code for building TypeScript-based Node.js projects.
- `CSharp`: Generates a README.md with a detailed description and source code for building C#-based .NET projects.
- `CreateHtmlDoc`: Generates an HTML file.
- `ExtractJS`: Extracts embedded JavaScript from an HTML document.
- `ExecuteGetAsync`: Executes a GET request.
- `ExecutePostAsync`: Executes a POST request.
- `ExecutePutAsync`: Executes a PUT request.
- `GenerateBase64KeyAndCert`: Creates a base64-encoded private key and certificate.
- `Extract`: Extracts a key or certificate from a base64-encoded string.
- `CreateSecretYaml`: Creates a Kubernetes Secret YAML file.
- `UpdateKubernetesSecretYamlString`: Updates the data section of a Kubernetes Secret YAML.
- `SaveAsync`: Accepts two additional arguments, `description` and `additionalMetadata`.
The program can be executed from the command line using `dotnet run`, along with three arguments: `-i` (or `--input`) for the input file, `-f` (or `--function`) for the plugin function to be executed, and `-m` (or `--method`) to select one of the available program methods listed below. The input file should contain a description of the task, and the function argument should specify which function to run. When no method is selected with the `-m` flag, the `RunDefault` method is used. The VS Code `launch.json` is already set up and contains various examples that showcase the usage of all possible methods.
| Method | Description | Arguments |
|---|---|---|
| `RunWebServer` | Initiates a web server to host a Crypto Assistant Plugin API. | None |
| `RunWithActionPlanner` | Processes the given prompt using an action planner. | `FileInfo file` |
| `RunWithSequentialPlanner` | Processes the given prompt using a sequential planner. | `FileInfo file` |
| `RunWithHooks` | Showcases pre- and post-execution hooks. | None |
| `RunWithHooks2` | Showcases pre- and post-execution hooks, with a different hook configuration. | None |
| `RunWithRag` | Showcases Retrieval-Augmented Generation (RAG). | None |
When using Planners, there's no need to manually set the function.
Here are some examples:
This command reads the description from the `dotnet_project.txt` file and executes the `CSharp` function to generate a README for a C# project:

```sh
dotnet run -- -i ./desc/dotnet_project.txt -f CSharp
```

Assuming there's a `typescript_project.txt` file with the appropriate description, you can generate a README for a TypeScript project as follows:

```sh
dotnet run -- -i ./desc/typescript_project.txt -f TypeScript
```

If you have a description for a Kubernetes setup in a file called `kubernetes_desc.txt`, you can generate the necessary YAML files using the following command:

```sh
dotnet run -- -i ./desc/kubernetes_desc.txt -f Kubernetes
```

Given a description in a file named `keycloak_helm_desc.txt`, you can generate Helm v3 charts for a Keycloak deployment like this:

```sh
dotnet run -- -i ./desc/keycloak_helm_desc.txt -f Helm
```

Ensure that the necessary plugins are correctly placed under the `plugins` directory and that the descriptions in the input files are well formatted to get the desired outputs.
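You can also run one of the planner- or demo-based methods by passing `-m` instead of `-f`. For instance, using one of the plan descriptions bundled under `desc/sequential_planner` (a sketch; see `launch.json` for the full set of examples), a sequential-planner run might look like this:

```sh
dotnet run -- -i ./desc/sequential_planner/keycloak_plan.txt -m RunWithSequentialPlanner
```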
Input files are housed in the desc folder and contain descriptions provided by the user. Here are a few examples:
```
Create an application that takes a string, hashes it with SHA256, and then returns that hash back to the user.
```

```
Deploy latest available version of Keycloak (quarkus-based variant) that meets the following criteria:
- uses an external PostgreSQL instance created by another Helm Chart
- runs in production mode
- uses self-signed certificates
- creates a realm named "test-realm"
```

```
Deploy keycloak (quarkus variant) that uses mysql as its backend.
Keycloak runs in prod mode and is TLS secured with a self-signed certificate.
Use images from bitnami.
```
The functions within Semantic Kernel are organized into subfolders under the `plugins` folder and are identified as either semantic or native functions. A plugin encapsulates one or more functions.

Semantic functions are accompanied by a `config.json` and an `skprompt.txt` file, located in their respective subfolders. The `config.json` file describes the function's input parameters and the descriptive information the AI service requires, while `skprompt.txt` contains the prompt setup.

Native functions, by contrast, are written in C# and have their configuration defined directly in code, so they do not require separate `config.json` and `skprompt.txt` files. Their configuration is expressed through annotations that allow the kernel to understand the function's behavior, and they are typically defined as public methods of a class representing the plugin, in a file named after the plugin (e.g., `HttpPlugin.cs`) within the same plugin directory; see the sketch below.

Both semantic and native functions can reside within the same plugin directory and are loaded into the same plugin namespace by the kernel, provided their names are unique within that namespace.
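To illustrate, here is a minimal sketch of a native function, using a hypothetical `GreetingPlugin` class that is not part of this repository; the real plugins here follow the same annotation pattern (compare the `TextMemoryExPlugin` implementation in the Extras section):

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Hypothetical plugin, shown only to illustrate the annotation style.
public class GreetingPlugin
{
    [SKFunction]
    [Description("Returns a greeting for the given name")]
    public string Greet(
        [Description("The name to greet")] string input)
    {
        return $"Hello, {input}!";
    }
}
```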
The program's behavior can be tailored using the appsettings.plugins.json configuration file. This file is read at the start of the application, allowing you to specify the location of the plugins folder as well as the available plugins.
Here's an example configuration:
```json
{
  "PluginSettings": {
    "Root": "plugins",
    "Plugins": [
      "Assistant",
      "DevOps",
      "Engineering",
      "Html",
      "Http",
      "KeyAndCertGenerator",
      "SecretYamlGenerator",
      "SecretYamlUpdater",
      "TextMemoryEx"
    ]
  }
}
```

In this configuration:
- `Root`: Specifies the location of the plugins folder relative to the project's root directory. By default, it's set to `"plugins"`, but you can change it to reference a different folder.
- `Plugins`: Specifies the available plugins, which are expected to be found within subdirectories of the specified plugins folder.
This setup allows a flexible structure, enabling you to organize your functions and plugins as per your project's requirements. You can change the Root and Plugins settings in the appsettings.plugins.json file to point to different directories or to include different sets of plugins, without needing to modify the program's source code.
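For example, these settings could be read with the standard .NET configuration APIs. The following is a minimal sketch, assuming the Microsoft.Extensions.Configuration.Json and Microsoft.Extensions.Configuration.Binder packages are referenced; the project's actual loading code may differ:

```csharp
using System.IO;
using Microsoft.Extensions.Configuration;

var config = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.plugins.json")
    .Build();

// "plugins" by default, but it may point to any other folder.
string? root = config["PluginSettings:Root"];

// The plugin subdirectories expected under the root folder.
string[]? plugins = config.GetSection("PluginSettings:Plugins").Get<string[]>();
```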
Additionally, there's a configuration file named appsettings.json.example located in the config directory. This file is essential for the correct operation of the application, as it contains configurations for the GPT model, service type, and your API key, among other settings.
Here's an example configuration:
```json
{
  "endpointType": "text-completion",
  "serviceType": "OpenAI",
  "serviceId": "text-davinci-003",
  "deploymentOrModelId": "text-davinci-003",
  "apiKey": "... your OpenAI key ...",
  "orgId": ""
}
```

Before running the program, you'll need to:
1. Rename `appsettings.json.example` to `appsettings.json`.
2. Populate the `appsettings.json` file with the correct data:
   - `endpointType`: Specifies the endpoint type for the GPT model.
   - `serviceType`: Specifies the service type, in this case, `OpenAI`.
   - `serviceId` and `deploymentOrModelId`: Specify the ID of the GPT model.
   - `apiKey`: Your OpenAI API key.
   - `orgId`: (Optional) Your organization ID, if applicable.
Once these steps are completed, the program will be able to read the appsettings.json file and use the specified configurations to interact with the OpenAI GPT model.
Alternatively, instead of placing sensitive information like the API key in the appsettings.json file, you can use the dotnet user-secrets tool to configure these values securely. This way, the API key and other sensitive data won't be exposed in the appsettings.json file. Here's how you can do it:
1. Initialize user secrets for your project:

   ```sh
   dotnet user-secrets init
   ```

2. Set the required secrets:

   ```sh
   dotnet user-secrets set "apiKey" "... your OpenAI key ..."
   ```
Repeat the above command for each configuration setting you'd like to store as a user secret, replacing "apiKey" with the configuration key, and "... your OpenAI key ..." with the value.
Once these steps are completed, the program will be able to read the configuration values from the user secrets and use the specified configurations to interact with the OpenAI GPT model without exposing sensitive data in the appsettings.json file.
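Conceptually, the two configuration sources can be combined when the configuration is built, with user secrets added last so they override values from the file. Here is a minimal sketch, assuming the Microsoft.Extensions.Configuration.UserSecrets package is referenced; the project's actual bootstrap code may differ:

```csharp
using Microsoft.Extensions.Configuration;

var config = new ConfigurationBuilder()
    // Values from the JSON file are read first...
    .AddJsonFile("config/appsettings.json", optional: true)
    // ...and user secrets, added later, take precedence for matching keys.
    // This relies on the <UserSecretsId> that `dotnet user-secrets init` adds to the .csproj.
    .AddUserSecrets<Program>()
    .Build();

string? apiKey = config["apiKey"];
```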
This project incorporates a Jupyter Notebook environment, allowing for an interactive and dynamic approach to executing and testing code snippets in a live, document-based setting. The notebooks directory houses all the Jupyter notebooks related to this project.
1. Installing Jupyter Notebook Environment

   - Ensure you have .NET Interactive installed. Run the following command:

     ```sh
     dotnet tool install --global Microsoft.dotnet-interactive
     ```

   - Install the .NET kernels for Jupyter:

     ```sh
     dotnet interactive jupyter install
     ```

   - Change the current directory to the project root:

     ```sh
     cd path/to/SkPlayground
     ```

2. Launching Jupyter Notebook

   Launch Jupyter Notebook:

   ```sh
   jupyter notebook
   ```

   This will open the Jupyter Notebook interface in your web browser, where you can navigate to the `notebooks` directory and open the notebook of your choice.
In C# Jupyter notebooks, you can load libraries and dependencies using the `#r "nuget:..."` directive. For instance, to load the Microsoft.Extensions.Configuration.Json library, use the following command:

```csharp
#r "nuget:Microsoft.Extensions.Configuration.Json,7.0.0"
```

Inside the `scripts` directory, you will find the script `parse.sh`. This script is designed to process the output generated by the `DevOps` plugin functions `Helm` and `Kubernetes`.
When processing the output of the Helm function, the script generates the Helm chart: it creates the required directories, YAML files, and the other items a chart needs. Once generated, the chart can be installed using the Helm tool with the following command:
```sh
helm install [NAME] [CHART] --namespace [NAMESPACE]
```

- `[NAME]`: A name you choose for this release of the chart.
- `[CHART]`: Path to the directory containing the generated Helm chart.
- `[NAMESPACE]`: The namespace in which to install the chart.
On the other hand, when processing the output of the Kubernetes function, the script creates all the necessary YAML files for Kubernetes resources. Once the YAML files are generated, they can be applied to your Kubernetes cluster using the kubectl command like so:
```sh
kubectl apply -f [FILENAME]
```

- `[FILENAME]`: The path to the generated YAML file or a directory containing the YAML files.
This project facilitates the creation of Helm charts through a straightforward process. Here is a step-by-step walkthrough of generating a Helm chart using the Helm function of the DevOps plugin:
1. Executing the Plugin Function

   Begin by running the desired plugin function using `dotnet run`. In this example, we're using the `Helm` function of the `DevOps` plugin to process a description file (`keycloak_helm_chart.txt`). The OpenAI completion result is redirected to a text file (`output.txt`).

   ```sh
   dotnet run -- -i ./desc/keycloak_helm_chart.txt -f Helm > output.txt
   ```

2. Parsing the Output

   Use the provided bash script (`parse.sh`) to process the output file, generating the necessary files and directory structure for the Helm chart.

   ```sh
   ./scripts/parse.sh -f output.txt -o keycloak
   ```

3. Exploring the Generated Chart

   Navigate to the generated chart directory and use the `tree` command to visualize the created files and directories. The Helm chart is now ready for use and follows Helm's standard directory structure.

   ```sh
   cd keycloak
   tree -L 2
   # Output:
   # .
   # └── keycloak
   #     ├── Chart.yaml
   #     ├── templates
   #     └── values.yaml
   ```
This workflow streamlines the transformation of human-readable descriptions into deployable Helm charts, showcasing how the Semantic Kernel and OpenAI capabilities integrated within this project can automate DevOps tasks.
Additionally, this workflow is equally applicable when utilizing the Kubernetes function of the DevOps plugin. The parsing script (parse.sh) is designed to handle both Helm charts and pure Kubernetes YAML outputs seamlessly. When invoked with completions from the Kubernetes function, the script generates one or more Kubernetes YAML files instead, following the same directory structuring convention, making it a versatile tool for your DevOps automation tasks.
In the original Semantic Kernel project, the `TextMemoryPlugin` provided a method named `SaveAsync` to persist information into the database. However, it only allowed saving a single input argument, leaving the `description` and `additional_metadata` fields in the database unfilled. To address this limitation, a new implementation named `TextMemoryExPlugin` has been introduced. This extended version enhances the `SaveAsync` method to accept and persist two additional arguments: `description` and `additionalMetadata`. Here's a comparison of the old and new JSON database entries illustrating the improvement:
Old JSON database entry:

```json
{
  "is_reference": false,
  "external_source_name": "",
  "id": "6295c180-edc5-453e-93f6-7919924868ba",
  "description": "",
  "text": "I was born in Berlin.",
  "additional_metadata": ""
}
```

New JSON database entry:

```json
{
  "is_reference": false,
  "external_source_name": "",
  "id": "8820aa60-1b2d-411a-8476-85eac498f132",
  "description": "Collections",
  "text": "I have a collection of vintage stamps.",
  "additional_metadata": "Item: Stamps"
}
```

Below is the enhanced implementation of the `SaveAsync` method within the `TextMemoryExPlugin` class:
```csharp
[SKFunction]
[Description("Save information to semantic memory")]
public async Task SaveAsync(
    [Description("The information to save")] string input,
    [Description("Description")] [DefaultValue(null)] string description,
    [Description("Additional Metadata")] [DefaultValue(null)] string additionalMetadata,
    [SKName("collection")] [Description("Memories collection associated with the information to save")] [DefaultValue("generic")] string collection,
    [SKName("key")] [Description("The key associated with the information to save")] string key,
    ILoggerFactory? loggerFactory,
    CancellationToken cancellationToken = default)
{
    if (string.IsNullOrWhiteSpace(collection) || string.IsNullOrWhiteSpace(key))
    {
        throw new Exception("collection and key must not be empty");
    }

    loggerFactory?.CreateLogger(typeof(TextMemoryExPlugin))
        .LogDebug("Saving memory to collection '{0}'", collection);

    await _memory.SaveInformationAsync(collection, input, key, description, additionalMetadata, cancellationToken)
        .ConfigureAwait(continueOnCapturedContext: false);
}
```
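With the plugin imported into the kernel, a call might then look like the following. This is a sketch based on the 1.0.0-beta-era kernel API; the plugin's constructor, the registration name, and the surrounding setup (`kernel`, `memory`) are assumptions:

```csharp
// Import the plugin's native functions into the kernel
// (assumes an existing kernel and an ISemanticTextMemory instance named `memory`).
var functions = kernel.ImportFunctions(new TextMemoryExPlugin(memory), "TextMemoryEx");

// Provide the input text plus the two new arguments.
var variables = new ContextVariables("I have a collection of vintage stamps.");
variables.Set("description", "Collections");
variables.Set("additionalMetadata", "Item: Stamps");
variables.Set("collection", "generic");
variables.Set("key", Guid.NewGuid().ToString());

await kernel.RunAsync(variables, functions["SaveAsync"]);
```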