Merge branch 'microsoft:main' into fix-model-produced-invalid-content
davorrunje authored Aug 28, 2024
2 parents 31e4db9 + 4193cea commit 0e76c51
Showing 81 changed files with 3,059 additions and 806 deletions.
12 changes: 8 additions & 4 deletions CONTRIBUTORS.md
@@ -18,14 +18,18 @@
| Xiaoyun Zhang | [LittleLittleCloud](https://github.com/LittleLittleCloud) | Microsoft | AutoGen.Net, group chat | Yes | [Backlog - AutoGen.Net](https://github.com/microsoft/autogen/issues) - Available most of the time (PST) |
| Yiran Wu | [yiranwu0](https://github.com/yiranwu0) | Penn State University | alt-models, group chat, logging | Yes | |
| Beibin Li | [BeibinLi](https://github.com/BeibinLi) | Microsoft Research | alt-models | Yes | |
| Gagan Bansal | [gagb](https://github.com/gagb) | Microsoft Research | Complex Tasks | | |
| Gagan Bansal | [gagb](https://github.com/gagb) | Microsoft Research | All | | |
| Adam Fourney | [afourney](https://github.com/afourney) | Microsoft Research | Complex Tasks | | |
| Ricky Loynd | [rickyloynd-microsoft](https://github.com/rickyloynd-microsoft) | Microsoft Research | Teachability | | |
| Eric Zhu | [ekzhu](https://github.com/ekzhu) | Microsoft Research | Infra | | |
| Jack Gerrits | [jackgerrits](https://github.com/jackgerrits) | Microsoft Research | Infra | | |
| Eric Zhu | [ekzhu](https://github.com/ekzhu) | Microsoft Research | All, Infra | | |
| Jack Gerrits | [jackgerrits](https://github.com/jackgerrits) | Microsoft Research | All, Infra | | |
| David Luong | [DavidLuong98](https://github.com/DavidLuong98) | Microsoft | AutoGen.Net | | |
| Davor Runje | [davorrunje](https://github.com/davorrunje) | airt.ai | Tool calling, IO | | Available most of the time (Central European Time) |

| Friederike Niedtner | [Friderike](https://www.microsoft.com/en-us/research/people/fniedtner/) | Microsoft Research | PM | | |
| Rafah Hosn | [Rafah](https://www.microsoft.com/en-us/research/people/raaboulh/) | Microsoft Research | PM | | |
| Robin Moeur | [Robin](https://www.linkedin.com/in/rmoeur/) | Microsoft Research | PM | | |
| Jingya Chen | [jingyachen](https://github.com/JingyaChen) | Microsoft | UX Design, AutoGen Studio | | |
| Suff Syed | [suffsyed](https://github.com/suffsyed) | Microsoft | UX Design, AutoGen Studio | | |

## I would like to join this list. How can I help the project?
> We're always looking for new contributors to join our team and help improve the project. For more information, please refer to our [CONTRIBUTING](https://microsoft.github.io/autogen/docs/contributor-guide/contributing) guide.
2 changes: 2 additions & 0 deletions TRANSPARENCY_FAQS.md
@@ -31,6 +31,8 @@ While AutoGen automates LLM workflows, decisions about how to use specific LLM o
- The current version of AutoGen was evaluated on six applications to illustrate its potential for simplifying the development of high-performance multi-agent applications. These applications were selected based on their real-world relevance, problem difficulty, the problem-solving capabilities enabled by AutoGen, and innovative potential.
- These applications involve using AutoGen to solve math problems, answer questions, make decisions in text-world environments, optimize supply chains, etc. For each of these domains, AutoGen was evaluated on success-based metrics (i.e., how often the AutoGen-based implementation solved the task) and, in some cases, on implementation efficiency (e.g., reductions in the developer effort needed to build a solution). More details can be found at: https://aka.ms/AutoGen/TechReport
- The team has conducted tests where a “red” agent attempts to get the default AutoGen assistant to break from its alignment and guardrails. The team has observed that out of 70 attempts to break guardrails, only 1 was successful in producing text that would have been flagged as problematic by Azure OpenAI filters. The team has not observed any evidence that AutoGen (or GPT models as hosted by OpenAI or Azure) can produce novel code exploits or jailbreak prompts, since direct prompts to “be a hacker”, “write exploits”, or “produce a phishing email” are refused by existing filters.
- We also evaluated [a team of AutoGen agents](https://github.com/microsoft/autogen/tree/gaia_multiagent_v01_march_1st/samples/tools/autogenbench/scenarios/GAIA/Templates/Orchestrator) on the [GAIA benchmarks](https://arxiv.org/abs/2311.12983), and achieved [SOTA results](https://huggingface.co/spaces/gaia-benchmark/leaderboard) as of
March 1, 2024.

## What are the limitations of AutoGen? How can users minimize the impact of AutoGen’s limitations when using the system?
AutoGen relies on existing LLMs. Experimenting with AutoGen retains the common limitations of large language models, including:
16 changes: 15 additions & 1 deletion dotnet/AutoGen.sln
@@ -64,7 +64,7 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.Gemini.Sample", "sa
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.AotCompatibility.Tests", "test\AutoGen.AotCompatibility.Tests\AutoGen.AotCompatibility.Tests.csproj", "{6B82F26D-5040-4453-B21B-C8D1F913CE4C}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.OpenAI.V1.Sample", "sample\AutoGen.OpenAI.Sample\AutoGen.OpenAI.V1.Sample.csproj", "{0E635268-351C-4A6B-A28D-593D868C2CA4}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.OpenAI.Sample", "sample\AutoGen.OpenAI.Sample\AutoGen.OpenAI.Sample.csproj", "{0E635268-351C-4A6B-A28D-593D868C2CA4}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.WebAPI.Sample", "sample\AutoGen.WebAPI.Sample\AutoGen.WebAPI.Sample.csproj", "{12079C18-A519-403F-BBFD-200A36A0C083}"
EndProject
Expand All @@ -74,6 +74,10 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.AzureAIInference.Te
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.Tests.Share", "test\AutoGen.Test.Share\AutoGen.Tests.Share.csproj", "{143725E2-206C-4D37-93E4-9EDF699826B2}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.OpenAI", "src\AutoGen.OpenAI\AutoGen.OpenAI.csproj", "{3AF1CBEC-2877-41E9-92AE-3A391B2AA9E8}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.OpenAI.Tests", "test\AutoGen.OpenAI.Tests\AutoGen.OpenAI.Tests.csproj", "{42A8251C-E7B3-47BB-A82E-459952EBE132}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
@@ -212,6 +216,14 @@ Global
{143725E2-206C-4D37-93E4-9EDF699826B2}.Debug|Any CPU.Build.0 = Debug|Any CPU
{143725E2-206C-4D37-93E4-9EDF699826B2}.Release|Any CPU.ActiveCfg = Release|Any CPU
{143725E2-206C-4D37-93E4-9EDF699826B2}.Release|Any CPU.Build.0 = Release|Any CPU
{3AF1CBEC-2877-41E9-92AE-3A391B2AA9E8}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{3AF1CBEC-2877-41E9-92AE-3A391B2AA9E8}.Debug|Any CPU.Build.0 = Debug|Any CPU
{3AF1CBEC-2877-41E9-92AE-3A391B2AA9E8}.Release|Any CPU.ActiveCfg = Release|Any CPU
{3AF1CBEC-2877-41E9-92AE-3A391B2AA9E8}.Release|Any CPU.Build.0 = Release|Any CPU
{42A8251C-E7B3-47BB-A82E-459952EBE132}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{42A8251C-E7B3-47BB-A82E-459952EBE132}.Debug|Any CPU.Build.0 = Debug|Any CPU
{42A8251C-E7B3-47BB-A82E-459952EBE132}.Release|Any CPU.ActiveCfg = Release|Any CPU
{42A8251C-E7B3-47BB-A82E-459952EBE132}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
@@ -250,6 +262,8 @@ Global
{5C45981D-1319-4C25-935C-83D411CB28DF} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{5970868F-831E-418F-89A9-4EC599563E16} = {F823671B-3ECA-4AE6-86DA-25E920D3FE64}
{143725E2-206C-4D37-93E4-9EDF699826B2} = {F823671B-3ECA-4AE6-86DA-25E920D3FE64}
{3AF1CBEC-2877-41E9-92AE-3A391B2AA9E8} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{42A8251C-E7B3-47BB-A82E-459952EBE132} = {F823671B-3ECA-4AE6-86DA-25E920D3FE64}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {93384647-528D-46C8-922C-8DB36A382F0B}
6 changes: 4 additions & 2 deletions dotnet/eng/Version.props
@@ -2,8 +2,9 @@
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<AzureOpenAIVersion>1.0.0-beta.17</AzureOpenAIVersion>
<SemanticKernelVersion>1.15.1</SemanticKernelVersion>
<SemanticKernelExperimentalVersion>1.15.1-alpha</SemanticKernelExperimentalVersion>
<AzureOpenAIV2Version>2.0.0-beta.3</AzureOpenAIV2Version>
<SemanticKernelVersion>1.18.1-rc</SemanticKernelVersion>
<SemanticKernelExperimentalVersion>1.18.1-alpha</SemanticKernelExperimentalVersion>
<SystemCodeDomVersion>5.0.0</SystemCodeDomVersion>
<MicrosoftCodeAnalysisVersion>4.3.0</MicrosoftCodeAnalysisVersion>
<ApprovalTestVersion>6.0.0</ApprovalTestVersion>
@@ -16,6 +17,7 @@
<GoogleCloudAPIPlatformVersion>3.0.0</GoogleCloudAPIPlatformVersion>
<JsonSchemaVersion>4.3.0.2</JsonSchemaVersion>
<AzureAIInferenceVersion>1.0.0-beta.1</AzureAIInferenceVersion>
<OpenAISDKVersion>2.0.0-beta.10</OpenAISDKVersion>
<PowershellSDKVersion>7.4.4</PowershellSDKVersion>
</PropertyGroup>
</Project>
90 changes: 37 additions & 53 deletions dotnet/sample/AutoGen.BasicSamples/CodeSnippet/CreateAnAgent.cs
@@ -3,8 +3,10 @@

using AutoGen;
using AutoGen.Core;
using AutoGen.OpenAI.V1;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using FluentAssertions;
using OpenAI;

public partial class AssistantCodeSnippet
{
@@ -32,23 +34,18 @@ public void CodeSnippet2()
{
#region code_snippet_2
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint
var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
var model = "gpt-4o-mini";

var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
var openAIClient = new OpenAIClient(apiKey);

// create assistant agent
var assistantAgent = new AssistantAgent(
var assistantAgent = new OpenAIChatAgent(
name: "assistant",
systemMessage: "You are an assistant that help user to do some tasks.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[] { llmConfig },
});
chatClient: openAIClient.GetChatClient(model))
.RegisterMessageConnector()
.RegisterPrintMessage();
#endregion code_snippet_2
}

@@ -71,27 +68,21 @@ public async Task CodeSnippet4()
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint

var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
var model = "gpt-4o-mini";
var openAIClient = new OpenAIClient(new System.ClientModel.ApiKeyCredential(apiKey), new OpenAIClientOptions
{
Endpoint = new Uri(endPoint),
});
#region code_snippet_4
var assistantAgent = new AssistantAgent(
var assistantAgent = new OpenAIChatAgent(
chatClient: openAIClient.GetChatClient(model),
name: "assistant",
systemMessage: "You are an assistant that convert user input to upper case.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[]
{
llmConfig
},
FunctionContracts = new[]
{
this.UpperCaseFunctionContract, // The FunctionDefinition object for the UpperCase function
},
});
functions: [
this.UpperCaseFunctionContract.ToChatTool(), // The FunctionDefinition object for the UpperCase function
])
.RegisterMessageConnector()
.RegisterPrintMessage();

var response = await assistantAgent.SendAsync("hello");
response.Should().BeOfType<ToolCallMessage>();
@@ -106,31 +97,24 @@ public async Task CodeSnippet5()
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint

var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
var model = "gpt-4o-mini";
var openAIClient = new OpenAIClient(new System.ClientModel.ApiKeyCredential(apiKey), new OpenAIClientOptions
{
Endpoint = new Uri(endPoint),
});
#region code_snippet_5
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that convert user input to upper case.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[]
{
llmConfig
},
FunctionContracts = new[]
{
this.UpperCaseFunctionContract, // The FunctionDefinition object for the UpperCase function
},
},
functionMap: new Dictionary<string, Func<string, Task<string>>>
var functionCallMiddleware = new FunctionCallMiddleware(
functions: [this.UpperCaseFunctionContract],
functionMap: new Dictionary<string, Func<string, Task<string>>>()
{
{ this.UpperCaseFunctionContract.Name, this.UpperCaseWrapper }, // The wrapper function for the UpperCase function
{ this.UpperCaseFunctionContract.Name, this.UpperCase },
});
var assistantAgent = new OpenAIChatAgent(
name: "assistant",
systemMessage: "You are an assistant that convert user input to upper case.",
chatClient: openAIClient.GetChatClient(model))
.RegisterMessageConnector()
.RegisterStreamingMiddleware(functionCallMiddleware);

var response = await assistantAgent.SendAsync("hello");
response.Should().BeOfType<TextMessage>();
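
Taken together, the changes in this file replace the AssistantAgent/ConversableAgentConfig pattern with OpenAIChatAgent plus middleware. Below is a minimal, self-contained sketch of the resulting pattern, under stated assumptions: it uses the AutoGen.OpenAI and OpenAI packages referenced in this commit, the UpperCase tool is hypothetical, and its UpperCaseFunctionContract is assumed to be produced by AutoGen's [Function] source generator.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using OpenAI;

public partial class UpperCaseAgentSketch
{
    // Hypothetical tool; AutoGen's [Function] source generator is assumed to emit
    // the matching UpperCaseFunctionContract used below.
    [Function]
    public Task<string> UpperCase(string text) => Task.FromResult(text.ToUpperInvariant());

    public async Task RunAsync()
    {
        var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
            ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
        var openAIClient = new OpenAIClient(apiKey);

        // Tool calls are now routed through FunctionCallMiddleware instead of
        // ConversableAgentConfig.FunctionContracts plus a functionMap on the agent.
        var functionCallMiddleware = new FunctionCallMiddleware(
            functions: [this.UpperCaseFunctionContract],
            functionMap: new Dictionary<string, Func<string, Task<string>>>()
            {
                { this.UpperCaseFunctionContract.Name, this.UpperCase },
            });

        var assistantAgent = new OpenAIChatAgent(
            chatClient: openAIClient.GetChatClient("gpt-4o-mini"),
            name: "assistant",
            systemMessage: "You are an assistant that converts user input to upper case.")
            .RegisterMessageConnector()
            .RegisterStreamingMiddleware(functionCallMiddleware)
            .RegisterPrintMessage();

        var reply = await assistantAgent.SendAsync("hello");
    }
}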
@@ -3,7 +3,6 @@

using AutoGen;
using AutoGen.Core;
using AutoGen.OpenAI.V1;
using FluentAssertions;

public partial class FunctionCallCodeSnippet
@@ -4,7 +4,9 @@
#region snippet_GetStartCodeSnippet
using AutoGen;
using AutoGen.Core;
using AutoGen.OpenAI.V1;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using OpenAI;
#endregion snippet_GetStartCodeSnippet

public class GetStartCodeSnippet
@@ -13,16 +15,14 @@ public async Task CodeSnippet1()
{
#region code_snippet_1
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");
var openAIClient = new OpenAIClient(openAIKey);
var model = "gpt-4o-mini";

var assistantAgent = new AssistantAgent(
var assistantAgent = new OpenAIChatAgent(
name: "assistant",
systemMessage: "You are an assistant that help user to do some tasks.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gpt35Config],
})
chatClient: openAIClient.GetChatClient(model))
.RegisterMessageConnector()
.RegisterPrintMessage(); // register a hook to print message nicely to console

// set human input mode to ALWAYS so that user always provide input
@@ -3,7 +3,7 @@

using System.Text.Json;
using AutoGen.Core;
using AutoGen.OpenAI.V1;
using AutoGen.OpenAI;
using FluentAssertions;

namespace AutoGen.BasicSample.CodeSnippet;
31 changes: 15 additions & 16 deletions dotnet/sample/AutoGen.BasicSamples/CodeSnippet/OpenAICodeSnippet.cs
@@ -3,11 +3,12 @@

#region using_statement
using AutoGen.Core;
using AutoGen.OpenAI.V1;
using AutoGen.OpenAI.V1.Extension;
using Azure.AI.OpenAI;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
#endregion using_statement
using FluentAssertions;
using OpenAI;
using OpenAI.Chat;

namespace AutoGen.BasicSample.CodeSnippet;
#region weather_function
@@ -32,40 +33,39 @@ public async Task CreateOpenAIChatAgentAsync()
{
#region create_openai_chat_agent
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var modelId = "gpt-3.5-turbo";
var modelId = "gpt-4o-mini";
var openAIClient = new OpenAIClient(openAIKey);

// create an open ai chat agent
var openAIChatAgent = new OpenAIChatAgent(
openAIClient: openAIClient,
chatClient: openAIClient.GetChatClient(modelId),
name: "assistant",
modelName: modelId,
systemMessage: "You are an assistant that help user to do some tasks.");

// OpenAIChatAgent supports the following message types:
// - IMessage<ChatMessage> where ChatMessage is from OpenAI.Chat

var helloMessage = new ChatRequestUserMessage("Hello");
var helloMessage = new UserChatMessage("Hello");

// Use MessageEnvelope.Create to create an IMessage<ChatMessage>
var chatMessageContent = MessageEnvelope.Create(helloMessage);
var reply = await openAIChatAgent.SendAsync(chatMessageContent);

// The type of reply is MessageEnvelope<ChatResponseMessage> where ChatResponseMessage is from Azure.AI.OpenAI
reply.Should().BeOfType<MessageEnvelope<ChatResponseMessage>>();
// The type of reply is MessageEnvelope<ChatCompletion> where ChatCompletion is from OpenAI.Chat
reply.Should().BeOfType<MessageEnvelope<ChatCompletion>>();

// You can un-envelop the reply to get the ChatCompletion
ChatResponseMessage response = reply.As<MessageEnvelope<ChatResponseMessage>>().Content;
response.Role.Should().Be(ChatRole.Assistant);
ChatCompletion response = reply.As<MessageEnvelope<ChatCompletion>>().Content;
response.Role.Should().Be(ChatMessageRole.Assistant);
#endregion create_openai_chat_agent

#region create_openai_chat_agent_streaming
var streamingReply = openAIChatAgent.GenerateStreamingReplyAsync(new[] { chatMessageContent });

await foreach (var streamingMessage in streamingReply)
{
streamingMessage.Should().BeOfType<MessageEnvelope<StreamingChatCompletionsUpdate>>();
streamingMessage.As<MessageEnvelope<StreamingChatCompletionsUpdate>>().Content.Role.Should().Be(ChatRole.Assistant);
streamingMessage.Should().BeOfType<MessageEnvelope<StreamingChatCompletionUpdate>>();
streamingMessage.As<MessageEnvelope<StreamingChatCompletionUpdate>>().Content.Role.Should().Be(ChatMessageRole.Assistant);
}
#endregion create_openai_chat_agent_streaming

@@ -77,7 +77,7 @@ public async Task CreateOpenAIChatAgentAsync()
// now the agentWithConnector supports more message types
var messages = new IMessage[]
{
MessageEnvelope.Create(new ChatRequestUserMessage("Hello")),
MessageEnvelope.Create(new UserChatMessage("Hello")),
new TextMessage(Role.Assistant, "Hello", from: "user"),
new MultiModalMessage(Role.Assistant,
[
Expand Down Expand Up @@ -106,9 +106,8 @@ public async Task OpenAIChatAgentGetWeatherFunctionCallAsync()

// create an open ai chat agent
var openAIChatAgent = new OpenAIChatAgent(
openAIClient: openAIClient,
chatClient: openAIClient.GetChatClient(modelId),
name: "assistant",
modelName: modelId,
systemMessage: "You are an assistant that help user to do some tasks.")
.RegisterMessageConnector();
