
OpenAI API Library for C++ (Unofficial)

This is a community-maintained OpenAI API library for modern C++ that provides access to the OpenAI API from C++ applications. Most of the code in this library follows the OpenAPI Specification and was generated by the OpenAPI Generator.

Dependencies

Build tools and libraries:

  • A C++ compiler (e.g. GCC) and CMake
  • libcurl
  • OpenSSL

Installation

Library installation on Linux:

You can compile and install the library with these commands:

$ git clone https://github.com/svgmaster/OpenAIApi
$ cd OpenAIApi/build
$ cmake ..
$ make -j4
$ sudo make install

Library installation on macOS

You can install dependencies with these commands:

brew install gcc cmake openssl curl

You can then compile and install the library the same way as in the Linux instructions.

Library installation on Windows

Build on Windows with Visual Studio (VS2022)

Prerequisites:

vcpkg install curl:x64-windows
  • Open the folder containing the source code.
  • Select the 'OpenAIApi.sln' solution file.
  • Once Visual Studio opens, select Build > Build Solution.

Usage

To use the library, it needs to be configured with your account's secret key, which is available on the official OpenAI website. We recommend setting it as an environment variable. Here's an example of initializing the library with the API key and organization ID loaded from environment variables and creating a completion:

Important note: Don't expose your secret API key. See here for more details.

When building your solution for Windows, don't forget to add the libraries "Ws2_32.lib", "Wldap32.lib", and "Crypt32.lib" to the linker. See here for details.

In this example (and in the library) we use std::future; for more information, see here.

#include <cstdlib>   // std::getenv
#include <iostream>
#include <string>

#include <OpenAIApi/Client.h>
using namespace OpenAIApi;

int main()
{
    /* get the API key for authentication (getenv returns nullptr if unset) */
    const char* key = std::getenv("OPENAI_API_KEY");
    std::string apiKey = key ? key : "";

    /* optional: specify which organization is used for the API request */
    const char* org = std::getenv("OPENAI_ORG_ID");
    std::string orgId = org ? org : "";

    Client client( apiKey, orgId );

    std::string response = client.createCompletion({
        {"model", "text-davinci-002"},
        {"prompt", "Say this is an example"},
        {"temperature", 0},
        {"max_tokens", 10}
        }).get().Model() // .get() waits on the std::future returned by the call
        ["choices"][0]["text"];

    std::cout << response << std::endl;

    return 0;
}
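To build the example on Linux or macOS, a compile line along these lines should work; note that the library name (-lOpenAIApi) and the exact link flags are assumptions based on the dependency list above, not confirmed by the project:

```shell
# Hypothetical build line: adjust library names/paths to your installation.
g++ -std=c++17 example.cpp -lOpenAIApi -lcurl -lssl -lcrypto -o example

# Run with the key supplied via the environment, as recommended above.
OPENAI_API_KEY="sk-..." ./example
```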

Check out the full API documentation for examples of all the available functions.

See also: Best practices for prompt engineering with the OpenAI API.

Error handling

API requests can return errors due to invalid inputs or other issues. These errors can be handled with a try...catch statement:

try
{
    // use a smart pointer (requires <memory>) so the client is freed
    // even if an exception is thrown
    auto client = std::make_unique<Client>( "OPENAI_API_KEY" );

    std::string response = client->createCompletion({
        {"model", "text-davinci-008"}, // throws: this model does not exist
        {"prompt", "Say this is an example"},
        }).get().Model() // .get() waits on the std::future returned by the call
        ["choices"][0]["text"];

    std::cout << response << std::endl;
}
catch ( const OpenAIException& error ) // catch by const reference
{
    std::cout << error.getMessage() << std::endl;
}

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

See the open issues for a full list of proposed features (and known issues).

ChangeLog

The latest version is v0.3.1. See the ChangeLog for details.

License

Distributed under the MIT License. See LICENSE for more information.
