# Whisper.net

Open-source .NET bindings for OpenAI's Whisper, made possible by whisper.cpp.
## Installation

To install Whisper.net with all available runtimes, run the following command in the Package Manager Console:

```powershell
PM> Install-Package Whisper.net.AllRuntimes
```

Or add a package reference to your `.csproj` file:

```xml
<PackageReference Include="Whisper.net.AllRuntimes" Version="1.7.4" />
```
`Whisper.net` is the main package: it contains the core functionality but does not include any runtimes. `Whisper.net.AllRuntimes` includes all available runtimes for Whisper.net.
To install a specific runtime, you can install the packages individually and combine them as needed. For example, to use the CPU runtime, add the following package references:

```xml
<PackageReference Include="Whisper.net" Version="1.7.4" />
<PackageReference Include="Whisper.net.Runtime" Version="1.7.4" />
```
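Multiple runtime packages can be combined in one project. As an illustrative sketch, a project that should prefer CUDA but still work on machines without an NVIDIA GPU could reference both the CUDA and CPU runtimes (package names as listed in the runtimes section below):

```xml
<!-- Core package plus two runtimes; the loader picks the best available one at run time. -->
<PackageReference Include="Whisper.net" Version="1.7.4" />
<PackageReference Include="Whisper.net.Runtime" Version="1.7.4" />
<PackageReference Include="Whisper.net.Runtime.Cuda" Version="1.7.4" />
</PackageReference>
```

See "Using multiple runtimes" below for the default selection order.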
We also have a custom-built GPT inside ChatGPT, which can answer questions based on this code, previous issues, and releases. Available here.
Please ask it before opening a new question here, as it can often help you faster.
## Runtimes

Whisper.net comes with multiple runtimes to support different platforms and hardware acceleration. The available runtimes are listed below.

### Whisper.net.Runtime

The default runtime, which uses the CPU for inference. It is available on all platforms and does not require any additional dependencies.

Examples:
- Simple usage example
- Simple usage example (without Async processing)
- NAudio integration for mp3
- NAudio integration for resampled wav
- Simple channel diarization
- Blazor example
Prerequisites:

- Windows: Microsoft Visual C++ Redistributable for Visual Studio 2019 or later (x64). Download Link
- Linux: `libstdc++6`, `glibc` 2.31
- macOS: TBD
- For x86/x64 platforms, the CPU must support the AVX, AVX2, FMA, and F16C instructions. If your CPU does not support them, use the `Whisper.net.Runtime.NoAvx` runtime instead.

Supported platforms:
- Windows x86, x64, ARM64
- Linux x64, ARM64, ARM
- macOS x64, ARM64 (Apple Silicon)
- Android
- iOS
- MacCatalyst
- tvOS
- WebAssembly
### Whisper.net.Runtime.NoAvx

For CPUs that do not support AVX instructions.

Prerequisites:

- Windows: Microsoft Visual C++ Redistributable for Visual Studio 2019 or later (x64). Download Link
- Linux: `libstdc++6`, `glibc` 2.31
- macOS: TBD

Supported platforms:

- Windows x86, x64, ARM64
- Linux x64, ARM64, ARM
### Whisper.net.Runtime.Cuda

Contains the native whisper.cpp library with NVIDIA CUDA support enabled.

Prerequisites:

- Everything from the Whisper.net.Runtime prerequisites
- NVIDIA GPU with CUDA support
- CUDA Toolkit (>= 12.1)

Supported platforms:

- Windows x64
- Linux x64
### Whisper.net.Runtime.CoreML

Contains the native whisper.cpp library with Apple CoreML support enabled.

Supported platforms:

- macOS x64, ARM64 (Apple Silicon)
- iOS
- MacCatalyst
- tvOS
### Whisper.net.Runtime.OpenVino

Contains the native whisper.cpp library with Intel OpenVINO support enabled.

Prerequisites:

- Everything from the Whisper.net.Runtime prerequisites
- OpenVINO Toolkit (>= 2024.4)

Supported platforms:

- Windows x64
- Linux x64
### Whisper.net.Runtime.Vulkan

Contains the native whisper.cpp library with Vulkan support enabled.

Prerequisites:

- Everything from the Whisper.net.Runtime prerequisites
- Vulkan SDK (>= 1.3.290.0)

Supported platforms:

- Windows x64
### Using multiple runtimes

You can install and use multiple runtimes in the same project. The runtime is selected automatically, based on the platform the application is running on and the availability of the native runtime.

The following order of priority is used by default:

1. `Whisper.net.Runtime.Cuda` (NVIDIA devices with all drivers installed)
2. `Whisper.net.Runtime.Vulkan` (Windows x64 with Vulkan installed)
3. `Whisper.net.Runtime.CoreML` (Apple devices)
4. `Whisper.net.Runtime.OpenVino` (Intel devices)
5. `Whisper.net.Runtime` (CPU inference)
6. `Whisper.net.Runtime.NoAvx` (CPU inference without AVX support)
To change the order or force a specific runtime, set `RuntimeLibraryOrder` on `RuntimeOptions`:

```csharp
RuntimeOptions.RuntimeLibraryOrder =
[
    RuntimeLibrary.CoreML,
    RuntimeLibrary.OpenVino,
    RuntimeLibrary.Cuda,
    RuntimeLibrary.Cpu
];
```
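For example, to skip GPU runtimes entirely and run CPU-only, a shorter order can be set. This is a sketch using the same API as above; the `RuntimeLibrary.CpuNoAvx` member name is an assumption for the NoAvx runtime, so verify it against your Whisper.net version:

```csharp
// CPU-only inference: try the AVX CPU runtime first, then fall back to NoAvx.
// Note: CpuNoAvx is an assumed enum member name; check your version's RuntimeLibrary enum.
RuntimeOptions.RuntimeLibraryOrder =
[
    RuntimeLibrary.Cpu,
    RuntimeLibrary.CpuNoAvx
];
```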
## Versioning

Each version of Whisper.net is tied to a specific version of whisper.cpp: the major and minor versions match the whisper.cpp release it is based on. For example, Whisper.net 1.2.0 is based on whisper.cpp 1.2.0.
The patch version, however, is not tied to whisper.cpp. For example, Whisper.net 1.2.1 is based on whisper.cpp 1.2.0, and Whisper.net 1.7.0 is based on whisper.cpp 1.7.1.
## Ggml models

Whisper.net uses Ggml models to perform speech recognition and translation. You can find more about Ggml models here.

For easier integration, Whisper.net provides a downloader which uses Hugging Face:

```csharp
var modelName = "ggml-base.bin";
if (!File.Exists(modelName))
{
    using var modelStream = await WhisperGgmlDownloader.GetGgmlModelAsync(GgmlType.Base);
    using var fileWriter = File.OpenWrite(modelName);
    await modelStream.CopyToAsync(fileWriter);
}
```
## Usage

```csharp
using var whisperFactory = WhisperFactory.FromPath("ggml-base.bin");

using var processor = whisperFactory.CreateBuilder()
    .WithLanguage("auto")
    .Build();

using var fileStream = File.OpenRead(wavFileName);

await foreach (var result in processor.ProcessAsync(fileStream))
{
    Console.WriteLine($"{result.Start}->{result.End}: {result.Text}");
}
```
You can find the documentation and code samples here.
## Building the runtime

This section describes how to build the native runtime libraries for Whisper.net. Normally, you do not need to build them yourself, as they are available as NuGet packages.

The build scripts are a combination of PowerShell scripts and a Makefile. You can read each of them for the underlying `cmake` commands being used, or run them directly from the scripts. You can also check the GitHub Actions available here.
### Android

```shell
make android
```

Before running, set the `NDK_PATH` environment variable to the path of your Android NDK. For example:

```shell
NDK_PATH=/Users/UserName/Library/Developer/Xamarin/android-sdk-macosx/ndk-bundle
```
### Apple

```shell
make apple
```

Compiling the Apple libraries requires a Mac with Xcode installed.
### Apple CoreML

```shell
make apple_coreml
```

Compiling the Apple CoreML libraries requires a Mac with Xcode installed.
### Linux

```shell
make linux
```
### Windows

Import the PowerShell module:

```powershell
Import-Module ./windows-scripts.ps1
```

Run `BuildWindowsAll` to build all Windows libraries, or run `BuildWindows` with the desired parameters:

```powershell
BuildWindows -Arch "x64" -Configuration "Release" -NoAvx $true
```
## License

MIT License. See LICENSE for details.