Merge pull request #1895 from carlfm01/windows-docs

Add Windows build doc

This commit is contained in: commit 4689a309cb

@@ -57,6 +57,7 @@ See the output of `deepspeech -h` for more information on the use of `deepspeech`

 * [Python 3.6](https://www.python.org/)
 * [Git Large File Storage](https://git-lfs.github.com/)
 * Mac or Linux environment
+* Go to [build README](examples/net_framework/README.md) to start building DeepSpeech for Windows from source.

 ## Getting the code

examples/net_framework/README.md (134 lines, new file)

@@ -0,0 +1,134 @@

# Building DeepSpeech native client for Windows

Now we can build the DeepSpeech native client and run inference on Windows using the C# client. To do that, we need to compile the `native_client`.

**Table of Contents**

- [Prerequisites](#prerequisites)
- [Getting the code](#getting-the-code)
- [Configuring the paths](#configuring-the-paths)
- [Adding environment variables](#adding-environment-variables)
  - [MSYS2 paths](#msys2-paths)
  - [BAZEL path](#bazel-path)
  - [Python path](#python-path)
  - [CUDA paths](#cuda-paths)
- [Building the native_client](#building-the-native_client)
  - [Build for CPU](#cpu)
  - [Build with CUDA support](#gpu-with-cuda)
- [Using the generated library](#using-the-generated-library)

## Prerequisites

* [Python 3.6](https://www.python.org/)
* [Git Large File Storage](https://git-lfs.github.com/)
* [MSYS2 (x86_64)](https://www.msys2.org/)
* [Bazel v0.17.2](https://github.com/bazelbuild/bazel/releases)
* [Windows 10 SDK](https://developer.microsoft.com/en-us/windows/downloads/windows-10-sdk)
* Windows 10
* [Visual Studio 2017 Community](https://visualstudio.microsoft.com/vs/community/)

Inside the Visual Studio Installer, enable `MS Build Tools` and `VC++ 2015.3 v14.00 (v140) toolset for desktop`.

If you want to enable CUDA support, you need to install:

* [CUDA 9.0 and cuDNN 7.3.1](https://developer.nvidia.com/cuda-90-download-archive)

It may compile with other versions, but since we don't extensively test them, we highly recommend sticking to the versions above to avoid compilation errors caused by incompatibilities.

## Getting the code

We need to clone `mozilla/DeepSpeech` and `mozilla/tensorflow`:

```bash
git clone https://github.com/mozilla/DeepSpeech
```

```bash
git clone https://github.com/mozilla/tensorflow
```

## Configuring the paths

We need to create a symbolic link. For this example, let's suppose we cloned into `D:\cloned`, so the structure looks like this:

```
.
├── D:\
│   ├── cloned          # Contains DeepSpeech and tensorflow side by side
│   │   ├── DeepSpeech  # Root of the cloned DeepSpeech
│   │   ├── tensorflow  # Root of the cloned Mozilla's tensorflow
└── ...
```

Adjust the paths to your own directory structure; for the layout above, we use the following command:

```bash
mklink /d "D:\cloned\tensorflow\native_client" "D:\cloned\DeepSpeech\native_client"
```

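Note that `mklink` is a `cmd` built-in and, with default Windows settings, requires an elevated (Administrator) Command Prompt. Afterwards you can verify the link was created; a sketch using the example paths above:

```bash
:: Run in an elevated Command Prompt; the link should be listed as <SYMLINKD>.
dir D:\cloned\tensorflow
```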
## Adding environment variables

After you have installed the requirements, there are a few environment variables we need to add to the system's `PATH` variable.

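As a sketch, appending a directory to `PATH` for the current Command Prompt session looks like this (the MSYS2 location shown is its default install path and is an assumption; to persist the change, use the System Properties > Environment Variables dialog instead):

```bash
:: Appends a directory to PATH for the current cmd session only.
:: C:\msys64\usr\bin is the default MSYS2 install location; adjust if yours differs.
set PATH=%PATH%;C:\msys64\usr\bin
```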
#### MSYS2 paths

For MSYS2 we need to add its `bin` directory. If you installed to the default location, the path to add should look like `C:\msys64\usr\bin`. Now we can run `pacman`:

```bash
pacman -Syu
```

```bash
pacman -Su
```

```bash
pacman -S patch unzip
```

#### BAZEL path

For BAZEL we need to add the path to the executable; make sure you rename the executable to `bazel`.

To check the installed version, you can run:

```bash
bazel version
```

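As a sketch, assuming the release binary was downloaded to `D:\bazel` (both the download location and the file name are assumptions based on Bazel's release naming):

```bash
:: Hypothetical download location; adjust to where you saved the release binary.
cd D:\bazel
:: Rename so the binary can be invoked as `bazel` from the command line.
ren bazel-0.17.2-windows-x86_64.exe bazel.exe
```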
#### Python path

Add your `python.exe` path to the `PATH` variable.

#### CUDA paths

If you run a CUDA-enabled `native_client`, we need to add the following to the `PATH` variable:

```
C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v9.0\bin
```

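A quick sanity check that the CUDA toolkit is reachable from `PATH` (a sketch; `nvcc` ships with the toolkit and should report release 9.0 for the versions recommended above):

```bash
:: If PATH is set correctly, this prints the CUDA compiler version.
nvcc --version
```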
## Building the native_client

There's one last step before building: run [configure.py](https://github.com/mozilla/tensorflow/blob/master/configure.py) inside the cloned `tensorflow` directory.

At this point we are ready to start building the `native_client`. Go to the `tensorflow` directory you cloned; following our example, that is `D:\cloned\tensorflow`.

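The configure step might look like this (a sketch using the example paths above; the script asks interactive questions about your build options):

```bash
cd D:\cloned\tensorflow
:: Answer the interactive prompts; enable CUDA here if you want GPU support.
python configure.py
```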
#### CPU

The command below adds AVX/AVX2 support. Please make sure your CPU supports these instructions before adding the flags; if it does not, remove them.

```bash
bazel build -c opt --copt=/arch:AVX --copt=/arch:AVX2 //native_client:libdeepspeech.so
```

#### GPU with CUDA

If you enabled CUDA during the [configure.py](https://github.com/mozilla/tensorflow/blob/master/configure.py) step, you can now add `--config=cuda` to compile with CUDA support:

```bash
bazel build -c opt --config=cuda --copt=/arch:AVX --copt=/arch:AVX2 //native_client:libdeepspeech.so
```

Be patient; if you enabled AVX/AVX2 and CUDA, the build will take a long time. When it finishes, Bazel prints the path to the generated `libdeepspeech.so`.

## Using the generated library

For now, the generated `libdeepspeech.so` can only be used with the C# clients. Go to [DeepSpeech/examples/net_framework/CSharpExamples/](https://github.com/mozilla/DeepSpeech/tree/master/examples/net_framework/CSharpExamples) in your DeepSpeech directory and open the Visual Studio solution, then build it in Debug or Release mode. Finally, copy `libdeepspeech.so` into the generated `x64/Debug` or `x64/Release` directory.
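As a sketch, assuming Bazel wrote the library to its default `bazel-bin` output tree and Visual Studio produced a Debug build (both paths are assumptions based on the example layout above; adjust them to your setup):

```bash
:: Both paths are hypothetical examples; adjust to your Bazel output and VS build directories.
copy "D:\cloned\tensorflow\bazel-bin\native_client\libdeepspeech.so" "D:\cloned\DeepSpeech\examples\net_framework\CSharpExamples\x64\Debug\"
```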