commit 32e9b8cd3e (parent 315a67bf69)
@@ -54,7 +54,7 @@ Quicker inference can be performed using a supported NVIDIA GPU on Linux. See th
 
 Please ensure you have the required `CUDA dependencies <USING.rst#cuda-dependency>`_.
 
-See the output of ``deepspeech -h`` for more information on the use of ``deepspeech``. (If you experience problems running ``deepspeech``\ , please check `required runtime dependencies <native_client/README.md#required-dependencies>`_\ ).
+See the output of ``deepspeech -h`` for more information on the use of ``deepspeech``. (If you experience problems running ``deepspeech``\ , please check `required runtime dependencies <native_client/README.rst#required-dependencies>`_\ ).
 
 ----
 
@@ -54,7 +54,7 @@ You'll also need to install the ``ds_ctcdecoder`` Python package. ``ds_ctcdecode
 
 pip3 install $(python3 util/taskcluster.py --decoder)
 
-This command will download and install the ``ds_ctcdecoder`` package. You can override the platform with ``--arch`` if you want the package for ARM7 (\ ``--arch arm``\ ) or ARM64 (\ ``--arch arm64``\ ). If you prefer building the ``ds_ctcdecoder`` package from source, see the `native_client README file <native_client/README.md>`_.
+This command will download and install the ``ds_ctcdecoder`` package. You can override the platform with ``--arch`` if you want the package for ARM7 (\ ``--arch arm``\ ) or ARM64 (\ ``--arch arm64``\ ). If you prefer building the ``ds_ctcdecoder`` package from source, see the `native_client README file <native_client/README.rst>`_.
 
 Recommendations
 ^^^^^^^^^^^^^^^
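As a hedged illustration of the ``--arch`` override documented in the hunk above: the helper function ``decoder_install_cmd`` below is hypothetical (only the ``--decoder`` and ``--arch`` flags come from the documentation); it merely prints the install command for a chosen platform rather than running it.

```shell
# Sketch: print the ds_ctcdecoder install command for a target platform.
# `decoder_install_cmd` is a hypothetical helper; --decoder and --arch
# are the flags documented above. No argument means the host platform.
decoder_install_cmd() {
  if [ -n "$1" ]; then
    # Single quotes keep $(...) literal so the reader sees the full command.
    printf 'pip3 install $(python3 util/taskcluster.py --decoder --arch %s)\n' "$1"
  else
    printf 'pip3 install $(python3 util/taskcluster.py --decoder)\n'
  fi
}

decoder_install_cmd          # host platform
decoder_install_cmd arm      # ARM7
decoder_install_cmd arm64    # ARM64
```

Running the printed command (rather than just printing it) would perform the actual install described above.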
@@ -155,7 +155,7 @@ Exporting a model for inference
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 If the ``--export_dir`` parameter is provided, a model will have been exported to this directory during training.
-Refer to the corresponding `README.md <native_client/README.md>`_ for information on building and running a client that can use the exported model.
+Refer to the corresponding `README.rst <native_client/README.rst>`_ for information on building and running a client that can use the exported model.
 
 Exporting a model for TFLite
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -7,7 +7,7 @@ Inference using a DeepSpeech pre-trained model can be done with a client/languag
 * `The Python package/language binding <#using-the-python-package>`_
 * `The Node.JS package/language binding <#using-the-nodejs-package>`_
 * `The Command-Line client <#using-the-command-line-client>`_
-* `The .NET client/language binding <native_client/dotnet/README.md>`_
+* `The .NET client/language binding <native_client/dotnet/README.rst>`_
 
 Running ``deepspeech`` might, see below, require some runtime dependencies to be already installed on your system:
 
@@ -161,12 +161,12 @@ Note: the following command assumes you `downloaded the pre-trained model <#gett
 
 ./deepspeech --model models/output_graph.pbmm --alphabet models/alphabet.txt --lm models/lm.binary --trie models/trie --audio audio_input.wav
 
-See the help output with ``./deepspeech -h`` and the `native client README <native_client/README.md>`_ for more details.
+See the help output with ``./deepspeech -h`` and the `native client README <native_client/README.rst>`_ for more details.
 
 Installing bindings from source
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-If pre-built binaries aren't available for your system, you'll need to install them from scratch. Follow these `\ ``native_client`` installation instructions <native_client/README.md>`_.
+If pre-built binaries aren't available for your system, you'll need to install them from scratch. Follow these `\ ``native_client`` installation instructions <native_client/README.rst>`_.
 
 Third party bindings
 ^^^^^^^^^^^^^^^^^^^^
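The ``./deepspeech`` invocation in the hunk above follows a fixed file layout under a model directory. As a hedged sketch, the hypothetical helper ``stt_cmd`` below assembles (and only prints) that command from a directory and a WAV path; the flag names and file names are exactly those shown above.

```shell
# Sketch: assemble the ./deepspeech command shown above from a model
# directory and an input WAV. `stt_cmd` is a hypothetical helper;
# the flags (--model, --alphabet, --lm, --trie, --audio) and the
# file names inside the directory come from the documented command.
stt_cmd() {
  dir="$1"
  wav="$2"
  printf './deepspeech --model %s/output_graph.pbmm --alphabet %s/alphabet.txt --lm %s/lm.binary --trie %s/trie --audio %s\n' \
    "$dir" "$dir" "$dir" "$dir" "$wav"
}

stt_cmd models audio_input.wav
```

Printing rather than executing keeps the sketch runnable without the DeepSpeech binary or model files installed.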
@@ -166,7 +166,7 @@ The path of the system tree can be overridden from the default values defined in
 Android devices
 ^^^^^^^^^^^^^^^
 
-We have preliminary support for Android relying on TensorFlow Lite, with Java and JNI bindinds. For more details on how to experiment with those, please refer to ``native_client/java/README.md``.
+We have preliminary support for Android relying on TensorFlow Lite, with Java and JNI bindinds. For more details on how to experiment with those, please refer to ``native_client/java/README.rst``.
 
 Please refer to TensorFlow documentation on how to setup the environment to build for Android (SDK and NDK required).
 
@@ -2,7 +2,7 @@
 DeepSpeech Java / Android bindings
 ==================================
 
-This is still preliminary work. Please refer to ``native_client/README.md`` for
+This is still preliminary work. Please refer to ``native_client/README.rst`` for
 building ``libdeepspeech.so`` and ``deepspeech`` binary for Android on ARMv7 and
 ARM64 arch.
 
@@ -13,4 +13,4 @@ bdist-dir=temp_build
 build-dir=temp_build
 
 [metadata]
-description-file = ../README.md
+description-file = ../README.rst