Utility scripts
===============

This folder contains scripts that can be used to run training from the command line on the various datasets covered by the included importers. This is useful for running training without a browser open, or unattended on a remote machine. The scripts should be run from the base directory of the repository. Note that the default settings assume a very well-specified machine. If you encounter out-of-memory errors, decreasing the values of ``--train_batch_size``, ``--dev_batch_size`` and ``--test_batch_size`` should allow training to continue, at the expense of speed.
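
For example, one of the training scripts in this folder might be invoked with reduced batch sizes along these lines (the script name and batch-size values below are illustrative; check the individual scripts for the flags they actually pass through):

.. code-block:: bash

   # Run from the repository root. Lower the batch sizes if the
   # defaults exhaust GPU memory; smaller batches train more slowly
   # but need less memory.
   ./bin/run-ldc93s1.sh \
       --train_batch_size 8 \
       --dev_batch_size 8 \
       --test_batch_size 8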