Restore updated documentation on TFLite binary size
PiperOrigin-RevId: 302990033
Change-Id: I2ac25dff0d5e53f0c1b82f8f88fde1e5d52bb893

commit 4997cdbbaa
parent 43121ac128
@@ -28,9 +28,10 @@ improve:
 TensorFlow Lite works with a huge range of devices, from tiny microcontrollers
 to powerful mobile phones.
 
-Key Point: The TensorFlow Lite binary is smaller than 300KB when all supported
-operators are linked, and less than 200KB when using only the operators needed
-for supporting the common image classification models InceptionV3 and MobileNet.
+Key Point: The TensorFlow Lite binary is ~1MB when all 125+ supported operators
+are linked (for 32-bit ARM builds), and less than 300KB when using only the
+operators needed for supporting the common image classification models
+InceptionV3 and MobileNet.
 
 ## Get started
 
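The updated text states that linking only the operators a model needs keeps the binary well below the full ~1MB footprint. A minimal C++ sketch of that idea follows, assuming a MobileNet-style classifier: instead of the all-ops `BuiltinOpResolver`, a `MutableOpResolver` registers just the builtin kernels the model uses, so unreferenced kernels can be stripped at link time. The model filename and the specific operator list are illustrative assumptions, not part of this commit.

```cpp
// Sketch: selective operator registration to keep the TFLite binary small.
// The model path and op set below are illustrative, not from the docs.
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/builtin_op_kernels.h"
#include "tensorflow/lite/model.h"
#include "tensorflow/lite/mutable_op_resolver.h"

int main() {
  // Load a (hypothetical) MobileNet-style classification model.
  auto model = tflite::FlatBufferModel::BuildFromFile("mobilenet_v1.tflite");
  if (!model) return 1;

  // Register only the builtin ops this model actually uses, rather than the
  // full BuiltinOpResolver; the linker can then drop every other kernel.
  tflite::MutableOpResolver resolver;
  resolver.AddBuiltin(tflite::BuiltinOperator_CONV_2D,
                      tflite::ops::builtin::Register_CONV_2D());
  resolver.AddBuiltin(tflite::BuiltinOperator_DEPTHWISE_CONV_2D,
                      tflite::ops::builtin::Register_DEPTHWISE_CONV_2D());
  resolver.AddBuiltin(tflite::BuiltinOperator_AVERAGE_POOL_2D,
                      tflite::ops::builtin::Register_AVERAGE_POOL_2D());
  resolver.AddBuiltin(tflite::BuiltinOperator_RESHAPE,
                      tflite::ops::builtin::Register_RESHAPE());
  resolver.AddBuiltin(tflite::BuiltinOperator_SOFTMAX,
                      tflite::ops::builtin::Register_SOFTMAX());

  // Build and prepare the interpreter with the reduced resolver.
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter || interpreter->AllocateTensors() != kTfLiteOk) return 1;
  return 0;
}
```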