Factory for constructing InterpreterApi instances.
Deprecated; please use the InterpreterApi.create method instead.
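Since this factory is deprecated in favor of the static InterpreterApi.create method, here is a minimal sketch of the recommended path. The model file name and the thread-count option value are placeholders, not taken from this page.

```java
import org.tensorflow.lite.InterpreterApi;

import java.io.File;

public class RecommendedCreateSketch {
  public static void main(String[] args) throws Exception {
    // Options customize interpreter behavior; two threads is an arbitrary example value.
    InterpreterApi.Options options = new InterpreterApi.Options().setNumThreads(2);

    // Preferred replacement for this factory: the static InterpreterApi.create method.
    try (InterpreterApi interpreter =
        InterpreterApi.create(new File("model.tflite"), options)) {
      // Prepare inputs/outputs and call interpreter.run(input, output) here.
    }
  }
}
```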
| Return type | Method and description |
|---|---|
| InterpreterApi | create(File modelFile, InterpreterApi.Options options) Constructs an InterpreterApi instance, using the specified model and options. |
| InterpreterApi | create(ByteBuffer byteBuffer, InterpreterApi.Options options) Constructs an InterpreterApi instance, using the specified model and options. |
Constructs an InterpreterApi instance, using the specified model and options. The model will be loaded from a file.

| Parameter | Description |
|---|---|
| modelFile | A file containing a pre-trained TF Lite model. |
| options | A set of options for customizing interpreter behavior. |

| Exception | Condition |
|---|---|
| IllegalArgumentException | If modelFile does not encode a valid TensorFlow Lite model. |
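A minimal sketch of this File overload follows; it assumes the enclosing factory class is org.tensorflow.lite.InterpreterFactory with a public no-argument constructor (the class name is not shown on this page), and the model path is a placeholder.

```java
import org.tensorflow.lite.InterpreterApi;
import org.tensorflow.lite.InterpreterFactory;

import java.io.File;

public class FileCreateSketch {
  public static void main(String[] args) throws Exception {
    // A file containing a pre-trained TF Lite model (placeholder path).
    File modelFile = new File("model.tflite");
    InterpreterApi.Options options = new InterpreterApi.Options();

    // Throws IllegalArgumentException if modelFile does not encode a valid model.
    try (InterpreterApi interpreter = new InterpreterFactory().create(modelFile, options)) {
      // interpreter.run(input, output) once input and output buffers are prepared.
    }
  }
}
```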
Constructs an InterpreterApi instance, using the specified model and options. The model will be read from a ByteBuffer.

| Parameter | Description |
|---|---|
| byteBuffer | A pre-trained TF Lite model, in binary serialized form. The ByteBuffer should not be modified after the construction of an InterpreterApi instance. It can be either a MappedByteBuffer that memory-maps a model file, or a direct ByteBuffer of nativeOrder() that contains the byte content of a model. |
| options | A set of options for customizing interpreter behavior. |

| Exception | Condition |
|---|---|
| IllegalArgumentException | If byteBuffer is neither a MappedByteBuffer nor a direct ByteBuffer of nativeOrder(). |
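A minimal sketch of this ByteBuffer overload using a memory-mapped model file; the factory class name (InterpreterFactory) and the model path are assumptions not stated on this page.

```java
import org.tensorflow.lite.InterpreterApi;
import org.tensorflow.lite.InterpreterFactory;

import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class ByteBufferCreateSketch {
  public static void main(String[] args) throws Exception {
    try (FileChannel channel =
        FileChannel.open(Paths.get("model.tflite"), StandardOpenOption.READ)) {
      // Memory-map the model; the buffer must not be modified after the interpreter is built.
      MappedByteBuffer model =
          channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());

      // Throws IllegalArgumentException if the buffer is neither a MappedByteBuffer
      // nor a direct ByteBuffer in native byte order.
      try (InterpreterApi interpreter =
          new InterpreterFactory().create(model, new InterpreterApi.Options())) {
        // interpreter.run(input, output) once tensors are prepared.
      }
    }
  }
}
```

Per the parameter description above, a direct ByteBuffer in nativeOrder() holding the model bytes would work in place of the MappedByteBuffer.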