3. Functions

3.1. init_and_finish

This section describes the TopsInference init and finish functions.

3.1.1. TopsInference_init

This section describes the TopsInference topsInference_init function.

TOPS_INFERENCE_EXPORT TIFStatus topsInference_init()

SDK initialization function. Call it only once, at the beginning of the process.

Returns

TIFStatus

3.1.2. TopsInference_finish

This section describes the TopsInference topsInference_finish function.

TOPS_INFERENCE_EXPORT TIFStatus topsInference_finish()

SDK finish function. Call it only once, at the end of the process.
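
Note

Example (a minimal lifecycle sketch; the namespace qualification follows the other examples in this document, and the application code in between is illustrative):

// Initialize the TopsInference SDK once, at the beginning of the process.
TopsInference::topsInference_init();

// ... create parsers, optimizers and engines, and run inference here ...

// Finish the SDK once, at the end of the process.
TopsInference::topsInference_finish();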

Returns

TIFStatus

3.2. device

This section describes the TopsInference device functions.

typedef void *handler_t

3.2.1. Set_device

This section describes the TopsInference set_device function.

TOPS_INFERENCE_EXPORT handler_t set_device(uint32_t card_id, const uint32_t *cluster_ids, uint32_t cluster_ids_size = 1, IErrorManager *error_manager = nullptr)

Specify the running device until it is released. Under multi-processing, set_device calls are isolated from each other.

  1. Under multi-threading, a sub-thread exclusively uses the resources it claims if set_device is called within that sub-thread.

  2. If set_device() is called in the main thread but not in a sub-thread, the sub-thread shares the cluster resources claimed by the main thread.

  3. If both the main thread and a sub-thread claim resources with set_device(), the sub-thread uses the resources it claimed itself.

  4. If some sub-threads claim resources with set_device() and others do not, each sub-thread individually follows rule 2 or 3 above, depending on whether it claimed resources.
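
Note

Example (a minimal sketch; the card and cluster IDs are illustrative, and release_device(), described below, is shown for completeness):

// Claim cluster 0 on card 0 until it is released.
uint32_t cluster_ids[] = {0};
TopsInference::handler_t handler =
    TopsInference::set_device(0, cluster_ids, 1);

// ... run inference on the claimed cluster ...

// Free the claimed device resources.
TopsInference::release_device(handler);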

Parameters
  • card_id – Specify the card ID to use

  • cluster_ids – Specify the cluster IDs to use

  • cluster_ids_size – Specify the number of cluster IDs in cluster_ids

Returns

handler_t The handler for the claimed device resources; pass it to release_device()

3.2.2. Release_device

This section describes the TopsInference release_device function.

TOPS_INFERENCE_EXPORT bool release_device(handler_t)

Free the running device.

Parameters

handler_t – the handler_t returned by set_device()

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.3. error_manager

This section describes the TopsInference error_manager functions.

3.3.1. Create_error_manager

This section describes the TopsInference create_error_manager function.

TOPS_INFERENCE_EXPORT IErrorManager *create_error_manager()

Create an instance of IErrorManager.
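
Note

Example (a minimal sketch; the error manager is passed to create_parser(), described later in this document, and how errors are queried from IErrorManager is not shown here):

// Create an error manager and pass it to calls that accept an
// optional IErrorManager*.
TopsInference::IErrorManager* error_manager =
    TopsInference::create_error_manager();

TopsInference::IParser* parser = TopsInference::create_parser(
    TopsInference::TIF_ONNX, error_manager);

// ... check errors reported through the error manager ...

// Release the objects when they are no longer needed.
TopsInference::release_parser(parser);
TopsInference::release_error_manager(error_manager);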

3.3.2. Release_error_manager

This section describes the TopsInference release_error_manager function.

TOPS_INFERENCE_EXPORT bool release_error_manager(IErrorManager *error_manager)

Release ErrorManager object.

Parameters

error_manager – pointer to the IErrorManager

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.4. parser

This section describes the TopsInference parser functions.

3.4.1. Create_parser

This section describes the TopsInference create_parser function.

TOPS_INFERENCE_EXPORT IParser *create_parser(ParserType parse_type, IErrorManager *error_manager = nullptr)

Create an instance of IParser with specified parse_type.

Note

Examples:

TopsInference::IParser* parser = TopsInference::create_parser(
    TopsInference::TIF_ONNX);

Parameters

parse_type – the parser type; use TIF_ONNX to parse an ONNX model

Returns

IParser* pointer to the created IParser, or nullptr if creation fails

3.4.2. Release_parser

This section describes the TopsInference release_parser function.

TOPS_INFERENCE_EXPORT bool release_parser(IParser *parser)

Release Parser object.

Parameters

parser – pointer to the IParser

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.5. optimizer

This section describes the TopsInference optimizer functions.

3.5.1. Create_optimizer

This section describes the TopsInference create_optimizer function.

TOPS_INFERENCE_EXPORT IOptimizer *create_optimizer(IErrorManager *error_manager = nullptr)

Create an instance of IOptimizer.

Note

Examples:

TopsInference::IOptimizer* optimizer = TopsInference::create_optimizer();

Returns

IOptimizer* pointer to the created IOptimizer, or nullptr if creation fails

3.5.2. Release_optimizer

This section describes the TopsInference release_optimizer function.

TOPS_INFERENCE_EXPORT bool release_optimizer(IOptimizer *optimizer)

Release Optimizer object.

Parameters

optimizer – pointer to the IOptimizer

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.6. engine

This section describes the TopsInference engine functions.

3.6.1. Create_engine

This section describes the TopsInference create_engine function.

TOPS_INFERENCE_EXPORT IEngine *create_engine(IErrorManager *error_manager = nullptr)

Create an instance of IEngine.

Note

Examples:

TopsInference::IEngine* engine = TopsInference::create_engine();

Returns

IEngine* pointer to the created IEngine, or nullptr if creation fails

3.6.2. Release_engine

This section describes the TopsInference release_engine function.

TOPS_INFERENCE_EXPORT bool release_engine(IEngine *engine)

Release Engine object.

Parameters

engine – pointer to the IEngine

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.7. network

This section describes the TopsInference network functions.

3.7.1. Release_network

This section describes the TopsInference release_network function.

TOPS_INFERENCE_EXPORT bool release_network(INetwork *network)

Release Network object.

Parameters

network – pointer to the INetwork

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.8. stream

This section describes the TopsInference stream functions.

3.8.1. Create_stream

This section describes the TopsInference create_stream function.

TOPS_INFERENCE_EXPORT bool create_stream(topsInferStream_t *streamCtx)

Create a stream instance. No more than 1000 streams can be created.
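
Note

Example (a minimal lifecycle sketch using synchronize_stream() and destroy_stream(), described below; the asynchronous work is illustrative):

// Create a stream, wait for its pending work, then destroy it.
TopsInference::topsInferStream_t stream;
if (TopsInference::create_stream(&stream)) {
    // ... enqueue asynchronous work on the stream, e.g. mem_copy_async() ...

    // Block until all pending asynchronous operations complete.
    TopsInference::synchronize_stream(stream);

    // Destroy the stream once it is no longer needed.
    TopsInference::destroy_stream(stream);
}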

Parameters

streamCtx – double pointer to the stream to be created

Returns

bool whether the stream was created successfully

3.8.2. Synchronize_stream

This section describes the TopsInference synchronize_stream function.

TOPS_INFERENCE_EXPORT bool synchronize_stream(topsInferStream_t stream)

Synchronize the stream: wait for all pending asynchronous operations in the stream to complete.

Parameters

stream – the stream to synchronize

3.8.3. Destroy_stream

This section describes the TopsInference destroy_stream function.

TOPS_INFERENCE_EXPORT bool destroy_stream(topsInferStream_t stream)

Destroy stream object.

Parameters

stream – the stream to destroy

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.9. future

This section describes the TopsInference future functions.

3.9.1. Create_future

This section describes the TopsInference create_future function.

TOPS_INFERENCE_EXPORT IFuture *create_future()

Create an instance of IFuture.
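
Note

Example (a minimal creation/destruction sketch; how the future is consumed by asynchronous execution APIs is not covered in this section):

TopsInference::IFuture* future = TopsInference::create_future();

// ... pass the future to an asynchronous run and wait on it ...

TopsInference::destroy_future(future);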

3.9.2. Destroy_future

This section describes the TopsInference destroy_future function.

TOPS_INFERENCE_EXPORT bool destroy_future(IFuture *future)

Destroy Future object.

Parameters

future – pointer to the IFuture

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.10. tensor

This section describes the TopsInference tensor functions.

3.10.1. Create_tensor

This section describes the TopsInference create_tensor function.

TOPS_INFERENCE_EXPORT TensorPtr_t create_tensor()

Create an instance of ITensor.
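
Note

Example (a minimal sketch; the comment about preparing the tensor is illustrative, and ITensor's member functions are not covered here):

TopsInference::TensorPtr_t tensor = TopsInference::create_tensor();
if (tensor != nullptr) {
    // ... describe shapes and bind buffers on the tensor ...

    // destroy_tensor() returns a TIFStatus that should be checked.
    TopsInference::destroy_tensor(tensor);
}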

Returns

TensorPtr_t pointer to the created ITensor, or nullptr if creation fails

3.10.2. Destroy_tensor

This section describes the TopsInference destroy_tensor function.

TOPS_INFERENCE_EXPORT TIFStatus destroy_tensor(TensorPtr_t itensor)

Destroy tensor.

See also

Status code

Parameters

itensor – pointer to the tensor object

Returns

TIFStatus

3.11. refit

This section describes the TopsInference refit functions.

3.11.1. Create_refitter

This section describes the TopsInference create_refitter function.

TOPS_INFERENCE_EXPORT IRefitter *create_refitter(IEngine *engine, IErrorManager *error_manager = nullptr)

Create a refitter for an existing engine.
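
Note

Example (a minimal sketch; in practice the engine would typically be built or loaded before refitting, and releasing in reverse order of creation is a convention, not a documented requirement):

// Create a refitter bound to an existing engine.
TopsInference::IEngine* engine = TopsInference::create_engine();
TopsInference::IRefitter* refitter = TopsInference::create_refitter(engine);

// ... supply updated weights through the refitter ...

// Release the objects in reverse order of creation.
TopsInference::release_refitter(refitter);
TopsInference::release_engine(engine);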

Parameters

engine – pointer to the associated IEngine

Returns

IRefitter* pointer to the created IRefitter, or nullptr if creation fails

3.11.2. Release_refitter

This section describes the TopsInference release_refitter function.

TOPS_INFERENCE_EXPORT bool release_refitter(IRefitter *refitter)

Release a refitter object.

Parameters

refitter – pointer to the IRefitter

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.12. memory

This section describes the TopsInference memory functions.

3.12.1. Mem_alloc

This section describes the TopsInference mem_alloc function.

TOPS_INFERENCE_EXPORT bool mem_alloc(void **ptr, int64_t size)

Allocate GCU device memory.

Parameters
  • ptr – double pointer to receive the allocated GCU memory

  • size – size of the GCU memory to allocate

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.12.2. Mem_free

This section describes the TopsInference mem_free function.

TOPS_INFERENCE_EXPORT bool mem_free(void *ptr)

Free GCU device memory.

Parameters

ptr – The allocated buffer address

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.12.3. Mem_copy

This section describes the TopsInference mem_copy function.

TOPS_INFERENCE_EXPORT bool mem_copy(void *src, void *dst, int64_t size, MemcpyKind kind)

Copy memory from GCU device memory to Host memory, or copy memory from Host memory to GCU device memory.

See also

MemcpyKind
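
Note

Example (a minimal host/device round trip; the MemcpyKind enumerator names TIF_MEMCPY_HOST_TO_DEVICE and TIF_MEMCPY_DEVICE_TO_HOST are assumptions, see MemcpyKind for the actual values):

float host_data[1024] = {};  // host buffer; contents are illustrative
int64_t size = sizeof(host_data);

// Allocate GCU device memory.
void* device_ptr = nullptr;
if (TopsInference::mem_alloc(&device_ptr, size)) {
    // Host -> device copy (enumerator name is an assumption).
    TopsInference::mem_copy(host_data, device_ptr, size,
                            TopsInference::TIF_MEMCPY_HOST_TO_DEVICE);

    // ... run inference that reads device_ptr ...

    // Device -> host copy (enumerator name is an assumption).
    TopsInference::mem_copy(device_ptr, host_data, size,
                            TopsInference::TIF_MEMCPY_DEVICE_TO_HOST);

    // Free the device memory.
    TopsInference::mem_free(device_ptr);
}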

Parameters
  • src – The source buffer address

  • dst – The destination buffer address

  • size – The bytes copied from src to dst

  • kind – The kind of copy to perform

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.12.4. Mem_copy_async

This section describes the TopsInference mem_copy_async function.

TOPS_INFERENCE_EXPORT bool mem_copy_async(void *src, void *dst, int64_t size, MemcpyKind kind, topsInferStream_t stream)

Asynchronously copy memory from GCU device memory to Host memory, or from Host memory to GCU device memory.

See also

MemcpyKind

See also

topsInferStream_t
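
Note

Example (a minimal asynchronous copy sketch; the MemcpyKind enumerator name is an assumption, and the copy is only guaranteed to be complete after the stream is synchronized):

float host_data[256] = {};
int64_t size = sizeof(host_data);
void* device_ptr = nullptr;
TopsInference::mem_alloc(&device_ptr, size);

TopsInference::topsInferStream_t stream;
TopsInference::create_stream(&stream);

// Asynchronous host -> device copy on the stream
// (enumerator name is an assumption, see MemcpyKind).
TopsInference::mem_copy_async(host_data, device_ptr, size,
                              TopsInference::TIF_MEMCPY_HOST_TO_DEVICE, stream);

// Wait for the copy to complete before reading or freeing the buffers.
TopsInference::synchronize_stream(stream);

TopsInference::destroy_stream(stream);
TopsInference::mem_free(device_ptr);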

Parameters
  • src – The source buffer address

  • dst – The destination buffer address

  • size – The bytes copied from src to dst

  • kind – The kind of copy to perform

  • stream – The stream on which the memory copy runs

Returns

bool

Returns

  • true – Returned if the call succeeds

  • false – Returned if the call fails

3.13. topsinference_version

This section describes the TopsInference topsinference_version functions.

3.13.1. version

This section describes the TopsInference version function.

TOPS_INFERENCE_EXPORT VersionInfo version()

TopsInference version information.
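
Note

Example (a minimal query sketch; the fields of VersionInfo are not enumerated in this section):

// Query the TopsInference version information.
TopsInference::VersionInfo info = TopsInference::version();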

Returns

VersionInfo, the TopsInference version information