
Questions tagged [onnxruntime]

ONNX Runtime is a cross-platform inference and training machine-learning accelerator.

0 votes
0 answers
14 views

How to perform Faster Object detection on webcam using onnx runtime web?

This is the JavaScript script that performs object detection accurately, but the processing takes time, which creates lag between frames. There is no smoothness in it. This is the script from some code ...
asked by Arsalan Jibran
0 votes
0 answers
14 views

"CUDA failure 700" when using the onnxruntime backend optimized with TensorRT in Triton

I want to deploy my ONNX model using Triton. Here is my model configuration, which works fine when using one specified GPU. name: "yolox" platform: "onnxruntime_onnx" ...
asked by Aitar • 23
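CUDA failure 700 is an illegal-memory-access error; with a TensorRT-optimized ONNX Runtime backend it often surfaces when instances are scheduled across GPUs other than the one the engine was built for. A common mitigation is pinning instance placement in the model's config.pbtxt. The fragment below is a hedged sketch using the model name from the question; the precision setting is an assumption:

```
name: "yolox"
platform: "onnxruntime_onnx"
# Pin each instance to an explicit GPU instead of letting Triton
# spread instances across all visible devices.
instance_group [
  { count: 1, kind: KIND_GPU, gpus: [ 0 ] }
]
optimization {
  execution_accelerators {
    gpu_execution_accelerator: [ {
      name: "tensorrt"
      parameters { key: "precision_mode" value: "FP16" }
    } ]
  }
}
```

Triton builds (and caches) a separate TensorRT engine per device, so keeping instances on known GPUs also avoids rebuilds on first request.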
0 votes
0 answers
19 views

How to convert onnx with onnx.data to openvino IR format

I am using mo to convert ONNX to the OpenVINO IR format. But when it encounters the onnx and onnx.data pair, it reports an error. mo --input_model G:\convert_model\onnx-fp16\text_encoder\model.onnx --input_shape [1,77]...
asked by littlestone
-2 votes
0 answers
19 views

inference in the browser [closed]

I am trying to perform inference in the browser using a webcam as the source, but I am struggling to find a working solution. Do you have any ideas or guidelines on how to achieve this by loading an onnx ...
asked by Gelso77 • 1,881
0 votes
0 answers
12 views

ONNXRuntimeError for exported torchvision / fasterrcnn_mobilenet_v3_large_fpn

I get an ONNXRuntimeError when running a FasterRCNN model exported to .onnx. Error: onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero ...
asked by Julio Milani
0 votes
0 answers
19 views

Failed to load external data file "onnx model" in ONNX Runtime in a DevExpress application

I have put the ONNX model in my wwwroot/js folder, but it is not being detected and I am getting this error. error = Error: failed to load external data file: js/sevensegment.onnx at gn (https://cdn....
asked by Arsalan Jibran
0 votes
1 answer
34 views

Getting errors in `wgpu` crate when used as a dependency of `wonnx`

When I try to build my project, I get these errors in the wgpu crate, which is a dependency of the wonnx crate that I am using. ❯ $env:RUSTFLAGS="--cfg=web_sys_unstable_apis"; wasm-pack ...
asked by Jacob Marshall
0 votes
0 answers
24 views

Export a teknium/OpenHermes-2.5-Mistral-7B model to ONNX

I am trying to export teknium/OpenHermes-2.5-Mistral-7B to ONNX. This is my code: import torch from transformers import AutoModelForCausalLM, AutoTokenizer import onnx model_name = "teknium/...
asked by mohammed yazid Berrached
0 votes
0 answers
18 views

How to export a Temporal Fusion Transformer (TFT) model as an ONNX model?

I have a TFT model that performs timeseries forecasting using the get_stallion dataset. See the link below for reference: https://pytorch-forecasting.readthedocs.io/en/stable/tutorials/stallion....
asked by Sudeeksha Vandrangi
0 votes
0 answers
15 views

Android ONNX Runtime: running multiple models on multiple threads

I am trying to run two different models on an Android device, and I successfully made two different sessions and ran both models. But the problem is that they run in a serialized manner, which is ...
asked by JS J • 43
-1 votes
0 answers
41 views

I converted a Torch model to ONNX with a dynamic input size, but when I run inference with the C++ ONNX Runtime SDK it raises an error

onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Mul node. Name:'/encoders0/encoders0.0/self_attn/...
asked by weiyang hu
1 vote
1 answer
92 views

SAM build can't resolve dependencies for 'onnxruntime'

Summary of my problem: I am trying to implement YOLO on AWS Lambda, and when trying to build a Lambda Layer with ONNX, I get Error: PythonPipBuilder:ResolveDependencies - {onnxruntime==1.18.1(wheel)}. ...
asked by Adcade • 112
1 vote
1 answer
76 views

ONNX-Python: Can someone explain the CalibrationDataReader requested by the quantize_static() function?

I am using the ONNX Python library. I am trying to quantize AI models statically using the quantize_static() function imported from onnxruntime.quantization. This function takes a ...
asked by Zylon • 11
0 votes
0 answers
23 views

How to encode data passed to/from remote ONNX models via HTTP?

Assume we have a remote ONNX ML model (running on onnxruntime), and we want to expose it via a REST API for making predictions. How to properly encode the inputs/outputs passed in HTTP messages? As ...
asked by Quasy • 29
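One workable scheme (an assumption, not a standard — serving stacks like Triton and KServe define their own tensor-over-HTTP protocols) is to send each tensor as JSON carrying base64-encoded raw bytes plus dtype and shape metadata, so the server can reconstruct the exact numpy array to feed the session:

```python
# Sketch: a minimal custom JSON encoding for tensors over HTTP.
import base64
import json

import numpy as np


def encode_tensor(arr):
    # dtype and shape travel alongside the raw bytes so the receiver
    # can rebuild the array exactly.
    return {
        "dtype": str(arr.dtype),
        "shape": list(arr.shape),
        "data": base64.b64encode(arr.tobytes()).decode("ascii"),
    }


def decode_tensor(obj):
    raw = base64.b64decode(obj["data"])
    return np.frombuffer(raw, dtype=obj["dtype"]).reshape(obj["shape"])


# Round-trip a small tensor through the JSON wire format.
payload = json.dumps(
    {"input": encode_tensor(np.arange(6, dtype=np.float32).reshape(2, 3))}
)
restored = decode_tensor(json.loads(payload)["input"])
```

Base64 adds roughly 33% size overhead; for large tensors, sending the raw bytes in the HTTP body with dtype/shape in headers (or adopting an existing protocol such as the KServe v2 inference protocol) avoids that cost.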
0 votes
1 answer
138 views

Onnxruntime Test Error after Successfully Converting Model to ONNX

I have a simple PyTorch model that I'm attempting to convert to ONNX format. The forward() function consists of a call to nn.transformer.encoder(). The conversion to ONNX using torch.onnx.export() ...
asked by Dodiak • 149
