
[Web] Unable to change log level and hide warning messages when creating inference session from buffer #17377

Closed
xenova opened this issue Aug 31, 2023 · 9 comments
Labels
platform:web issues related to ONNX Runtime web; typically submitted using template

Comments

xenova commented Aug 31, 2023

Describe the issue

It is impossible to disable the warning messages generated when creating an ONNX session if you create the session from a buffer (instead of a file path).

Linked to:

cc @fs-eire @guschmue

To reproduce

  1. Get model files. For this example, I'll use whisper-tiny.en's decoder, but it happens for most models.

    git clone https://huggingface.co/Xenova/whisper-tiny.en

    or just download the single onnx file needed from here.

  2. Run this code:

    import fs from 'fs';
    import ort from 'onnxruntime-node';
    
    const options = {
        executionProviders: ['cpu'],
        graphOptimizationLevel: 'disabled',
    };
    
    const path = './models/whisper-tiny.en/onnx/decoder_model_merged_quantized.onnx';
    
    // Create from a file path: no warnings are printed.
    const session = await ort.InferenceSession.create(path, options);
    console.log(session);
    
    // Create from a buffer: warnings are printed and cannot be suppressed.
    const buffer = fs.readFileSync(path);
    const sessionBroken = await ort.InferenceSession.create(buffer, options);
    console.log(sessionBroken);
  3. See output:
    WORKING:

    m {
      handler: OnnxruntimeSessionHandler {
        inputNames: [...],
        outputNames: [...]
      }
    }
    

    BROKEN:

    2023-08-31 21:25:16.663056737 [W:onnxruntime:, graph.cc:3490 CleanUnusedInitializersAndNodeArgs] Removing initializer '/model/decoder/Shape_4_output_0'. It is not used by any node and should be removed from the model.
    ...
    2023-08-31 21:25:16.718106598 [W:onnxruntime:, graph.cc:3490 CleanUnusedInitializersAndNodeArgs] Removing initializer '/model/decoder/layers.1/final_layer_norm/Constant_1_output_0'. It is not used by any node and should be removed from the model.
    m {
      handler: OnnxruntimeSessionHandler {
        inputNames: [...],
        outputNames: [...]
      }
    }
    

Urgency

Blocks the following issues from being fixed in transformers.js:

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.15.1

Execution Provider

'wasm'/'cpu' (WebAssembly CPU)

@xenova xenova added the platform:web issues related to ONNX Runtime web; typically submitted using template label Aug 31, 2023
@github-actions github-actions bot added the model:transformer issues related to a transformer model: BERT, GPT2, Hugging Face, Longformer, T5, etc. label Aug 31, 2023
@hariharans29 hariharans29 removed the model:transformer issues related to a transformer model: BERT, GPT2, Hugging Face, Longformer, T5, etc. label Sep 26, 2023
hariharans29 (Member) commented

CC @fs-eire @guschmue

fs-eire (Contributor) commented Sep 27, 2023

I built the main branch and tested locally, and I cannot reproduce this issue. I will test on 1.15.1 later.

fs-eire (Contributor) commented Sep 27, 2023

I can reproduce this issue on 1.15.1 and 1.16.0. However, my local build seems to have no issue. I am investigating further.

fs-eire (Contributor) commented Oct 2, 2023

I found the root cause.

This was a bug, and it is fixed in the main branch. Unfortunately, the v1.16 release does not include the fix.

I will check whether we can include this bugfix in the 1.16.1 patch.

xenova (Author) commented Oct 2, 2023

Great, thanks for that!

snnn pushed a commit that referenced this issue Oct 4, 2023
### Description
fix session option access in Node.js binding


### Motivation and Context
This is a bug that affects transformers.js when using the ONNX Runtime Node.js binding. Issue: #17377

This bug is already fixed in the main branch, but the fix was not picked into the 1.16 release.
fs-eire (Contributor) commented Oct 11, 2023

v1.16.1 is released with this fix.

@fs-eire fs-eire closed this as completed Oct 11, 2023
wesbos commented Mar 1, 2024

Getting these again. Is there any way to filter them out? It's overwhelming my console:

2024-03-01 16:26:38.495 node[30406:15443260] 2024-03-01 16:26:38.495795 [W:onnxruntime:, graph.cc:3490 CleanUnusedInitializersAndNodeArgs] Removing initializer '/model/decoder/layers.2/final_layer_norm/Constant_1_output_0'. It is not used by any node and should be removed from the model.

fs-eire (Contributor) commented Mar 1, 2024

@wesbos The issue should be fixed in v1.16.1.

If you are asking how to deal with the warning messages themselves, here are the options:

  1. Optimize the model: re-export it with the latest exporter, or run a model optimizer to remove the unused initializers. This addresses the root cause.
  2. Set logSeverityLevel in the session options to 3 or above so that the messages are not printed.
  3. Set graphOptimizationLevel in the session options to 'disabled' so that the graph optimizer does not run. Not recommended, as this may slow down model inference.
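As a minimal sketch of option 2 (the `SEVERITY` map is purely illustrative; ONNX Runtime takes the numeric value directly), the session options would look like this:

```javascript
// Sketch of option 2: raise logSeverityLevel so WARNING-level messages are hidden.
// SEVERITY is an illustrative helper mapping names to ONNX Runtime's numeric levels.
const SEVERITY = { VERBOSE: 0, INFO: 1, WARNING: 2, ERROR: 3, FATAL: 4 };

const options = {
    executionProviders: ['cpu'],
    // Only messages at ERROR (3) or above are logged;
    // the CleanUnusedInitializersAndNodeArgs warnings are level 2.
    logSeverityLevel: SEVERITY.ERROR,
};

// With onnxruntime-node installed, create the session from a buffer as before:
//   import ort from 'onnxruntime-node';
//   const session = await ort.InferenceSession.create(buffer, options);
```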

AmitDJagtap commented

For those who are still stumped, here is a Node.js snippet to set the log level to 3.

Warning levels:

    VERBOSE = 0,
    INFO = 1,
    WARNING = 2,
    ERROR = 3,
    FATAL = 4

Your code:

    import { pipeline, env } from '@xenova/transformers';
    env.cacheDir = './.cache';
    env.backends.onnx.logSeverityLevel = 3; // this line here
