hotfix: enum inference type wrong string value #1704

Triggered via pull request on June 11, 2024, 15:32
Status: Failure
Total duration: 15m 8s

pr-check.yaml

on: pull_request
Matrix: lint-format-unit

Annotations (12 errors)
src/utils/inferenceUtils.spec.ts > parseInferenceType > llamacpp should return the proper InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L95
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:95:44
src/utils/inferenceUtils.spec.ts > getInferenceType > single model with llamacpp backend should return InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L121
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:121:7
src/utils/inferenceUtils.spec.ts > getInferenceType > multiple model with llamacpp backend should return InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L134
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:134:7
linter, formatters and unit tests / macos-12
Process completed with exit code 1.
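
All twelve annotations reduce to the same assertion: the code under test returns 'none' where the tests expect the enum string value 'llama-cpp'. Based on the test titles and spec locations above, the failing expectations presumably look something like the sketch below; only the function names, the enum member, and the locations come from the CI output, while the import path and argument shapes are assumptions.

import { expect, test } from 'vitest';
// Hypothetical reconstruction: parseInferenceType, getInferenceType and
// InferenceType.LLAMA_CPP appear in the CI output; the import path and
// the shape of the model objects are assumptions.
import { parseInferenceType, getInferenceType, InferenceType } from './inferenceUtils';

test('llamacpp should return the proper InferenceType.LLAMA_CPP', () => {
  // inferenceUtils.spec.ts:95 fails with: expected 'none' to be 'llama-cpp'
  expect(parseInferenceType('llamacpp')).toBe(InferenceType.LLAMA_CPP);
});

test('single model with llamacpp backend should return InferenceType.LLAMA_CPP', () => {
  // inferenceUtils.spec.ts:121 hits the same assertion via getInferenceType
  expect(getInferenceType([{ backend: 'llamacpp' }])).toBe(InferenceType.LLAMA_CPP);
});

The same three failures repeat verbatim on the ubuntu-22.04 and windows-2022 jobs: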
src/utils/inferenceUtils.spec.ts > parseInferenceType > llamacpp should return the proper InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L95
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:95:44
src/utils/inferenceUtils.spec.ts > getInferenceType > single model with llamacpp backend should return InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L121
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:121:7
src/utils/inferenceUtils.spec.ts > getInferenceType > multiple model with llamacpp backend should return InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L134
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:134:7
linter, formatters and unit tests / ubuntu-22.04
Process completed with exit code 1.
src/utils/inferenceUtils.spec.ts > parseInferenceType > llamacpp should return the proper InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L95
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:95:44
src/utils/inferenceUtils.spec.ts > getInferenceType > single model with llamacpp backend should return InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L121
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:121:7
src/utils/inferenceUtils.spec.ts > getInferenceType > multiple model with llamacpp backend should return InferenceType.LLAMA_CPP: packages/backend/src/utils/inferenceUtils.spec.ts#L134
AssertionError: expected 'none' to be 'llama-cpp' // Object.is equality
- Expected
+ Received
- llama-cpp
+ none
❯ src/utils/inferenceUtils.spec.ts:134:7
linter, formatters and unit tests / windows-2022
Process completed with exit code 1.
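
The PR title ("enum inference type wrong string value") points at the likely cause: an enum member carrying a string value that no longer matches what the parser or its callers compare against. One way this produces exactly the received value above is a value-based lookup that misses and falls back to NONE. A minimal sketch, assuming the parser matches the incoming string against the enum values; only LLAMA_CPP ('llama-cpp') and NONE ('none') are confirmed by the assertion messages:

// Minimal sketch of the suspected failure mode; the real enum lives in
// packages/backend and may differ. Only the two members below are
// confirmed by the assertion messages in the log.
enum InferenceType {
  LLAMA_CPP = 'llama-cpp',
  NONE = 'none',
}

// If the lookup matches on enum *values*, a member briefly carrying the
// wrong string (e.g. 'llamacpp' instead of 'llama-cpp', or vice versa)
// makes every comparison miss and fall through to NONE -- which is
// exactly the 'none' the failing tests receive.
function parseInferenceType(type: string | undefined): InferenceType {
  const match = Object.values(InferenceType).find(value => value === type);
  return match ?? InferenceType.NONE;
}

Correcting the enum member's string value, which the hotfix title suggests is the change in this PR, would presumably make the lookup hit again and all three matrix jobs pass.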