
'ramalama ps' raises an exception on macOS when no container-based LLMs are running #488

Open
planetf1 opened this issue Nov 24, 2024 · 3 comments

Comments

@planetf1

On macOS Sonoma I have one model running. It does not appear to be running as a container, but it works well, with Apple Silicon Metal/GPU support. podman ps shows a variety of my other containers running, but no LLMs from ramalama.

The command should not hit an exception: it should either report that no containers are running, or, better, include any non-container LLMs being served. In my case that is:

  501 27676  1627   0  4:35pm ttys021    0:00.80 llama-server --port 9999 -m /Users/jonesn/.local/share/ramalama/models/ollama/llama3.1:8b --host 0.0.0.0

Here's the failing command:

~ ramalama ps
Traceback (most recent call last):
  File "/opt/homebrew/bin/ramalama", line 92, in <module>
    main(sys.argv[1:])
    ~~~~^^^^^^^^^^^^^^
  File "/opt/homebrew/bin/ramalama", line 67, in main
    args.func(args)
    ~~~~~~~~~^^^^^^
  File "/opt/homebrew/share/ramalama/ramalama/cli.py", line 390, in list_containers
    if len(_list_containers(args)) == 0:
           ~~~~~~~~~~~~~~~~^^^^^^
  File "/opt/homebrew/share/ramalama/ramalama/cli.py", line 380, in _list_containers
    output = run_cmd(conman_args).stdout.decode("utf-8").strip()
             ~~~~~~~^^^^^^^^^^^^^
  File "/opt/homebrew/share/ramalama/ramalama/common.py", line 96, in run_cmd
    return subprocess.run(args, check=True, cwd=cwd, stdout=stdout, stderr=stderr)
           ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/[email protected]/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/subprocess.py", line 554, in run
    with Popen(*popenargs, **kwargs) as process:
         ~~~~~^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/[email protected]/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/subprocess.py", line 1036, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
    ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                        pass_fds, cwd, env,
                        ^^^^^^^^^^^^^^^^^^^
    ...<5 lines>...
                        gid, gids, uid, umask,
                        ^^^^^^^^^^^^^^^^^^^^^^
                        start_new_session, process_group)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/[email protected]/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/subprocess.py", line 1837, in _execute_child
    and os.path.dirname(executable)
        ~~~~~~~~~~~~~~~^^^^^^^^^^^^
  File "<frozen posixpath>", line 177, in dirname
TypeError: expected str, bytes or os.PathLike object, not NoneType
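
For what it's worth, the TypeError suggests the executable handed to subprocess.run is None, i.e. conman_args[0] is None because neither podman nor docker was found on PATH. A defensive check before running the command would at least turn the crash into a clean "nothing running" result. A minimal sketch only; the container_manager() helper and the ps arguments below are placeholders, not ramalama's actual code:

  def _list_containers(args):
      conman = container_manager()  # placeholder: returns "podman", "docker", or None
      if conman is None:
          # No container engine found: report nothing running instead of crashing
          return []
      conman_args = [conman, "ps"]  # placeholder arguments
      output = run_cmd(conman_args).stdout.decode("utf-8").strip()
      return output.splitlines() if output else []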
@ericcurtin
Collaborator

Sounds like a good idea to me! I guess we just grep the PIDs in Python and display that. Want to take a go at implementing this, @planetf1? Thanks for the feedback!
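
A dependency-free way to do that could be to shell out to ps and filter for llama-server. A rough sketch (the function name and match pattern are assumptions for illustration, not existing ramalama code):

  import subprocess

  def list_serving_processes(pattern="llama-server"):
      # Both BSD ps (macOS) and procps ps (Linux) accept -A and these -o keywords;
      # the trailing "=" suppresses the column headers.
      out = subprocess.run(
          ["ps", "-A", "-o", "pid=,etime=,args="],
          check=True, capture_output=True, text=True,
      ).stdout
      procs = []
      for line in out.splitlines():
          parts = line.strip().split(None, 2)
          if len(parts) < 3:
              continue
          pid, etime, command = parts
          if pattern in command:
              procs.append({"pid": pid, "uptime": etime, "command": command})
      return procs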

@ericcurtin
Collaborator

And we should try to make this technique portable so that it works across macOS and Linux (while adding no dependencies). If we must, we can do:

if macos:
    do this
elif linux and not using containers:
    do this
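
Since the ps invocation sketched above works with both the BSD and procps versions, the two branches might collapse to one fallback. A sketch of the dispatch (args.container is only a stand-in for however ramalama tracks whether containers are in use):

  import sys

  use_ps_fallback = sys.platform == "darwin" or (
      sys.platform.startswith("linux") and not args.container
  )
  if use_ps_fallback:
      # macOS, or Linux without a container engine: scan processes directly
      running = list_serving_processes()
  else:
      running = _list_containers(args)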

@rhatdan
Member

rhatdan commented Nov 25, 2024

Did you have neither docker nor podman installed?
