phys-npps-members-l AT lists.bnl.gov
Subject: ALL NPPS Members
List archive
Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0
- From: "Ye, Shuwei" <yesw AT bnl.gov>
- To: "Galgoczi, Gabor (PO)" <ggalgoczi1 AT bnl.gov>, Torre Wenaus <wenaus AT gmail.com>
- Cc: NPPS members <phys-npps-members-l AT lists.bnl.gov>
- Subject: Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0
- Date: Fri, 13 Mar 2026 22:12:49 +0000
Dear Gabor,
I am responsible for managing the Ollama server on npps0. I can look into configuring the server to use only one GPU and limiting the VRAM to 24 GB.
Best regards,
--Shuwei
From: Galgoczi, Gabor (PO) <ggalgoczi1 AT bnl.gov>
Sent: Friday, March 13, 2026 4:18 PM
To: Torre Wenaus <wenaus AT gmail.com>; Ye, Shuwei <yesw AT bnl.gov>
Cc: NPPS members <phys-npps-members-l AT lists.bnl.gov>
Subject: Re: Instructions for Offsite Access to Ollama Server on npps0
Dear All,
Who is the owner of the Ollama process? Could you restrict it to GPU0 by setting CUDA_VISIBLE_DEVICES=0? The VRAM it uses could also be limited via the OLLAMA_MAX_VRAM setting.
When the server is running, it uses most of the VRAM on both GPUs. We cannot do eic-opticks work, and our GitHub CI tests also fail due to insufficient free memory.
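(For reference, a minimal sketch of the settings suggested above, assuming Ollama runs as a systemd service. The override file path, the unit name ollama.service, and the 24 GiB cap are assumptions for illustration, not confirmed details of the npps0 setup; OLLAMA_MAX_VRAM is given in bytes.)

```ini
# Hypothetical drop-in override, e.g. /etc/systemd/system/ollama.service.d/gpu-limit.conf
[Service]
# Hide the second GPU from the Ollama process.
Environment="CUDA_VISIBLE_DEVICES=0"
# Cap Ollama's VRAM use at 24 GiB (value in bytes: 24 * 1024^3).
Environment="OLLAMA_MAX_VRAM=25769803776"
```

Applying such an override would require a systemctl daemon-reload followed by a restart of the Ollama service.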
Thank you,
Gabor
From: phys-npps-members-l-request AT lists.bnl.gov <phys-npps-members-l-request AT lists.bnl.gov> on behalf of Ye, Shuwei <yesw AT bnl.gov>
Sent: Wednesday, March 11, 2026 11:11 AM
To: Torre Wenaus <wenaus AT gmail.com>
Cc: NPPS members <phys-npps-members-l AT lists.bnl.gov>
Subject: [[Phys-npps-members-l] ] Instructions for Offsite Access to Ollama Server on npps0
Dear Torre,
You can find the detailed instructions for offsite access to the Ollama server on our group machine npps0 in the following document:
For example, to use the model qwen3.5:35b via the claude CLI on your laptop, follow these steps:

1. Establish an SSH tunnel to forward local port 1080 to the remote Ollama server:

   ssh -f -N -L 1080:130.199.21.114:11434 ssh.bnl.gov

2. Set the required environment variables:

   export ANTHROPIC_AUTH_TOKEN=ollama
   export ANTHROPIC_API_KEY=""
   export ANTHROPIC_BASE_URL=http://localhost:1080

3. Launch Claude with the specified model:

   claude --model qwen3.5:35b
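(A quick way to sanity-check the tunnel before launching claude, assuming the SSH forward from step 1 is already running: Ollama's /api/tags endpoint lists the installed models, so a non-empty JSON reply confirms the server is reachable through the forwarded port.)

```shell
# Probe the Ollama server through the local end of the tunnel.
# A JSON document with a "models" array indicates the tunnel is working.
curl -s http://localhost:1080/api/tags
```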
Please let me know if you encounter any issues or need further assistance.
Best regards,
--Shuwei
- [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Ye, Shuwei, 03/11/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Galgoczi, Gabor (PO), 03/13/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Ye, Shuwei, 03/13/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Smirnov, Dmitri, 03/13/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Ye, Shuwei, 03/16/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Zhaoyu Yang, 03/16/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Ye, Shuwei, 03/16/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Galgoczi, Gabor (PO), 03/13/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Smirnov, Dmitri, 03/13/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Ye, Shuwei, 03/13/2026
- Re: [Phys-npps-members-l] Instructions for Offsite Access to Ollama Server on npps0, Galgoczi, Gabor (PO), 03/13/2026
Archive powered by MHonArc 2.6.24.