
phys-npps-members-l - Re: [[Phys-npps-members-l] ] Wed NPPS AI/ML agenda

phys-npps-members-l AT lists.bnl.gov

  • From: "Ye, Shuwei" <yesw AT bnl.gov>
  • To: "Viren, Brett" <bviren AT bnl.gov>, Torre Wenaus <wenaus AT gmail.com>
  • Cc: NPPS members <Phys-npps-members-l AT lists.bnl.gov>, "Qian, Xin" <xqian AT bnl.gov>, "Yu, Haiwang" <hyu AT bnl.gov>
  • Subject: Re: [[Phys-npps-members-l] ] Wed NPPS AI/ML agenda
  • Date: Fri, 12 Sep 2025 17:02:16 +0000

Hi Brett,

Have you tried aider's --watch-files feature? It provides inline AI chat right in your editor.

For example,

$ aider --no-git --watch-files /tmp/yesw/Convert-rst.sh

$ vim /tmp/yesw/Convert-rst.sh

Inside the opened file, I can just write something like:

# Define a function to print out the usage help AI!

Saving the file triggers the aider session, which reads your prompt and inserts the function into the file for you.
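For illustration, the result of that "AI!" prompt might look something like the sketch below. This is purely hypothetical output (the function name and option are mine, not aider's), just to show the kind of edit it makes:

```shell
#!/bin/sh
# Hypothetical sketch of what aider might insert for the prompt
# "Define a function to print out the usage help AI!".
usage() {
    # Print a short usage summary for this script.
    echo "Usage: $(basename "$0") [options] <input.rst>"
    echo "Options:"
    echo "  -h    show this help and exit"
}

usage
```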

Cheers,

--Shuwei


From: phys-npps-members-l-request AT lists.bnl.gov <phys-npps-members-l-request AT lists.bnl.gov> on behalf of Torre Wenaus <wenaus AT gmail.com>
Sent: Thursday, September 11, 2025 4:10 PM
To: Viren, Brett <bviren AT bnl.gov>
Cc: NPPS members <Phys-npps-members-l AT lists.bnl.gov>; Qian, Xin <xqian AT bnl.gov>; Yu, Haiwang <hyu AT bnl.gov>
Subject: Re: [[Phys-npps-members-l] ] Wed NPPS AI/ML agenda
 
> Emacs+gptel
Thanks for the tip :-)

On Thu, Sep 11, 2025 at 8:56 AM Brett Viren <bv AT bnl.gov> wrote:
Torre Wenaus <wenaus AT gmail.com> writes:

> Is that what you would call comanage with ollama behind? Or...?

I hadn't heard of COmanage, but reading about it now, it seems to be a
full-featured "auth backend" from InCommon.  My understanding is that
Keycloak is the approximate equivalent that is in popular use at BNL.

So, no, what I meant by "frontend" was the user-touching web app that
intermediates between human and "LLM backend" resources for chat type
interaction.  This UI frontend could (probably should) use COmanage or
Keycloak as an "auth backend".
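As a very rough sketch of what wiring the UI frontend to a Keycloak "auth backend" could look like: Open-WebUI documents OAuth/OIDC configuration via environment variables. The variable names below are from my reading of its docs and the realm/host values are invented placeholders, so verify everything against the current release:

```shell
# Hedged sketch: point Open-WebUI at a Keycloak OIDC provider.
# Variable names per Open-WebUI's OAuth docs (verify for your version);
# realm, client, and hostname are invented placeholders.
export ENABLE_OAUTH_SIGNUP=true
export OAUTH_CLIENT_ID=open-webui
export OAUTH_CLIENT_SECRET='<client secret from the Keycloak client>'
export OPENID_PROVIDER_URL='https://keycloak.example.org/realms/npps/.well-known/openid-configuration'

uv run open-webui serve
```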

I think using Open-WebUI as that UI frontend is a reasonable thing to
consider if the security and auth issues can be sorted.  I do wonder how
Open-WebUI copes when more users than LLM resources hit it.

> Today has not been very positive to Open-WebUI!

Perhaps I need to disclose some caveats of my experience with
Open-WebUI.  My usage was almost least-effort:

  $ uv run open-webui serve

I did go a bit further to try to make it preserve its configuration
after a restart, including the unique admin password I would set (and
reset...).  I was only partly successful.  Sometimes it held and
sometimes it disappeared.  I gave up on that problem and lived with the
race between a restart and first login.  Since it was exposed only to
localhost, it was a safe race to run.  I'd shut down the Open-WebUI
server when I finished each chat session.

Presumably, paying more attention can overcome these problems.  Yet more
would be needed to use an auth backend.
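(For what it's worth, the persistence problem might come down to giving Open-WebUI a stable data directory. The sketch below assumes its documented DATA_DIR variable and SQLite-backed state; check the docs for your version:)

```shell
# Sketch: keep Open-WebUI's state (including the admin account)
# in a fixed location across restarts.  DATA_DIR is per my reading
# of the Open-WebUI docs; the path is an arbitrary choice.
export DATA_DIR="$HOME/.local/share/open-webui"
mkdir -p "$DATA_DIR"
uv run open-webui serve
```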

The other caveat is that I've stopped using Open-WebUI for the past few
months.  It is possible that today's version is better behaved.

The combination of the free Gemini API and Emacs+gptel has fully
spoiled me away from Open-WebUI and also reduced my use of Ollama.  But,
clearly that's not a solution for Ask PanDA! :)

-Brett.


--
-- Torre Wenaus, BNL NPPS Group Leader, ATLAS and ePIC experiments
-- BNL 510A 1-222 | 631-681-7892



Archive powered by MHonArc 2.6.24.
