Re: [Phys-npps-members-l] Wed NPPS AI/ML agenda
- From: Brett Viren <bv AT bnl.gov>
- To: "Ye, Shuwei" <yesw AT bnl.gov>
- Cc: Torre Wenaus <wenaus AT gmail.com>, NPPS members <Phys-npps-members-l AT lists.bnl.gov>, "Qian, Xin" <xqian AT bnl.gov>, "Yu, Haiwang" <hyu AT bnl.gov>
- Subject: Re: [Phys-npps-members-l] Wed NPPS AI/ML agenda
- Date: Fri, 12 Sep 2025 13:46:26 -0400
Hi Shuwei,
No, I have not tried this. It looks useful.
In general, I've not been able to get "friendly" with aider despite a
couple of tries. That's not to discourage others: I know aider is very
powerful and popular.
In fact, most of my use of Emacs+gptel is via its chat-mode UI. There,
the big win for me is having the chat session expressed in org-mode
markup (Markdown is an option) and saved to a regular file on my own
disk. I can then "grep" the library of chats to revisit and possibly
restart them. That's far more useful than, e.g., Gemini's web UI.
The tasks I give the LLM are usually conceptual planning or problem
solving, or I use the LLM as a user/developer manual.
Emacs+gptel has ways to let the LLM directly edit code, but I've not yet
gotten friendly with those either.
I guess I'm still a little uncomfortable letting an LLM do wholesale code
editing on most of my "real" projects. Though I'm not above some
vibing. :)
Cheers,
-Brett.
"Ye, Shuwei" <yesw AT bnl.gov> writes:
> Hi Brett,
>
> Have you tried the aider feature --watch-files? It can provide inline
> AI chat.
>
> For example,
>
> $ aider --no-git --watch-files /tmp/yesw/Convert-rst.sh
>
> $ vim /tmp/yesw/Convert-rst.sh
>
> Inside the opened file, I can just write something like:
>
> # Define a function to print out the usage help AI!
>
> It will trigger the aider session to read your prompt and insert the
> function into that file for you.
>
> Cheers,
>
> --Shuwei
>
>
> ---------------------------------------------------------------------------------------
> From: phys-npps-members-l-request AT lists.bnl.gov
> <phys-npps-members-l-request AT lists.bnl.gov> on behalf of
> Torre Wenaus <wenaus AT gmail.com>
> Sent: Thursday, September 11, 2025 4:10 PM
> To: Viren, Brett <bviren AT bnl.gov>
> Cc: NPPS members <Phys-npps-members-l AT lists.bnl.gov>; Qian, Xin
> <xqian AT bnl.gov>; Yu, Haiwang
> <hyu AT bnl.gov>
> Subject: Re: [[Phys-npps-members-l] ] Wed NPPS AI/ML agenda
>
>
>> Emacs+gptel
> Thanks for the tip :-)
>
> On Thu, Sep 11, 2025 at 8:56 AM Brett Viren <bv AT bnl.gov> wrote:
>
> Torre Wenaus <wenaus AT gmail.com> writes:
>
> > Is that what you would call comanage with ollama behind? Or...?
>
> I hadn't heard of COmanage, but reading about it now, it seems to be a
> full-featured "auth backend" from InCommon. My understanding is that
> Keycloak is the approximate equivalent in popular use at BNL.
>
> So, no, what I meant by "frontend" was the user-touching web app that
> intermediates between human and "LLM backend" resources for chat type
> interaction. This UI frontend could (probably should) use COmanage or
> Keycloak as an "auth backend".
>
> I think using Open-WebUI as that UI frontend is a reasonable thing to
> consider if the security and auth issues can be sorted. I do wonder how
> Open-WebUI copes when more users than LLM resources hit it.
>
> > Today has not been very positive to Open-WebUI!
>
> Perhaps I need to disclose some caveats of my experience with
> Open-WebUI. My usage was almost least-effort:
>
> $ uv run open-webui serve
>
> I did go a bit further to try to make it preserve its configuration
> after a restart, including the unique admin password I would set (and
> reset...). I was only partly successful. Sometimes it held and
> sometimes it disappeared. I gave up on that problem and lived with the
> race between a restart and first login. Since it was exposed only to
> localhost, it was a safe race to run. I'd shutdown the Open-WebUI
> server when I finished each chat session.
>
> Presumably, paying more attention could overcome these problems. Yet
> more effort would be needed to use an auth backend.
>
> The other caveat is that I've stopped using Open-WebUI for the past few
> months. It is possible that today's version is better behaved.
>
> The combination of the free Gemini API and Emacs+gptel has fully
> spoiled me away from Open-WebUI entirely and also reduced my use of
> Ollama. But clearly that's not a solution for Ask PanDA! :)
>
> -Brett.
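The --watch-files workflow quoted above can be sketched end to end. The file name and comment below are illustrative; in a live session it is aider (not the grep shown here) that reacts to the "AI!" marker when the file is saved:

```shell
# Write a script containing an "AI!" trigger comment. With
#   aider --no-git --watch-files demo.sh
# running, saving this file from your editor would prompt aider to
# insert the requested function.
cat > demo.sh <<'EOF'
#!/bin/bash
# Define a function to print out the usage help AI!
EOF

# Here we only confirm the trigger comment is in place:
grep -c 'AI!' demo.sh
```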
Attachment:
signature.asc
Description: PGP signature