- From: Paul Nilsson <Paul.Nilsson AT cern.ch>
- To: "mkirby AT bnl.gov" <mkirby AT bnl.gov>, Torre Wenaus <wenaus AT gmail.com>
- Cc: NPPS leadership team <Phys-npps-mgmt-l AT lists.bnl.gov>, "tmaeno AT bnl.gov" <tmaeno AT bnl.gov>, Michel Hernandez Villanueva <mhernande1 AT bnl.gov>
- Subject: Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals
- Date: Tue, 15 Apr 2025 16:03:00 +0000
Hi,
A few ideas: In principle, the “Ask PanDA” tool I’m working on could be turned into an “intelligent agent” using MCP. The new version is plug-in based anyway, so switching to a technology that essentially does that for you is preferable. However, as far as I understand, MCP doesn’t support all models yet, and some are only available via third parties (or even wrappers), as seems to be the case for Llama, which might still be a bit experimental. A limiting factor is also that we don’t have access to that many GPUs, unless we go down the commercial road. I’m just getting set up at SLAC to start testing Llama using their GPUs (two, I believe, i.e. the same situation as at NERSC and BNL). As for CERN, they seem to be planning an LLM as a service, but that appears to be at a very early planning stage.
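To make the plug-in idea a bit more concrete, here is a minimal sketch of what an “Ask PanDA” plug-in could look like as an MCP server, assuming the official MCP Python SDK (FastMCP); the monitor URL, API path, and job-status tool below are purely hypothetical placeholders, not the real Ask PanDA code:

# Minimal sketch of an "Ask PanDA" MCP server plug-in.
# Assumes the official MCP Python SDK (pip install mcp); the monitor URL
# and the job-status query below are hypothetical placeholders.
import json
import urllib.request

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ask-panda")

PANDA_MONITOR = "https://bigpanda.cern.ch"  # placeholder base URL

@mcp.tool()
def get_job_status(panda_id: int) -> str:
    """Return the current status of a PanDA job as a short JSON string."""
    # Hypothetical monitor query; the real API path and fields may differ.
    url = f"{PANDA_MONITOR}/job?pandaid={panda_id}&json"
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = json.load(resp)
    job = data.get("job", {})
    return json.dumps({
        "pandaid": panda_id,
        "status": job.get("jobstatus"),
        "site": job.get("computingsite"),
    })

if __name__ == "__main__":
    # Serve over stdio so an MCP-capable client can discover and call the tool.
    mcp.run()

The same skeleton would carry additional tools (queue status, error lookups), which is essentially the plug-in switch described above.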
I think it’s good to get some experience with MCP technology, and it could be developed into something more than a chatbot. Personally, I’m actually more interested in log analysis than in chatbots (we are going to get those for free, so to speak), and we also have some ideas about the PanDA monitor related to error messages, what they mean, and so on. A worrying factor is, as I said, the limited number of available GPUs. Two GPUs give only a handful of tokens per second at SLAC, for example, so that is hardly production-ready. But we need to start somewhere.
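As a rough illustration of the error-message idea, a hypothetical MCP tool could map pilot error codes to short explanations that an agent surfaces in the monitor; the codes and hints below are placeholders, not the real PanDA/pilot error table:

# Sketch of an error-explanation tool on a FastMCP server like the one above.
# The code/description mapping is illustrative only, not the real error table.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("panda-errors")

ERROR_HINTS = {
    1001: "Placeholder: payload produced no output files; check the log tarball.",
    1002: "Placeholder: payload exceeded the allowed wall time on the worker node.",
}

@mcp.tool()
def explain_error(error_code: int) -> str:
    """Return a short, human-readable hint for a pilot error code."""
    return ERROR_HINTS.get(
        error_code,
        f"No local hint for code {error_code}; an LLM pass over the pilot log would be the next step.",
    )

if __name__ == "__main__":
    mcp.run()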
Cheers, Paul
From: Kirby, Michael <mkirby AT bnl.gov>
Hi Torre,
The idea of having something like MCP cooked into services definitely seems like it would have significant advantages, since it makes connecting a “data source” an easy plug-in. If I understand correctly, MCP acts as a standardized interface for getting real-time data into an AI client? So for PanDA, you could use MCP to slurp in the current data on running jobs and queues, and then let a user ask “how soon will my jobs be finished?”, and Claude will tell them whether they should get a coffee, go to dinner, or go on vacation? That’s maybe oversimplified, but it’s absolutely a useful type of thing that would have a huge impact on user experience and/or operational decisions.
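The “coffee, dinner, or vacation” question boils down to a time estimate computed from the job and queue data an MCP tool would expose. A toy illustration, with made-up numbers and a deliberately naive model (no real PanDA queue data involved):

# Toy estimate of "how soon will my jobs be finished?" from queue data an
# MCP tool could expose. Numbers and model are made up for illustration.
def estimate_wait_hours(jobs_left: int, running_slots: int,
                        avg_job_hours: float) -> float:
    """Naive estimate: remaining jobs drained at the current slot count."""
    if running_slots <= 0:
        return float("inf")
    return (jobs_left / running_slots) * avg_job_hours

def advice(hours: float) -> str:
    """Translate an estimate into the coffee/dinner/vacation scale."""
    if hours < 1:
        return "get a coffee"
    if hours < 8:
        return "go to dinner"
    return "go on vacation"

# e.g. 400 jobs left, 50 running slots, ~1.5 h per job -> 12 h -> vacation
print(advice(estimate_wait_hours(400, 50, 1.5)))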
Cheers, Kirby
Michael Kirby (he/him/his), Senior Physicist
- [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/14/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Kirby, Michael, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Paul Nilsson, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Paul Nilsson, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Paul Nilsson, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Torre Wenaus, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Paul Nilsson, 04/15/2025
- Re: [Phys-npps-mgmt-l] Fwd: "Shovel ready AI" proposals, Kirby, Michael, 04/15/2025