ID: 63274.0, MPI für Kernphysik / Heavy Flavour Physics
DIRAC - Distributed Infrastructure with Remote Agent Control
Authors: van Herwijnen, Eric; Closier, Joel; Frank, Markus; Gaspar, Clara; Loverre, Francoise; Ponce, Sebastien; Graciani Diaz, Roberto; Galli, Domenico; Marconi, Umberto; Vagnoni, Vincenzo; Brook, Nicholas; Buckley, A.; Harrison, K.; Schmelling, Michael; Egede, Ulrik; Tsaregorotsev, Andrei; Garonne, V.; Bogdanchikov, B.; Korolko, Ivan; Washbrook, A.; Palacios, Juan P.; Klous, Sander; Saborido, Juan J.; Khan, Akram; Pickford, A.; Soroko, A.; Romanovski, V.; Patrick, G.N.; Kuznetsov, Genady; Gandelman, Miriam
Research Context: LHCb
Date of Publication (YYYY-MM-DD): 2003
Sequence Number: TUAT006
Name of Conference/Meeting: 2003 Conference for Computing in High-Energy and Nuclear Physics (CHEP 03)
Place of Conference/Meeting: La Jolla, California, USA
(Start) Date of Conference/Meeting:
End Date of Conference/Meeting:
Review Status: Internal review
Audience: Experts Only
Abstract / Description: This paper describes DIRAC, the LHCb Monte Carlo production system. DIRAC has a client/server architecture based on:
- Compute elements distributed among the collaborating institutes;
- Databases for production management, bookkeeping (the metadata catalogue) and software configuration;
- Monitoring and cataloguing services for updating and accessing the databases.
Locally installed software agents implemented in Python monitor the local batch queue, interrogate the production database for any outstanding production requests using the XML-RPC protocol, and initiate job submission. The agent checks and, if necessary, automatically installs any required software. After the job has processed the events, the agent transfers the output data and updates the metadata catalogue. DIRAC has been successfully installed at 18 collaborating institutes, including the DataGrid, and has been used in recent Physics Data Challenges. In the near to medium term we must use a mixed environment with different types of grid middleware, or with no middleware at all. We describe how this flexibility has been achieved and how ubiquitously available grid middleware would improve DIRAC.
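The polling cycle described in the abstract — an agent asking the production database for outstanding requests over XML-RPC and handing each one to the local batch system — can be sketched in Python with the standard library. This is a minimal illustration, not DIRAC's actual API: the endpoint URL and the method name `getOutstandingRequests` are hypothetical, and job submission is passed in as a callable so the sketch stays independent of any particular batch system.

```python
import xmlrpc.client

# Hypothetical production-service endpoint; DIRAC's real one differs.
PRODUCTION_DB_URL = "http://prod-db.example.org:8080/RPC2"

def fetch_outstanding_requests(proxy):
    """Ask the production service for outstanding production requests.

    `proxy` is an xmlrpc.client.ServerProxy (or any object exposing the
    same method). `getOutstandingRequests` is an illustrative method name.
    """
    return proxy.getOutstandingRequests()

def run_agent_cycle(proxy, submit_job):
    """One polling cycle: fetch outstanding requests and submit a job for each."""
    submitted = []
    for request in fetch_outstanding_requests(proxy):
        submit_job(request)  # hand the request to the local batch system
        submitted.append(request)
    return submitted

if __name__ == "__main__":
    # Constructing the proxy does not connect; calls require a live service.
    proxy = xmlrpc.client.ServerProxy(PRODUCTION_DB_URL)
    # run_agent_cycle(proxy, local_batch_submit)
```

In the real system such a cycle would run repeatedly from the locally installed agent, with the software-installation check and the post-job data transfer and catalogue update wrapped around it.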
External Publication Status: published
Document Type: Conference-Paper
Affiliations: MPI für Kernphysik/Group W. Hofmann/Heavy Flavour Physics (K. T. Knöpfle, M. Schmelling)
External Affiliations: CERN; Barcelona; Bologna; Bristol; Cambridge; GridKA, Karlsruhe; Imperial College, London; IN2P3 Marseille; INP Novosibirsk; ITEP Moscow; Liverpool; NIKHEF and Vrije Universiteit Amsterdam; Santiago de Compostela; ScotGrid Edinburgh; ScotGrid Glasgow; Oxford; IHEP Protvino; RAL; UFRJ, Rio de Janeiro.
Full Text:
2003_CHEP.pdf [694 KB] [Comment: preprint version of proceedings]