Modern High Energy Physics experiments are essentially subatomic microscopes designed to generate hundreds of thousands of images per second, which are analyzed by a cluster of networked on-line computers. Acquiring these data in real time and distributing them to the on-line computers presents a serious challenge because of the potential for data loss, network delays, and congestion. This project aims to develop a flexible, commercially available, fully programmable instrument with standardized network and data links that mediates the data traffic using its internal storage and processing resources. The module is designed to maximize network throughput and minimize the chance of data loss. The Phase I effort is focused on the design, development, and testing of the module. The module design is based on parallel processing technology and high-speed networking techniques. The module will undergo initial testing at the University of California San Diego. The Phase II effort is aimed at studying the behavior of the Phase I module in an environment representative of a real system. Tests will involve multiple modules, data generators, and a small network of on-line computers. Following system-level tests, the prototype module will be revised and upgraded in preparation for tests in an actual physics experiment.
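To make the mediating role of the module concrete, the sketch below models it as a bounded producer/consumer buffer sitting between a bursty data source and a steady network sender: the producer blocks when the buffer fills rather than dropping events, which is one standard way to trade latency for loss avoidance. This is only an illustration of the buffering principle under assumed names and sizes (Event, BoundedQueue, the 1024-slot capacity, the event counts); it does not represent the actual module's hardware or firmware design.

    // Minimal sketch: a thread-safe bounded queue absorbing bursts from a
    // data source and feeding a sender at its own pace. All identifiers and
    // capacities here are illustrative assumptions, not the module's design.
    #include <condition_variable>
    #include <cstdint>
    #include <iostream>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <vector>

    struct Event {                      // stand-in for one detector "image"
        uint64_t id;
        std::vector<uint8_t> payload;
    };

    class BoundedQueue {                // internal storage mediating the traffic
    public:
        explicit BoundedQueue(size_t capacity) : capacity_(capacity) {}

        // Producer blocks when the buffer is full (backpressure) instead of
        // silently dropping events -- the loss-avoidance idea in the abstract.
        void push(Event ev) {
            std::unique_lock<std::mutex> lk(m_);
            not_full_.wait(lk, [&] { return q_.size() < capacity_; });
            q_.push(std::move(ev));
            not_empty_.notify_one();
        }

        Event pop() {
            std::unique_lock<std::mutex> lk(m_);
            not_empty_.wait(lk, [&] { return !q_.empty(); });
            Event ev = std::move(q_.front());
            q_.pop();
            not_full_.notify_one();
            return ev;
        }

    private:
        size_t capacity_;
        std::queue<Event> q_;
        std::mutex m_;
        std::condition_variable not_empty_, not_full_;
    };

    int main() {
        BoundedQueue buffer(1024);      // assumed capacity; real sizing depends on burst profile

        // Readout thread: models the experiment generating events.
        std::thread readout([&] {
            for (uint64_t i = 0; i < 10000; ++i)
                buffer.push(Event{i, std::vector<uint8_t>(512)});
        });

        // Sender thread: models forwarding events to an on-line computer.
        std::thread sender([&] {
            for (uint64_t i = 0; i < 10000; ++i) {
                Event ev = buffer.pop();
                if (ev.id % 1000 == 0)
                    std::cout << "forwarded event " << ev.id << "\n";
            }
        });

        readout.join();
        sender.join();
        return 0;
    }

In a real system the buffer would be sized from the measured burst profile of the experiment, and the sender side would additionally pace its output to avoid the network congestion the abstract mentions.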
Commercial Applications and Other Benefits as described by the awardee: A high-performance, low-cost, industrial-grade module optimized for networked image storage/retrieval and processing at high data rates is an innovative new product. The Large Hadron Collider (LHC) experiments at the European Center for Nuclear Research (CERN) could utilize thousands of units. Commercial applications of the module involve the transfer of large volumes or multiple streams of images, as in medical imaging, machine vision, and video server applications.