Philosophical Society of Washington

Minutes of the 2161st Meeting


Speaker: Frederica Darema, National Science Foundation
Topic: “Dynamic Data Driven Application Systems”

The 2161st meeting of the Philosophical Society of Washington was called to order in the Powell Auditorium of the Cosmos Club at 8:20 PM. President Haapala was in the chair. The president introduced the speaker for the evening, Dr. Frederica Darema, Senior Science and Technology Advisor to the Computer and Information Science and Engineering Directorate, National Science Foundation. Dr. Darema is also Director of the Next Generation Software Program. Dr. Darema received her B.S. degree from the School of Physics and Mathematics at the University of Athens, Greece, and M.S. and Ph.D. degrees in Theoretical Nuclear Physics from the Illinois Institute of Technology and U.C. Davis, respectively. She was a Fulbright Scholar and a Distinguished Scholar. In 1984 she proposed the computational model used for programming today's massively parallel and distributed computing systems.


Dynamic Data Driven Application Systems (DDDAS) are application simulations that can accept and respond dynamically to new data injected at execution time, and/or have the ability to control measurement processes. DDDAS represents a new paradigm for applications, simulations, and measurement methods. Another title that conveys the notion of the DDDAS paradigm is Symbiotic Measurement and Simulation Systems.

Computer simulations start with a theoretical model of the system, an abstract conceptual description of the physical or other system, such as a process management system. The theoretical models are expressed in a mathematical representation, and these mathematical expressions are, in turn, coded into computer programs; that is the application or simulation software. Measurements are also made of properties or characteristics of the system. These measurement data are used as inputs to the computer programs, and the application executes and produces results that characterize the system under study. A new instantiation of the application's execution can be generated with a new input data set, and the process executes to simulate the behavior of the system under these new conditions. A simulation is validated if the output of the computer model compares closely with observations of the real system. The mathematical functions can be improved or fine-tuned to bring the results of the simulation closer to those of the external system. This is the iterative approach to modeling a physical system. The ability to inject new data while the simulation is running (executing) can result in improvements in the application's simulation capabilities. The data dynamically injected at execution time could be data collected before the simulation model was run, or data collected in real time, while the application is executing.
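To make the idea concrete, the following is a minimal sketch, in Python, of a simulation loop that drains newly injected measurements at each step and folds them into its running state. The model, the assimilation rule, and all names here are illustrative assumptions, not taken from any particular DDDAS application.

```python
import queue

# Illustrative sketch only: a simulation loop that accepts data injected while
# it is executing. The dynamics and assimilation rule are placeholders.

injected = queue.Queue()                # measurements arriving at execution time

def step(state, dt=0.1):
    """Advance the model state one time step (placeholder dynamics)."""
    return state + dt * (1.0 - state)   # simple relaxation toward 1.0

def assimilate(state, measurement, weight=0.5):
    """Blend a newly injected measurement into the running state."""
    return (1.0 - weight) * state + weight * measurement

def run(n_steps=100):
    state = 0.0
    for _ in range(n_steps):
        state = step(state)
        # Drain whatever data arrived since the last step and fold it in.
        while not injected.empty():
            state = assimilate(state, injected.get())
    return state

# A measurement process (or a pre-collected archive) pushes data into the
# running simulation through the queue:
injected.put(0.8)
print(run())
```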

This ability to add data into a running simulation can create more accurate simulations of complex events. Real-time data collection is in use today but is still often subject to post-processing analysis. A DDDAS application running an analysis could find that an area of interest in the simulation can be more accurately described or analyzed with these additional data. Conversely, the simulation can be used to control which measurements are collected, resulting in a more efficient measurement process. For example, in an application using sensor data, the DDDAS approach can be used to reduce the sampling rate of sensors that are not of interest and increase the sampling rate of sensors that are, leading to greater data collection efficiency.
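The reverse direction, the simulation steering the measurement process, might look roughly like the sketch below; the sensor names, interest scores, and rate threshold are invented for illustration.

```python
# Illustrative sketch only: the simulation adjusts per-sensor sampling rates
# based on which regions it currently considers interesting.

def update_sampling_rates(rates, interest, low=0.1, high=10.0, threshold=0.5):
    """Raise the rate (Hz) of sensors covering regions the simulation marks as
    interesting, and lower the rest, for a more efficient measurement process."""
    return {name: (high if interest.get(name, 0.0) > threshold else low)
            for name in rates}

sensors = {"s1": 1.0, "s2": 1.0, "s3": 1.0}        # current sampling rates in Hz
# Suppose the running simulation estimates that the region covered by s2 is
# where the interesting dynamics are:
interest_from_simulation = {"s1": 0.1, "s2": 0.9, "s3": 0.2}
print(update_sampling_rates(sensors, interest_from_simulation))
# -> {'s1': 0.1, 's2': 10.0, 's3': 0.1}
```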

There are many challenges in enabling DDDAS capabilities. There are new capabilities that need to be developed at the application level, for example the ability of the application to interface with measurement systems. The streamed data might also introduce new modalities for describing the system, such as a different level of the physics involved, in cases where the analysis is about a physical system. That in turn requires the ability to dynamically select the application components depending on the dynamically streamed data. One also needs application algorithms that are amenable and stable with respect to dynamically injected data. Furthermore, the systems support requirements for these kinds of environments, where the application requirements change during execution depending on the streamed data, go beyond the needs of present application execution. For example, it becomes more difficult to schedule an application, since the memory requirements or the execution duration may not be known in advance because they are adjusted dynamically.

Some examples of applications which would potentially benefit from DDDAS were then presented.

One of the technical challenges in designing DDDAS software is the need to dynamically switch software components and linked libraries while a program is executing, without stopping the running application. The computer algorithms must be tolerant of perturbations from dynamically injected data and able to handle data uncertainties. DDDAS will employ an extended spectrum of hardware platforms, from the computers where the application executes to the measurement systems, and the networks that connect them and transfer the data.
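A self-contained toy example of switching components in response to injected data is sketched below; both "solvers" are stand-ins defined in the example, and a real system would also have to relink libraries and hand the solver state over safely.

```python
# Illustrative sketch only: swapping a software component while the application
# keeps running, driven by the characteristics of the injected data.

def coarse_solver(state, packet):
    return state + packet["value"]                 # cheap, low-fidelity update

def fine_solver(state, packet):
    return state + 0.5 * packet["value"] ** 2      # costlier, higher-fidelity update

SOLVERS = {"low": coarse_solver, "high": fine_solver}

def run(stream):
    state, solver = 0.0, SOLVERS["low"]
    for packet in stream:
        # Dynamically injected data may call for a different component:
        solver = SOLVERS.get(packet.get("resolution", "low"), solver)
        state = solver(state, packet)
    return state

print(run([{"value": 1.0},
           {"value": 2.0, "resolution": "high"},
           {"value": 1.0, "resolution": "low"}]))   # 1.0 + 2.0 + 1.0 = 4.0
```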

We are now moving beyond parallel computing into GRID computing, with processing systems consisting of diverse components, such as CPUs distributed across various locations, and other entities devoted to data collection, also most likely in various geographical locations. Modern cars have over 1,000 sensors: specialized microprocessors continuously measuring various states such as air flow, tire pressure, wheel rotation, etc. Some of these do real-time comparisons (e.g., anti-lock brakes) while others simply read and report a single measure. In a GRID computing system, different CPUs would engage in data acquisition, advanced data visualization, data analysis and synthesis, or large-scale database storage and retrieval. All these diverse resources are brought together to solve one single application. National research laboratories are using GRID computing now, and companies such as IBM are investigating its commercial potential. We now have the requisite computing capability, RAM capacity, and computing speed. We have developed some of the software expertise and experience. The time to embark on developing Dynamic Data Driven Application Systems is now.

Dr. Darema then proceeded to list numerous examples of the applications and advantages of DDDAS, a few of which follow.

In conclusion, Dr. Darema stressed the value of strengthened academic and industry collaboration, improved technology transfer, and the need for incentives from federal agencies.



The speaker then kindly agreed to answer questions, some of which included:

Q- Could DDDAS be used to enhance our ability to discriminate objects from aerial observations like radar or satellite data?

A- Yes, it can be used to enhance existing methods for imaging by eliminating noise and refining the analysis of the data in regions of interest, resulting in more accurate object recognition, for example when one wants to discriminate a tank from a school bus.

Q- The new data might require new modes of describing the system. The heart of the issue is: how is this done dynamically?

A- This is the challenge. Let's consider an example: the case of the crack propagation I mentioned. The hot fluid in the tube and the temperature gradients can cause stresses that cause the crack. Sensors detect the temperature gradients and the stresses, and these measurements can be injected into the ongoing finite element simulation to refine the prediction of the potential onset or extent of a crack. It can be the case that, to enhance the prediction accuracy, these data need to be further combined, for example with molecular dynamics calculations, to derive some of the parameters needed in the FEM. See http://www.dddas.org for more examples and details.
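As a purely illustrative sketch of that answer, the following toy fragment injects streamed temperature readings into a crude thermal-stress estimate and flags potential crack onset; the material constants and the risk rule are assumed values, standing in for the finite element (and molecular-dynamics-derived) calculations described above.

```python
# Illustrative sketch only: sensor temperatures injected mid-run drive a crude
# thermal-stress estimate that flags potential crack onset. Constants are assumed.

ALPHA, E, STRESS_LIMIT = 1.2e-5, 200e9, 2.5e8   # expansion coeff. (1/K), modulus (Pa), limit (Pa)

def thermal_stress(temps):
    """Crude stress estimate from the largest temperature difference (K)."""
    return ALPHA * E * (max(temps) - min(temps))

def crack_risk(temperature_stream):
    for temps in temperature_stream:            # each item: sensor readings injected mid-run
        sigma = thermal_stress(temps)
        yield sigma, sigma > STRESS_LIMIT       # flag potential crack onset

for sigma, at_risk in crack_risk([[300, 340], [300, 420]]):
    print(f"stress ~ {sigma:.2e} Pa, crack risk: {at_risk}")
```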

Q- Is this an expert system?

A- No. Expert systems have the ability to learn. DDDAS can request and use additional input data which it did not have at the time the application execution was started, but this is not learning in the expert systems sense.

Q- How is the data represented?

A- That depends on the application and on the measurements. Data representation is a challenge in enabling DDDAS, and it goes beyond the current interoperability issues, because the data models of the applications are not necessarily compatible with the data collected by the measurements, especially where the measurements can be collected by many modes and systems and are geographically dispersed. It is a challenge; GRID computing, for example, also shows that one does not always have all the needed bandwidth, so there is the additional complexity of compressing the data or re-mapping the application.
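One rough illustration of the bandwidth point: when the link to the simulation is slow, the measurement side might ship a compressed (here, simply decimated) representation of the data instead of the raw samples. The threshold and decimation factor below are arbitrary choices for the example.

```python
# Illustrative sketch only: decimate the measurement payload when bandwidth is low.

def prepare_payload(samples, bandwidth_mbps, threshold=10.0, factor=10):
    if bandwidth_mbps >= threshold:
        return samples                 # enough bandwidth: ship the raw samples
    return samples[::factor]           # otherwise keep every 'factor'-th sample

raw = list(range(1000))
print(len(prepare_payload(raw, bandwidth_mbps=50.0)))   # 1000 samples
print(len(prepare_payload(raw, bandwidth_mbps=2.0)))    # 100 samples
```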

Q- Would this work as a SETI program?

A- It's considerably different. SETI analyzes a batch of events at a time, using the idle time on multiple platforms to distribute chunks of computation. There is no new data injected into an event while it is being processed.

President Haapala thanked the speaker on behalf of the Society and presented her with a one-year complimentary membership. The president then made the usual beverage control and parking announcements, and reminded the audience that although the meetings are open to all, membership dues are the Society's sole source of income. He requested non-members to join and members to consider contributions.

Attendance: 31
Temperature: 28.3° C
Weather: Cloudy and Cool

Respectfully submitted,
 
David F. Bleil
Recording Secretary
