Systems Architecture - The Empirical Way
Abstract Architectures to 'Optimal' Systems

It is a profound dislocation to have reality replaced by models - that revolution is won, that blood spilt. The first radical changes in global companies that I have witnessed, attributable to the systems engineering and architecture Virtual System Prototype inflection point, occurred in 2004. In these instances, the whole assumed order of architecture in the engineering process was tipped upside-down ...

Research Publication, 5th ACM Intl. Conference on Embedded Software

Profiles in Power: Optimizing Real-Time Systems for Power
Optimizing systems with complex objective functions is not intuitive. Complex tradeoffs between hardware structure and the software and algorithms that are executed on the hardware cannot be done by ratiocination or formal analysis alone ...

Workshop Publication - Power Aware Real-Time Computing Workshop


Optimizing Complex Real-time Systems

The point of competent architecture is to craft an optimum outcome at the outset of the engineering process, not a sub-optimal back-end fix-up after a verification failure.

The Process of Optimizing an Architecture

The optimization process is driven by a next-candidate architecture selection mechanism and an objective function that can be computed for each candidate architecture. The result of each objective-function evaluation is compared with earlier results to determine whether the current architecture is better or worse than previous candidates. Depending on the selection mechanism, the optimization process terminates either when the best possible candidate architecture is found or when the best candidate found so far is selected.
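The loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not EST's implementation: the objective function here is a toy trade-off between ECU hardware cost and communication latency, and the candidate selector is a simple random step.

```python
import random

def optimize(initial, next_candidate, objective, budget=1000):
    """Generic loop: evaluate an objective function for each candidate
    architecture and keep the best candidate seen so far."""
    best = initial
    best_score = objective(best)
    for _ in range(budget):
        candidate = next_candidate(best)      # next-candidate selection
        score = objective(candidate)
        if score > best_score:                # higher figure of merit wins
            best, best_score = candidate, score
    return best, best_score

# Toy objective (illustrative numbers): choose an ECU count that
# balances hardware cost against communication latency.
def figure_of_merit(n_ecus):
    cost = 2.0 * n_ecus           # cost grows with ECU count
    latency = 100.0 / n_ecus      # consolidation shortens communication paths
    return -(cost + latency)      # negate so that higher is better

best, score = optimize(
    initial=50,
    next_candidate=lambda n: max(1, n + random.choice([-5, -1, 1, 5])),
    objective=figure_of_merit,
)
```

With a convex trade-off like this one, the loop settles on a small ECU count rather than the 50 it started from, which is exactly the kind of non-obvious answer the text argues cannot be reached by ratiocination alone.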

Unoptimized Vehicle Control Architectures

The current state of the art in building vehicle control architectures for luxury automobiles yields 50-80 electronic control units (ECUs), each with a microprocessor of varying capability, cooperating over 4-6 communication networks. Yet no one can say whether an optimal vehicle control architecture should have 80 ECUs or 10, communicating through 1 network or 10. That such rough engineering has survived so long, in the face of desirable optimization constraints involving cost, performance, reliability, fault tolerance, safety and liability, is rather remarkable in a cost-fixated industry like automotive engineering and production. It is explained, in part, by the absence of any formal or empirical methodology sufficient to optimize even a 20-ECU VCA system - until now!

EST's approach to resolving this problem

[Figure: EST - Various Optimization Surfaces]

EST's approach to this problem employs an empirical process that uses accurate models to provide quantitative data resulting from stimulation. EST's ultra-high-performance simulation capability enables the execution of a very large number of complex simulations, designed to exercise various measurable aspects of an architecture, in a very short amount of time. The measurements taken are used by the objective function to produce a figure of merit for the architecture, which can be used at any given time to identify an optimal architecture. The selection of a good next candidate architecture for stimulating, simulating and measuring is guided by the Design of Experiments (DoE) statistical methodology. DoE selection enables traversal of the design space according to the principle that each candidate chosen should produce the best next outcome, as determined by the objective function.
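The simplest DoE plan, a full factorial design, can be sketched as follows. The factor names, value ranges and objective function are illustrative assumptions, not EST data; each "run" here stands in for a simulation that would, in practice, stimulate a candidate architecture and measure its behavior.

```python
from itertools import product

# Hypothetical design factors for a vehicle control architecture
# (the names and ranges are illustrative, not EST data).
factors = {
    "n_ecus":     [10, 20, 40, 80],
    "n_networks": [1, 2, 4, 6],
}

def figure_of_merit(n_ecus, n_networks):
    """Toy objective combining measurements a simulation run might yield."""
    cost = 2.0 * n_ecus + 5.0 * n_networks
    latency = 100.0 / n_ecus + 20.0 / n_networks
    return -(cost + latency)      # higher figure of merit is better

# Full factorial design: stimulate, simulate and measure every
# factor combination, then rank candidates by the objective function.
runs = sorted(
    ((figure_of_merit(e, n), e, n)
     for e, n in product(factors["n_ecus"], factors["n_networks"])),
    reverse=True,
)
best_score, best_ecus, best_networks = runs[0]
```

A full factorial design evaluates every combination, which is tractable only for small factor sets; fractional or adaptive DoE strategies cover larger design spaces with far fewer simulation runs.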

The empirical optimization process is also part of the verification process. A partial architecture is subjected to normal, boundary and corner-case tests in order to produce an optimal architecture, and the appropriate tests for partial architectures are accumulated and re-run on each newly selected candidate architecture. The final aggregation of tests forms the very valuable core of the verification suite. Note that the verification tests are produced as part of the initial architecture specification and also during design; they should never be written in a separate verification phase, as the "V" process espouses.
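The accumulate-and-replay pattern described above can be sketched as follows. The class and method names are illustrative, not EST's API: each recorded stimulus-and-expected-response pair from a partial architecture is replayed against every later candidate.

```python
class TestArchive:
    """Sketch (names are illustrative): accumulate the stimuli applied
    to partial architectures so they can be replayed as verification
    tests on each newly selected candidate architecture."""

    def __init__(self):
        self._suite = []                      # (stimulus, expected) pairs

    def record(self, stimulus, expected):
        """Store a normal, boundary or corner-case test for later replay."""
        self._suite.append((stimulus, expected))

    def replay(self, evaluate):
        """Run the accumulated suite against a candidate's response
        function; True means the candidate passes every recorded test."""
        return all(evaluate(s) == e for s, e in self._suite)

# Toy usage: record tests during optimization, replay on a new candidate.
archive = TestArchive()
archive.record(2, 4)      # normal case
archive.record(0, 0)      # boundary case
passes = archive.replay(lambda x: 2 * x)
```

Because the suite only ever grows, each new candidate is checked against everything learned from its predecessors, which is how the final aggregation becomes the core of the verification tests.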

The EST approach to architecture and verification is the exact reverse of the current practice of hope, build, pray, then verify – in which no optimal result is ever demonstrable.

Traversing the design space

This typically means iteratively mapping elements of an abstract architectural model onto progressively closer physical models, which can then be replaced directly by physical facsimiles. This topic will be the subject of a future technology page on our web site.

Benefits of the EST approach

EST's breakthrough modeling and analysis technologies, and mapping and optimization methodologies, are set to revolutionize:

  • Markets that require complex real-time control systems that are safe, reliable and low-cost, with defined performance in keeping with customer expectations, and

  • Companies that produce such systems by guaranteeing low liability and sufficient profitability to satisfy the R&D requirements of the next generation of complex control systems and fair returns to shareholders.

A large number of the necessary verification tests are created during the empirical architecture process. This reinforces verification as a process strand entwined in the bundle of strands that weaves the architecture, design and development phases into a coherent process.

The Future – Traffic Architecture Optimization

Imagine optimizing traffic. A cluster of intercommunicating Vehicle Control Architectures that also communicate with traffic control and monitoring infrastructure may involve more than 1,000 ECUs interconnected by 100 or more communication fabrics. The questions become: What is a traffic architecture? What is an appropriate objective function with which to optimize it? Is such a huge problem even tractable? This presages another topic for a future EST Solutions web page.