LDRA Testbed

LDRA Testbed is a set of core static and dynamic analysis engines for both host and embedded software. LDRA Testbed is made by Liverpool Data Research Associates (LDRA). LDRA Testbed provides the means to enforce compliance with coding standards such as MISRA, JSF++ AV, CERT C, and CWE, and provides visibility of software flaws that might typically pass through the standard build and test process to become latent problems. In addition, test-effectiveness feedback is provided through structural coverage analysis reporting facilities, which support the requirements of the DO-178B standard up to and including Level A.

Static analysis
Static analysis initiates LDRA Testbed activity by undertaking lexical and syntactic analysis of the source code for a single file or a complete system.
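The lexical step described above can be illustrated with a minimal sketch that splits C-like source into tokens. The token categories and patterns here are illustrative assumptions, not LDRA's actual implementation, and a real analyser would go on to build a full parse tree.

```python
import re

# Illustrative token categories for a C-like language (an assumption,
# not LDRA's actual token set).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),             # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),    # identifiers and keywords
    ("OP",     r"[+\-*/=<>!&|]+"),  # operators
    ("PUNCT",  r"[(){};,]"),        # punctuation
    ("SKIP",   r"\s+"),             # whitespace, discarded
]
TOKEN_RE = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def tokenize(source):
    """Return (category, text) pairs for each non-whitespace token."""
    return [(m.lastgroup, m.group()) for m in TOKEN_RE.finditer(source)
            if m.lastgroup != "SKIP"]
```

Syntactic analysis would then check that the resulting token stream conforms to the language grammar, for a single file or across a whole system.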

Programming standards checking
The enforcement of programming standards (or coding standards) is commonly regarded as good practice. Adherence to such standards can be automatically checked by products like LDRA Testbed. Main Static Analysis searches the source code for any programming standards violations, checking the source files against the superset of coding rules supplied with LDRA Testbed.

This system can be configured with:
 * User-definable filters to switch standards on or off.
 * Changes of standards from mandatory to optional, or vice versa.
 * Annotations to switch off standards for specific instances of violations.

LDRA Testbed reports violations of the chosen set of standards both in textual reports and as annotations to graphical displays.
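The idea of checking source files against a configurable rule set can be sketched as follows. The rule identifiers, patterns, and messages here are hypothetical illustrations of pattern-based checking, not LDRA's actual rules, and real checkers work on a parse tree rather than raw text.

```python
import re

# A hypothetical rule set: each rule has an id, a pattern, and a message.
# Switching a rule off (user-definable filtering) would simply mean
# removing it from this list.
RULES = [
    ("no-goto",  re.compile(r"\bgoto\b"),    "goto statement used"),
    ("no-octal", re.compile(r"\b0[0-7]+\b"), "octal constant used"),
]

def check_source(source):
    """Return (line_number, rule_id, message) for each violation found."""
    violations = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for rule_id, pattern, message in RULES:
            if pattern.search(line):
                violations.append((lineno, rule_id, message))
    return violations

# C source embedded as data for the checker to scan.
c_source = """\
int f(int x) {
    if (x == 010)   /* octal constant */
        goto done;
    x += 1;
done:
    return x;
}
"""
```

Each reported violation carries a line number and rule identifier, which is the kind of information a tool can then render either as a textual report or as an annotation on a graphical display.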

Dynamic coverage analysis
Dynamic coverage analysis explores the semantics of the program under test via test data selection. It uses control and data flow models and compares them with the actual control and data flow as the program executes. Dynamic analysis therefore forces the selection of test data that explores the structure of the source code.

The LDRA tool suite includes a dynamic coverage module, which is used to improve software robustness and reliability during both the development and maintenance cycles.
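The comparison of modelled against actual control flow can be sketched by instrumenting decision points and recording which outcomes each test exercises. The function names and decision identifiers below are hypothetical; real tools insert this instrumentation automatically and derive the full set of decision points statically.

```python
# Map from decision id to the set of outcomes (True/False) taken so far.
coverage = {}

def decide(decision_id, outcome):
    """Record the outcome of a decision point, then pass it through."""
    coverage.setdefault(decision_id, set()).add(outcome)
    return outcome

def clamp(x):
    # Each 'if' is a decision point; the ids would be assigned by the
    # instrumenter in a real tool.
    if decide("clamp:1", x < 0):
        return 0
    if decide("clamp:2", x > 100):
        return 100
    return x

def branch_coverage():
    """Fraction of decision outcomes (true and false arms) exercised.
    Note: this sketch only counts decisions that were actually reached;
    a real tool knows all decision points from static analysis, so
    unexecuted decisions also count toward the denominator."""
    taken = sum(len(outcomes) for outcomes in coverage.values())
    return taken / (2 * len(coverage))
```

Running `clamp(-5)` and `clamp(50)` exercises three of the four branch outcomes (75% branch coverage); adding `clamp(200)` completes the set, which is exactly the feedback loop that drives further test data selection.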

Quality report
Quality metrics such as Halstead complexity measures, cyclomatic complexity, and Knots metric are designed to verify that code is clear, maintainable, and testable. The quality report in the LDRA tool suite presents both a summary and a detailed breakdown of quality metrics that are deduced during static analysis.
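As one example of such a metric, McCabe's cyclomatic complexity can be estimated as one plus the number of decision points in a function. The keyword-counting approach below is a simplification for C-like code (real tools build the control-flow graph), and the sample function is a hypothetical illustration.

```python
import re

# Decision points in C-like code: branching keywords plus the
# short-circuit operators, each of which adds a path through the code.
# This keyword list is a simplification, not LDRA's actual method.
DECISION_TOKENS = re.compile(r"\b(?:if|for|while|case)\b|&&|\|\|")

def cyclomatic_complexity(source):
    """Estimate V(G) as 1 + number of decision points."""
    return 1 + len(DECISION_TOKENS.findall(source))

# Two 'if' statements and one '&&' give a complexity of 4.
c_function = """\
int sign(int x) {
    if (x > 0 && x < 1000)
        return 1;
    if (x < 0)
        return -1;
    return 0;
}
"""
```

A high value flags a function as hard to test exhaustively, which is why such metrics appear in both the summary and detailed views of the quality report.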

Alternatives
LDRA's partners in the software testing market include MathWorks, whose Simulink tools integrate with the LDRA tool suite, as well as IBM (Rational Rose and Rational Rhapsody), IAR Embedded Workbench, and Wind River (VxWorks).

LDRA's rivals include AdaTEST, Cantata++, Coverity, Klocwork, Parasoft and VectorCAST.