
Effective Test Driven Development for Embedded Software

Michael J. Karlesky, William I. Bereza, and Carl B. Erickson, PhD

Abstract—Methodologies for effectively managing software development risk and producing quality software are taking hold in the IT industry. However, similar practices for embedded systems, particularly in resource constrained systems, have not yet become prevalent. Today, quality in embedded software is generally tied to platform-specific testing tools geared towards debugging. We present here an integrated collection of concrete concepts and practices that are decoupled from platform-specific tools. In fact, our approach drives the actual design of embedded software. These strategies yield good design, systems that are testable under automation, and a significant reduction in software flaws. Examples from an 8 bit system with 16k of program memory and 255 bytes of RAM illustrate these ideas.

Index Terms—Design for testability, Microprogramming, Software quality, Software testing

I. INTRODUCTION

Spectacular software failures make headlines. The problems plaguing Denver International Airport's automated baggage-handling system in the mid-90's had enough media presence to make a Hollywood star jealous. Software bugs with far less media attention are a regularly occurring reality. Bugs derail business plans, exasperate managers and developers, and adversely impact the bottom lines of companies of all sizes.

Embedded software is a unique specialty within the broader software field. High-level IT systems generally run in clean environments and have little contact with the physical world. While bugs are always costly, bugs in PC software or large enterprise applications can be patched with relative ease. In contrast, flaws within the embedded software of an automobile fuel injection system can cause a massive and expensive recall. Worse still is the very real prospect of loss of human life due to embedded software flaws. The abilities, complexities, and pervasiveness of embedded systems continue to grow, thus growing the possibility and probability of truly expensive software flaws.

Effective methodologies for managing software risk and producing quality software are beginning to take root in industry. For instance, the practices under the umbrella of "Agile Methodologies" are winning converts [1]. Widespread anecdotal evidence and our own experience testify to "Agile" practices producing an order of magnitude or more reduction in bugs over traditional methods. In particular, among these practices, Test Driven Development (TDD) stands out. TDD is counterintuitive; it prescribes that test code be programmed before the functional code those tests exercise is implemented. Practicing TDD means designing software such that it can be tested at any time under automation. Designing for testability in TDD is a higher calling than designing "good" code because testable code is good code.

Traditional testing strategies rarely impact the design of production code, are onerous for developers and testers, and often leave testing to the end of a project where budget and time constraints threaten thorough testing. Test Driven Development systematically inverts these patterns.

Practicing TDD follows these essential steps:
1. Identify a piece of system functionality to implement (a single function or method).
2. Program a test to verify that functionality.
3. Stub out the functional code under test (to allow the test code to compile).
4. Compile; run the test and see it fail.
5. Flesh out the functional code.
6. Compile; run the test.
7. Refactor the functional code.
8. Repeat 6 & 7 until the test passes and the functional code is cleanly implemented.
9. Repeat 1-8 until all features are implemented.

In this paper, we draw from experience with resource-constrained systems that do not enjoy the "luxury" of an operating system or object-oriented language (e.g. C++ or Java). Within this context, practicing TDD has generally been regarded as prohibitively difficult. The direct interaction of programming and hardware as well as limited resources for running test frameworks seem to set a hurdle too high to clear. We shall demonstrate the application of a new software design pattern and a multi-tier strategy for testing that brings the efficacy of TDD to even the lowest-level embedded software (and by extension any embedded system).

While the importance of testing in embedded software is universally recognized, testing approaches, in general, have been tied to specific tools or platforms [2]. What we offer here are concepts decoupled from particular tools and platforms.

Manuscript submitted April 2006. This paper is based on ideas developed and work performed for various clients as part of ongoing contract software development engagements. Michael J. Karlesky, William I. Bereza, and Carl B. Erickson, PhD are all with Atomic Object LLC, 941 Wealthy Street SE, Grand Rapids, MI 49506 USA. voice: (616) 776-6020; fax: (616) 776-6015; email: [email protected].

II. CURRENT STATE OF TESTING IN EMBEDDED SOFTWARE & SHORTCOMINGS

A. Ad-hoc Testing

Experimentation and ad-hoc testing are often performed
during the development process to discover the idiosyncrasies of the system under development. The knowledge gained in these efforts is then applied in the functional source code. With ad-hoc testing, test fixtures and experimentation code used to characterize the system and shape the functional code are usually discarded or shelved. Over time, these resources fall out of step with system development (or no longer exist at all) and become vestigial remnants of the system's evolution. Valuable, executable knowledge (in the form of code) is lost. Time will almost certainly be lost in later stages of development because these tests have not been kept current.

B. Debugging

Embedded software relies far more heavily on specialized debugging and system inspection tools than does high-level software. Most, if not all, of the need for these tools is created by the multivariable equation of hardware and software commingling. Bugs may be due to hardware, software, or both. Thus, finding the source of unintended behavior generally requires more effort than in high-level software.

The existence of sophisticated debugging tools in embedded software creates an interesting side effect. With such advanced debugging tools available (and needed), developers are inclined to design only for debugging and not for true testing. The assumption inherent in design-for-debug is that any and all code is "debuggable." The limitations in this assumption are threefold. First of all, undesired behaviors in a system under development can be due to any number of obscure reasons – often in the least expected places. Relying on design-for-debug is having faith that one is well-capable of finding a needle in a haystack. Secondly, debugging sessions are one-time events. After a bug is found and corrected there is nothing in place to watch that same code and point out undesired code interaction in the future. Finally, relying heavily on debugging rarely enforces good coding practices; debugging can act as a psychological safety net.

C. Final Testing

The traditional "Waterfall" method of software development prescribes a progression of design, build, and test steps. Final testing is planned as the last major stage of development and verification before release to production. Embedded projects, just as high-level software projects, most often follow these same steps.

Testing planned for the conclusion of a project presents two problems. First of all, time constraints and budget limitations usually squeeze final testing into a compressed time period or eliminate it entirely. As such, tests that might prevent costly future problems are sacrificed for the demands of the present day. Secondly, with testing so removed from development, source code is unlikely to have been developed for ease of testing. For example, a simple temperature measurement might be implemented such that a code block contains both an analog-to-digital conversion and the math routines that will produce a final temperature value. On the surface, there is nothing wrong with this approach. In final testing, however, the math of the routine can only be tested by subjecting the entire system to actual temperature variations or by using a special voltage simulation rig. These tests are not necessarily conclusive from an accuracy standpoint and require elaborate test apparatuses. While simulation environments are certainly necessary for aspects of system testing, a temperature chamber is not necessary to verify five lines of math code.

III. TEST DRIVEN DEVELOPMENT

A. Overview

Test Driven Development inverts the traditional software development/test cycle. In TDD, the development cycle is not a progression of writing functional code and then later testing it. Instead, testing drives development. A developer looks for ways to make the system testable, designs accordingly, writes tests and creates testing strategies, and then writes functional code to meet the requirements of the test-spawned design [3].

Testing takes different forms. At the highest levels (e.g. integration and system testing) full automation is unusual. At the lowest level, TDD prescribes fully automated unit testing. In automated unit testing, a developer first writes a unit test (a test that validates correct operation of a single module of source code, for instance, a function or method [4]) and then implements the complementary functional code. With each system feature tackled, unit test code is added to an automated test suite. Full regression tests can take place all the time.

Further higher-level testing will complement these code-level tests. Whether this testing is integration or system testing, it will generally follow the patterns of traditional software verification. Ideally, it will also include some measure of automation.

Automated unit tests catch the majority of bugs at a low level and leave for the human mind difficult testing issues like timing collisions or unexpected sub-system interactions.

TDD provides several clear benefits:
1. Code is always tested.
2. Testing drives the design of the code. As a side effect, the code tends to be improved because of the decoupling necessary to create testable code.
3. The system grows organically as more knowledge of the system is gained.
4. The knowledge of the system is captured in tests; the tests are "living" documentation.
5. Developers can add new features or alter existing code with confidence that automated regression testing will reveal failures and unexpected results.

B. Particular Advantage of TDD in Embedded Software

In the context of embedded software, TDD provides a further advantage beyond those already listed. Because of the variability of hardware and software during development, bugs are due to hardware, software, or a combination of the two. With TDD, software bugs can be eliminated to such a degree that it becomes far easier to pinpoint, by process of elimination, the source of unexpected system behavior (i.e. hardware versus software).
IV. UNIT TESTING IN EMBEDDED SOFTWARE

A. Model-Conductor-Hardware Design Pattern

Design patterns are documented approaches to solving commonly occurring problems in the realm of software development. In high-level languages, a multitude of patterns exist that address common situations in elegant and language-independent ways [5].

When considering the challenges of applying TDD to embedded software, the immediately apparent difficulty is the hardware. If tests are to be run in an automated fashion, then hardware functions must be automated as well. Initially, this seems to be a task too complicated to implement in a cost-effective manner. Simulating hardware events in a comprehensive manner can require complicated platform-specific tools or elaborate programmable hardware test fixtures. However, using existing design patterns as inspiration, another approach is available.

In a Graphical User Interface (GUI), we find a situation akin to embedded software. Namely, in GUI programming there is a tendency to mix functional logic with event handling external to that logic. In this analogy, the asynchronous events and programming interfaces of on-screen widgets are similar to external interrupts and hardware registers of embedded systems.

Software developers have created two patterns to address the problems inherent in architecting good GUI applications. The Model-View-Presenter (MVP) and Model-View-Controller (MVC) design patterns effectively and cleanly separate widget event handling from flow control logic [6]. As a side effect, the separation these patterns provide allows for automated testing of GUI presentation code free from a user clicking upon actual widgets.

In MVP, the View represents a very thin wrapper around a collection of widgets comprising a GUI. The Model saves state external to the widgets and interfaces with programming functions elsewhere in the system not directly related to the GUI. The Presenter references both the Model and View and embodies the presentation logic necessary to process events from the GUI's widgets and change state on those widgets. With this separation, a View can be "mocked", or simulated. A mocked View, under automated test control, can emit events and receive calls from the Presenter. Similarly, the Model can be mocked. In this way, the presentation logic of the Presenter can be thoroughly tested in an automated fashion separate from an on-screen GUI [7]. It is assumed that the widgets of the View are already well-tested by system vendors. Tests for the Presenter and Model are created and added to the automated test suite. Functional code is written to cause the tests to pass. Final verification by an actual user ensures that all has been correctly connected in the production system apart from the test system.

Drawing from MVP and MVC, we developed the Model-Conductor-Hardware (MCH) pattern for use in embedded software. In this pattern, similar to its GUI cousins, MCH allows the physical hardware to be mocked, forces functional logic to be decoupled from hardware, and provides a means for automated unit test suites to test both the hardware and the functional logic.

[Figure 1 omitted.] Fig. 1. Relationships of a Conductor to its complementary MCH triad members and their mocks. The depicted mocks stand in for the concrete members of the triad and allow for testing of the logic within the Conductor. A mock Conductor (not depicted) allows the concrete Model and Hardware to be tested. Global variables within mocks capture function parameters and simulate return values used in test assertions.

Model

The Model in MCH models the current state of the system. For example, if an analog output is set to +5V, but the feedback circuit reports +4V (with tolerance of 100mV), then the Model will set a corresponding error state within itself for use by the Conductor. The Model ensures internal consistency of states in this manner. The Model is only connected to the Conductor and has no direct reference to the Hardware.

Hardware

The Hardware in MCH represents a thin layer around the hardware itself. This member of the MCH triad encapsulates the ports and registers used in the system. Interrupt Service Routines (ISR's) notify the Conductor of system state changes. The Hardware is only connected to the Conductor and has no direct reference to the Model.

Conductor

The Conductor in MCH contains the main control logic of the triad. The Conductor is triggered by the Hardware to process new data and events. Upon such triggers, the Conductor sets the state within the Model and uses the state contained by the Model in its logic to send commands or set data in the Hardware. The Conductor contains a control loop and acts as the intermediary between the Model and the Hardware. The Conductor was so named because of its role as a system director and because of its proximity to actual electrical components.

For simple systems a single MCH triad may be sufficient. Often, multiple triads are necessary to simplify the logical segregation of testing. In these cases, triads generally exist independently of one another with a central "executor" to call each Conductor's control loop. If there is any overlap between triads, it tends to happen at the Hardware level.

B. Testing with Model-Conductor-Hardware

Testing with MCH centers on making test assertions against the information captured in mocks. Each functional member of the triad is unit tested in isolation from the system via composition with mocks of the system. The calls and parameters of the triad member under test are captured within the mocks for test assertions. The proper operation of the logic under test is revealed by its actions on the mocks.

With mocks constructed for each member of the MCH triad, clear testing possibilities become apparent. Code testing via simulator, on-chip, or in a cross-compiled environment are all possible. The states and behavior within the Model are tested independently of hardware events and functional logic. The system logic in the Conductor is tested with simulated events from Hardware and simulated states in the Model. With a mock Conductor, even hardware register configuration code and ISR's can be tested via a simulator, hardware test fixture, or board-level feedback loops. MCH code examples follow in a later section of this paper.

C. Unit Testing Framework

Unit testing frameworks exist for nearly every high-level programming language in common use today. The mechanics of a test framework are relatively simple to implement [8]. A framework holds test code apart from functional code, provides functions for comparing expected and received results from the functional module under test, and collects and reports the test results for the entire test suite.

In our work, we have both customized the open source project Embunit (Embedded Unit) and created a very lightweight framework called Unity [9]. Embunit and Unity are both C-based frameworks we modify for target platforms as needed.

D. The Cost of Using Model-Conductor-Hardware

MCH adds little to no overhead to a production embedded system. Of course, mocks are not included in the final system. Further, MCH is essentially naming, organization, and calling conventions with little to no extra memory use or function calls; any overhead incurred by these conventions is easily optimized away by the compiler.

TDD, in general, does add to project cost in added developer time. However, clear savings are realized over the system's lifetime in reduced bugs, reduced likelihood of recall, and ease of feature additions and modifications.

V. TDD IN EMBEDDED SOFTWARE

A. Four Tier Testing Strategy

Thorough software testing includes automated unit testing at the lowest level and integration and system testing at higher levels. Unit testing was addressed in the preceding section. Implementing an overall TDD strategy in embedded software is a four tier testing approach. With each step up through the tiers, less automated testing occurs and more human interaction is required. However, each tier provides increasing test confidence and frees developers and testers to use their human intelligence and intuition for difficult testing matters such as sub-system interaction and timing collisions. Automated testing at the lowest levels of system development can eliminate a high number of bugs early on. Ultimately, system flaws found earlier in the development process cost less than those found later.

1) Automated Unit Testing: Developers use MCH to decouple functional logic code from hardware code and develop unit tests to be run in an automated test framework. These tests are run on-chip, cross-compiled on a PC, or executed in a platform simulator such that automated regression tests can always be executed. Note that work in this tier can progress without target hardware.

2) Hardware Level Testing: Developers and engineers use a combination of unit tests, hardware features, and direct developer interaction to test hardware functions and hardware setup code. Using feedback loops designed into the hardware, processor diagnostic functions, hardware test fixtures, and user interaction, all hardware functions are tested. The approach taken here is system-dependent. Once hardware functions are tested and operational, it is likely that tests developed here will be run far less frequently than in Tier 1.

3) Communication Channel Testing: If the embedded system includes an external communication interface, developers use PC tools to exercise and capture test results of the system through this channel. A complementary hardware test fixture, software test fixture, and/or significant human interaction are likely to be required to exercise the system and provoke communication events.

4) End to End System Testing: Having confidence in low level test successes, developers and/or testers manually exercise an end-to-end exploratory system test looking for emergent timing issues, responsiveness deficiencies, UI inconsistencies, etc.

B. Continuous Integration

The technique of continuous integration regularly brings together a system's code (possibly from multiple developers) and ensures via regression tests that new programming has not broken existing programming. Automated build systems allow source code and tests to be compiled and run automatically. These ideas and tools are important supports to effective TDD but are beyond the scope of this paper [10].

VI. EMBEDDED MODEL-CONDUCTOR-HARDWARE EXAMPLES

A. MCH in a C-based Environment

Creating mocks and tests in an embedded C environment is accomplished through compiled mock .o implementations of header file function declarations. For example, suppose hardware.h declares all functions for interfacing the hardware features of a particular microcontroller. In this example, Conductor tests will verify that the Conductor makes specific calls on the hardware with appropriate parameters. As such, a mockhardware.c definition file will be written containing otherwise empty functions that store individual function call parameter values or return specific values – both as defined by global variables. Object files for mock and functional code are linked together, and tests access the previously mentioned global values to verify the Conductor calls to the hardware interface.

B. Code Samples

The following code blocks are examples drawn from a real-world project developed for Savant Automation of Grand Rapids, Michigan. Savant builds Automated Guided Vehicles. These samples pertain to a dedicated speed control board; the functions shown set output drive voltage. In this testing scenario, we illustrate a Conductor under test. The example tests verify that the Conductor is correctly using the hardware interface (note use of ASSERT macros from the test framework).

hardware.h

    /// Get feedback from analog drive output.
    millivolts Hardware_GetFeedbackVoltage(void);

    /// Set the drive output voltage.
    void Hardware_SetOutputVoltage(millivolts output);

    /// Set the error flag.
    void Hardware_SetError(bool err);

model.h

    typedef struct _ModelInstance {
      millivolts FeedbackVoltage;
      millivolts OutputVoltage;
      bool Error;
    } ModelInstance;

    /// Set the feedback voltage.
    void Model_SetFeedbackVoltage(millivolts feedback);

    /// Get drive output voltage for hardware.
    millivolts Model_GetOutputVoltage(void);

    /// Get the error state.
    bool Model_GetError(void);

conductor.h

    /// Callback for hardware feedback voltage.
    void Conductor_HandleFeedbackVoltage(void);

    /// Control loop called by main() forever.
    void Conductor_Run(void);

mockhardware.h

    /// Feedback voltage to return.
    extern millivolts Hardware_InputFeedbackVoltage;

    /// Output voltage set by conductor.
    extern millivolts Hardware_OutputDriveOutputVoltage;

    /// Error flag.
    extern bool Hardware_OutputError;

mockhardware.c

    #include "mockhardware.h"
    #include "hardware.h"
    #include "conductor.h"

    millivolts Hardware_InputFeedbackVoltage;
    millivolts Hardware_OutputDriveOutputVoltage;
    bool Hardware_OutputError;

    millivolts Hardware_GetFeedbackVoltage(void) {
      return Hardware_InputFeedbackVoltage;
    }

    void Hardware_SetOutputVoltage(millivolts output) {
      Hardware_OutputDriveOutputVoltage = output;
    }

    void Hardware_SetError(bool err) {
      Hardware_OutputError = err;
    }

mockmodel.h

    /// Modeled system feedback voltage.
    extern millivolts Model_FeedbackVoltage;

    /// Modeled system output voltage.
    extern millivolts Model_OutputVoltage;

    /// Modeled error state.
    extern bool Model_Error;

mockmodel.c

    // Linked with testconductor.o in place of
    // model.o to allow conductor tests
    // independent of logic in actual model.
    #include "mockmodel.h"

    millivolts Model_FeedbackVoltage;
    millivolts Model_OutputVoltage;
    bool Model_Error;

    void Model_SetFeedbackVoltage(millivolts feedback) {
      Model_FeedbackVoltage = feedback;
    }

    millivolts Model_GetOutputVoltage(void) {
      return Model_OutputVoltage;
    }

    bool Model_GetError(void) {
      return Model_Error;
    }
model.c

    #include "model.h"

    ModelInstance Model;

    void Model_SetFeedbackVoltage(millivolts feedback) {
      Model.FeedbackVoltage = feedback;

      if (feedback != Model.OutputVoltage) {
        // realistically use nominal value
        Model.Error = true;
      }
    }

    millivolts Model_GetOutputVoltage(void) {
      return Model.OutputVoltage;
    }

    bool Model_GetError(void) {
      return Model.Error;
    }

conductor.c

    #include "model.h"
    #include "hardware.h"

    void Conductor_HandleFeedbackVoltage(void) {
      Model_SetFeedbackVoltage(
        Hardware_GetFeedbackVoltage());
    }

    void Conductor_Run(void) {
      Hardware_SetError(Model_GetError());
      if (!Model_GetError()) {
        Hardware_SetOutputVoltage(
          Model_GetOutputVoltage());
      }
    }

testconductor.c

    #include "conductor.h"
    #include "mockhardware.h"
    #include "mockmodel.h"

    static void testHandleFeedback(void) {
      Hardware_InputFeedbackVoltage = 7;

      Conductor_HandleFeedbackVoltage();

      TEST_ASSERT_EQUAL_INT(7,
        Model_FeedbackVoltage);
    }

    static void testConductorRun(void) {
      Model_Error = false;
      Model_OutputVoltage = 78;

      Conductor_Run();

      TEST_ASSERT_MESSAGE(
        Hardware_OutputError == false,
        "Error set incorrectly");

      TEST_ASSERT_EQUAL_INT(78,
        Hardware_OutputDriveOutputVoltage);

      Model_Error = true;
      Model_OutputVoltage = 99;

      Conductor_Run();

      TEST_ASSERT_MESSAGE(
        Hardware_OutputError == true,
        "Error not set");

      TEST_ASSERT_EQUAL_INT(78,
        Hardware_OutputDriveOutputVoltage);
    }

VII. CONCLUSION / FUTURE WORK

Applying Test Driven Development in embedded software allows developers to create well-tested systems. The concepts we have presented are tool and platform independent, allowing the methods of TDD to drive design and test implementation. Software design and quality are both improved, leading to overall cost savings in reduced field defects and eventual feature enhancements. TDD, and to a lesser extent initial setup and tool customization for each new project, will add to project development time; however, the benefits far outweigh this cost. Total test automation can likely never be accomplished. Nevertheless, the presented methods codify a flexible approach that encourages consistent testability.

Planned future work includes applying and expanding the practices covered in this paper in the context of Real Time Operating Systems. Further, given our experience with our own Unity framework, we believe that it is feasible to create a unit test framework for assembly languages.

ACKNOWLEDGMENT

The authors thank Matt Werner of Savant Automation for the opportunity to implement in a production system the ideas that inspired this paper.

REFERENCES

[1] Kent Beck, Extreme Programming Explained, Reading, MA: Addison-Wesley, 2000.
[2] Wolfgang Schmitt, "Automated Unit Testing of Embedded ARM Applications," Information Quarterly, Volume 3, Number 4, p. 29, 2004.
[3] David Astels, Test Driven Development: A Practical Guide, Upper Saddle River, NJ: Prentice Hall PTR, 2003.
[4] "Unit Test." http://en.wikipedia.org/wiki/Unit_test
[5] Erich Gamma, Richard Helm, Ralph Johnson, John Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software, Reading, MA: Addison-Wesley Professional Computing Series, 1995.
[6] Martin Fowler, "Model View Presenter." http://www.martinfowler.com/eaaDev/ModelViewPresenter.html, July 2004.
[7] M. Alles, D. Crosby, C. Erickson, B. Harleton, M. Marsiglia, G. Pattison, C. Stienstra, "Presenter First: Organizing Complex GUI Applications for Test-Driven Development," accepted at Agile 2006 conference, Minneapolis, MN.
[8] Kent Beck, "Simple Smalltalk Testing: With Patterns." http://www.xprogramming.com/testfram.htm
[9] http://www.atomicobject.com/embeddedtesting.page
[10] "Continuous Integration." http://en.wikipedia.org/wiki/Continuous_integration