
pozz (12-03-18, 10:18 AM)
What do you really use for embedded projects? Do you use "standard"
makefile or do you rely on IDE functionalities?

Nowadays every MCU manufacturer provides an IDE, mostly for free,
usually based on Eclipse (Atmel Studio and Microchip are probably the
most important exceptions).
Anyway most of them use arm gcc as the compiler.

I usually try to compile the same project for the embedded target and
the development machine, so I can speed up development and debugging. I
usually use the native IDE from the manufacturer of the target and
Code::Blocks (with mingw) for compilation on the development machine.
So I have two IDEs for a single project.

I'm thinking of finally moving to Makefiles, however I don't know if
that is a good and modern choice. Do you use better alternatives?

My major reason to move from IDE compilation to Makefiles is testing. I
would like to start adding unit tests to my project. I understand a good
solution is to link all the object files of the production code into a
static library. In this way it is very simple to replace production
code with testing (mocking) code, simply by putting the test object
files before the production static library during linking.

I think these kinds of things can be managed with a Makefile instead of
IDE compilation.

What do you think?
David Brown (12-03-18, 12:06 PM)
On 03/12/18 09:18, pozz wrote:
[..]
> So I have two IDEs for a single project.
> I'm thinking of finally moving to Makefiles, however I don't know if
> that is a good and modern choice. Do you use better alternatives?


I sometimes use the IDE project management to start with, or on very
small projects. But for anything serious, I always use makefiles. I
see it as important to separate the production build process from
development - I need to know that I can always pull up the source code
for a project, do a "build", and get a bit-perfect binary image that is
exactly the same as last time. This must work on different machines,
preferably different OS's, and it must work over time. (My record is
rebuilding a project that was a touch over 20 years old, and getting the
same binary.)
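
A quick way to check that property might look like this (a sketch - it
assumes a "make bin" target that writes build/target/fw.bin, and that
you recorded the hash of the last build):

make clean && make bin
sha256sum build/target/fw.bin   # compare against the recorded hash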

This means that the makefile specifies exactly which build toolchain
(compiler, linker, library, etc.) is used - and that does not change
during a project's lifetime without very good reason.

The IDE, and debugger, however, may change - there I will often use
newer versions with more features than the original version. And
sometimes I might use a lighter editor for a small change, rather than
the full IDE. So IDE version and build tools version are independent.

With well-designed makefiles, you can have different targets for
different purposes. "make bin" for making the embedded binary, "make
pc" for making the PC version, "make tests" for running the test code on
the pc, and so on.
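
A minimal skeleton of that kind of makefile might look like this (file
names, flags and toolchains are only illustrative, and recipe lines
must be indented with tabs):

# Sketch only - one makefile, several build variants.
SRCS := main.c modh.c modl.c

bin:                # cross-compiled binary for the embedded target
	mkdir -p build/target
	arm-none-eabi-gcc -Os $(SRCS) -o build/target/app.elf

pc:                 # native build of the same sources for the PC
	mkdir -p build/pc
	gcc -g $(SRCS) -o build/pc/app

tests:              # build and run the unit tests on the PC
	mkdir -p build/pctest
	gcc -g tests/test_modh.c tests/modl.c modh.c -o build/pctest/test_modh
	./build/pctest/test_modh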

> My major reason to move from IDE compilation to Makefiles is testing. I
> would like to start adding unit tests to my project. I understand a good
> solution is to link all the object files of the production code into a
> static library. In this way it is very simple to replace production
> code with testing (mocking) code, simply by putting the test object
> files before the production static library during linking.


I would not bother with that. I would have different variations in the
build handled in different build tree directories.

> I think these kinds of things can be managed with a Makefile instead of
> IDE compilation.
> What do you think?


It can /all/ be managed from make.

Also, a well-composed makefile is more efficient than an IDE project
manager, IME. When you use Eclipse to do a build, it goes through each
file to calculate the dependencies - so that you re-compile all the
files that might be affected by the last changes, but not more than
that. But it does this dependency calculation anew each time. With
make, you can arrange to generate dependency files using gcc, and these
dependency files get updated only when needed. This can save
significant time in a build when you have a lot of files.
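
For reference, the usual gcc auto-dependency pattern looks something
like this sketch (the .d files are written as a side effect of each
compile, and only refreshed when their source is recompiled):

CC   := gcc
SRCS := $(wildcard src/*.c)
OBJS := $(patsubst src/%.c,build/%.o,$(SRCS))

app: $(OBJS)
	$(CC) $(OBJS) -o $@

build/%.o: src/%.c
	mkdir -p $(dir $@)
	$(CC) -MMD -MP -c $< -o $@

# Pull in the generated .d files; absent ones are silently ignored.
-include $(OBJS:.o=.d)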
pozz (12-03-18, 01:13 PM)
On 03/12/2018 11:06, David Brown wrote:
[..]
> different purposes. "make bin" for making the embedded binary, "make
> pc" for making the PC version, "make tests" for running the test code on
> the pc, and so on.


Fortunately modern IDEs separate the toolchain well from the IDE itself.
Most manufacturers let us install the toolchain as a separate setup. I
remember some years ago the scenario was different and the compiler was
"included" in the IDE installation.

However the problem here isn't the compiler (toolchain), which nowadays
is usually arm-gcc. The big issue is with the libraries and includes that
the manufacturer gives you to save some time in writing peripheral drivers.
I have to install the full IDE and copy the relevant headers and
libraries into my own folders.

Another small issue is the linker script file, which works like a charm
in the IDE when you start a new project from the wizard.
At least for me, it's very difficult to write a linker script from
scratch. You need a deep understanding of the C libraries
(newlib, redlib, ...) to write a correct linker script.
My solution is to start with the IDE wizard and copy the generated linker
script into my make-based project.
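
The make side of this is then small - something like this sketch (the
script path is hypothetical):

# Point the linker at the script copied from the IDE wizard.
LDSCRIPT := ldscripts/mcu.ld
LDFLAGS  += -T $(LDSCRIPT) -Wl,-Map=build/target/app.map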

> I would not bother with that. I would have different variations in the
> build handled in different build tree directories.


Could you explain?

> It can /all/ be managed from make.
> Also, a well-composed makefile is more efficient than an IDE project
> manager, IME. When you use Eclipse to do a build, it goes through each
> file to calculate the dependencies - so that you re-compile all the
> files that might be affected by the last changes, but not more than
> that. But it does this dependency calculation anew each time. With
> make, you can arrange to generate dependency files using gcc, and these
> dependency files get updated only when needed. This can save
> significant time in a build when you have a lot of files.


Yes, that's for sure!
David Brown (12-03-18, 01:57 PM)
On 03/12/18 12:13, pozz wrote:
> On 03/12/2018 11:06, David Brown wrote:
> Fortunately modern IDEs separate the toolchain well from the IDE itself.
> Most manufacturers let us install the toolchain as a separate setup. I
> remember some years ago the scenario was different and the compiler was
> "included" in the IDE installation.


You can do that to some extent, yes - you can choose which toolchain to
use. But your build process is still tied to the IDE - your choice of
directories, compiler flags, and so on is all handled by the IDE. So
you still need the IDE to control the build, and different versions of
the IDE, or different IDEs, do not necessarily handle everything in the
same way.

> However the problem here isn't the compiler (toolchain), which nowadays
> is usually arm-gcc. The big issue is with the libraries and includes that
> the manufacturer gives you to save some time in writing peripheral drivers.
> I have to install the full IDE and copy the relevant headers and
> libraries into my own folders.


That's fine. Copy the headers, libraries, SDK files, whatever, into
your project folder. Then push everything to your version control
system. Make the source code independent of the SDK, the IDE, and other
files - you have your toolchain (and you archive the zip/tarball of the
gnu-arm-embedded release) and your project folder, and that is all you
need for the build.

> Another small issue is the linker script file, which works like a charm
> in the IDE when you start a new project from the wizard.
> At least for me, it's very difficult to write a linker script from
> scratch. You need a deep understanding of the C libraries
> (newlib, redlib, ...) to write a correct linker script.
> My solution is to start with the IDE wizard and copy the generated linker
> script into my make-based project.


Again, that's fine. IDEs and their wizards are great for getting
started. They are just not great for long-term stability of the tools.

> Could you explain?


You have a tree something like this:

Source tree:

   project / src / main
                 / drivers

Build trees:

   project / build / target
                   / debug
                   / pctest

Each build tree might have subtrees:

   project / build / target / obj  / main
                                   / drivers
   project / build / target / deps / main
                                   / drivers
   project / build / target / lst  / main
                                   / drivers

And so on.

Your build trees are independent. So there is no mix of object files
built in the "target" directory for your final target board, or the
"debug" directory for the version with debugging code enabled, or the
version in "pctest" for the code running on the PC, or whatever other
builds you have for your project.
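
One way to sketch that in make is to key every output path off a single
variable (all names here are illustrative):

# Sketch: each build variant gets its own independent output tree.
BUILD  ?= target                  # target, debug, or pctest
OBJDIR := build/$(BUILD)/obj
DEPDIR := build/$(BUILD)/deps
CC     := arm-none-eabi-gcc       # would differ per variant in real use

$(OBJDIR)/%.o: src/%.c
	mkdir -p $(dir $@) $(DEPDIR)
	$(CC) $(CFLAGS) -MMD -MF $(DEPDIR)/$*.d -c $< -o $@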

> Yes, that's for sure!


Of course, if build times are important, you drop Windows and use Linux,
and get a two to four-fold increase in build speed on similar hardware.
And then you discover ccache on Linux and get another leap in speed.
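
Hooking ccache into a gcc-based makefile is usually a one-line change
(a sketch, assuming ccache is installed):

# Let ccache wrap the compiler; nothing else changes.
CC := ccache arm-none-eabi-gcc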
Grant Edwards (12-03-18, 05:30 PM)
On 2018-12-03, pozz <pozzugno> wrote:

> What do you really use for embedded projects? Do you use "standard"
> makefile or do you rely on IDE functionalities?


Gnu makefiles.

> Nowadays every MCU manufacturer provides an IDE, mostly for free,
> usually based on Eclipse (Atmel Studio and Microchip are probably the
> most important exceptions).


And they're almost all timewasting piles of...

> Anyway most of them use arm gcc as the compiler.


If you're going to use an IDE, it seems like you should pick one and
stick with it so that you get _good_ at it.

I use Emacs, makefiles, and meld.

> I usually try to compile the same project for the embedded target and
> the development machine, so I can speed up development and debugging. I
> usually use the native IDE from the manufacturer of the target and
> Code::Blocks (with mingw) for compilation on the development machine.
> So I have two IDEs for a single project.


How awful.

> I'm thinking of finally moving to Makefiles, however I don't know if
> that is a good and modern choice. Do you use better alternatives?


> My major reason to move from IDE compilation to Makefiles is testing. I
> would like to start adding unit tests to my project. I understand a good
> solution is to link all the object files of the production code into a
> static library. In this way it is very simple to replace production
> code with testing (mocking) code, simply by putting the test object
> files before the production static library during linking.
> I think these kinds of things can be managed with a Makefile instead of
> IDE compilation.
> What do you think?


I've tried IDEs. I've worked with others who use IDEs and watched
them work, and compared it to how I work. It looks to me like IDEs
are a tremendous waste of time.
Grant Edwards (12-03-18, 05:39 PM)
On 2018-12-03, David Brown <david.brown> wrote:

> I sometimes use the IDE project management to start with, or on very
> small projects. But for anything serious, I always use makefiles. I
> see it as important to separate the production build process from
> development - I need to know that I can always pull up the source code
> for a project, do a "build", and get a bit-perfect binary image that is
> exactly the same as last time.


It's impossible to overemphasize how important that is. Somebody should
be able to check out the source tree and a few tools and then type a
single command to build production firmware. And you need to be able
to _automate_ that process.

If building depends on an IDE, then there's always an intermediate
step where a person has to sit in front of a PC for a week tweaking
project settings to get the damn thing to build on _this_ computer
rather than on _that_ computer.

> This must work on different machines,


And in my experience, IDEs do not. The people I know who use Eclipse
with some custom set of plugins spend days and days when they need to
build on computer B instead of computer A. I just scp "build.sh" to
the new machine and run it. It contains a handful of Subversion
checkout commands and a "make". And I can do it remotely. From my
phone if needed.
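
Such a script can be tiny - a hypothetical sketch (repository URLs and
revisions are made up):

#!/bin/sh -e
# Fetch pinned toolchain and sources, then build.
svn checkout -r 1234 svn://example.com/tools/gcc-arm gcc-arm
svn checkout -r 5678 svn://example.com/project/trunk project
PATH="$PWD/gcc-arm/bin:$PATH" make -C project bin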

> preferably different OS's, and it must work over time.


Yes! Simply upgrading the OS often seems to render an IDE incapable
of building a project: another week of engineering time goes down the
drain tweaking the "project settings" to get things "just right".
Theo Markettos (12-03-18, 05:49 PM)
Grant Edwards <invalid> wrote:
> It's impossible to overemphasize how important that is. Somebody should
> be able to check out the source tree and a few tools and then type a
> single command to build production firmware. And you need to be able
> to _automate_ that process.


One approach is to put the tools into a VM or a container (eg Docker), so
that when you want to build you pull the container and you get an identical
build environment to the last time anyone built it.
Also, your continuous integration system can run builds and tests in
the same environment as you're developing on.
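
In practice that can be as simple as this sketch (the image name is
hypothetical):

# Build inside a container that holds the pinned toolchain.
docker run --rm -v "$PWD":/src -w /src example/arm-build:1.0 make bin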

Unfortunately vendors have a habit of shipping IDEs for Windows only, which
makes this harder. It's not so much of a problem for the actual
compiler - especially if that's GCC under the hood - but more so for
ancillary tools (eg configuration tools for peripherals, flash image
builders, etc), which are sometimes not designed to be scripted.

(AutoIt is my worst enemy here, but it has been the only way to get the job
done in some cases)

Decoupling your build from the vagaries of the IDE, even if you can trust
that you'll always build on a fixed platform, is still a good thing - many
IDEs still don't play nicely with version control, for example.

Theo
Phil Hobbs (12-03-18, 06:06 PM)
On 12/3/18 3:18 AM, pozz wrote:
[..]
> I think these type of things can be managed with Makefile instead of IDE
> compilation.
> What do you think?


We use cmake for that--it allows unit testing on a PC, as you say, and
also automates the process of finding libraries, e.g. for emulating
peripherals.

Cheers

Phil Hobbs
Dave Nadler (12-03-18, 07:41 PM)
On Monday, December 3, 2018 at 10:49:36 AM UTC-5, Theo Markettos wrote:
> One approach is to put the tools into a VM or a container (eg Docker), so
> that when you want to build you pull the container and you get an identical
> build environment to the last time anyone built it.
> Also, your continuous integration system can run builds and tests in
> the same environment as you're developing on.


Second that!

We do development in, and deliver, VMs to customers now, so they are
CERTAIN to receive exactly the 'used for production build' versions of
every tool, library, and driver required for the JTAG gizmo, referenced
components, etc, etc, etc. Especially important when some tools won't
work under the latest version of Winbloze! Saves enormous headaches
sometime down the road when an update must be made...

Hope that helps,
Best Regards, Dave
DJ Delorie (12-03-18, 09:05 PM)
Grant Edwards <invalid> writes:
> I use Emacs, makefiles, and meld.


+1 on those. My memory isn't good enough any more to remember all the
byzantine steps through an IDE to re-complete all the tasks my projects
require.

Especially since each MCU seems to have a *different* IDE with
*different* procedures to forget...

And that's assuming they run on Linux in the first place ;-)
Grant Edwards (12-03-18, 09:36 PM)
On 2018-12-03, DJ Delorie <dj> wrote:
> Grant Edwards <invalid> writes:
> +1 on those. My memory isn't good enough any more to remember all
> the byzantine steps through an IDE to re-complete all the tasks my
> projects require.
> Especially since each MCU seems to have a *different* IDE with
> *different* procedures to forget...
> And that's assuming they run on Linux in the first place ;-)


The most important rule to remember is:

Never, ever, use any software written or provided by the silicon
vendor. Every time I've failed to obey that rule, I've regretted it.

I've heard rumors that Intel at one time wrote a pretty good C
compiler for x86.

However, having used other development software from Intel, I find
that impossible to believe. [Actually, Intel MDS-800 "blue boxes"
weren't bad as long as you ran CP/M on them instead of ISIS.]

And don't get me started on compilers and tools from TI, Motorola, or
various others either...

Some of them have put some effort into getting good Gnu GCC and
binutils support for their processors, and that seems to produce good
results. If only they had realized that's all they really needed to
do in the _first_ place...
David Brown (12-03-18, 10:31 PM)
On 03/12/2018 16:49, Theo Markettos wrote:
> Grant Edwards <invalid> wrote:
> One approach is to put the tools into a VM or a container (eg Docker), so
> that when you want to build you pull the container and you get an identical
> build environment to the last time anyone built it.


That is possible, but often more than necessary. Set up your build
sensibly, and it only depends on the one tree for the toolchain, and
your source code tree. It should not depend on things like the versions
of utility programs (make, sed, touch, etc.), environment variables, and
that kind of thing.

Sometimes, however, you can't avoid that - especially for Windows-based
toolchains that store stuff in the registry and other odd places.

> Also, your continuous integration system can run builds and tests in
> the same environment as you're developing on.
> Unfortunately vendors have a habit of shipping IDEs for Windows only, which
> makes this harder.


That is thankfully rare these days. There are exceptions, but most
major vendors know that is a poor habit.

> It's not so much of a problem for the actual
> compiler - especially if that's GCC under the hood - but more so for
> ancillary tools (eg configuration tools for peripherals, flash image
> builders, etc), which are sometimes not designed to be scripted.


Yes, these are more likely to be an issue. Generally they are not
needed for rebuilding the software - once you have run the wizards and
similar tools, the job is done and the generated source can be
preserved. But it can be an issue if you need to re-use the tools for
dealing with changes to the setup.

> (AutoIt is my worst enemy here, but it has been the only way to get the job
> done in some cases)
> Decoupling your build from the vagaries of the IDE, even if you can trust
> that you'll always build on a fixed platform, is still a good thing - many
> IDEs still don't play nicely with version control, for example.


Often IDEs have good integration with version control for the source
files, but can be poor for the project settings and other IDE files.
Typically that sort of thing is held in hideous XML files with
thoughtless line breaks, making it very difficult to do comparisons and
change management.
David Brown (12-03-18, 10:34 PM)
On 03/12/2018 16:30, Grant Edwards wrote:

> I've tried IDEs. I've worked with others who use IDEs and watched
> them work, and compared it to how I work. It looks to me like IDEs
> are a tremendous waste of time.


IDEs are extremely useful tools - as long as you use them for their
strengths, and not their weaknesses. I use "make" for my builds, but I
use an IDE for any serious development work. A good quality editor,
with syntax highlighting, navigation, as-you-type checking, integration
with errors and warnings from the builds - it is invaluable as a
development tool.
Jacob Sparre Andersen (12-03-18, 11:05 PM)
Phil Hobbs wrote:

> We use cmake for that--it allows unit testing on a PC, as you say, and
> also automates the process of finding libraries, e.g. for emulating
> peripherals.


How does it automate finding emulation libraries? That sounds like a
cool feature.

We use GNU Makefiles, but we handle the matching up of emulation
libraries with the real thing by hand. We then typically use different
source directories for emulation libraries and actual drivers.
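
A sketch of that by-hand selection in a makefile (directory names are
hypothetical):

# Pick real drivers or PC emulation stubs per build variant.
ifeq ($(BUILD),pctest)
DRIVER_DIR := src/emul
else
DRIVER_DIR := src/drivers
endif
VPATH := src $(DRIVER_DIR)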

Greetings,

Jacob
pozz (12-04-18, 12:29 AM)
On 03/12/2018 12:57, David Brown wrote:
> On 03/12/18 12:13, pozz wrote:
[..]
> "debug" directory for the version with debugging code enabled, or the
> version in "pctest" for the code running on the PC, or whatever other
> builds you have for your project.

Ok, I got your point, and I usually arrange everything similarly to your
description (even if I put .o, .d and .lst in the same target-dependent
directory). I also have to admit that all major IDEs nowadays arrange
output files in this manner.

Anyway testing is difficult, at least for me.

Suppose you have a simple project with three source files: main.c,
modh.c and modl.c (of course you also have modh.h and modl.h).

Now you want to write a unit test for the modh module, which depends on
modl. During the test, modl should be replaced with a dummy module, a
mock object. What is your approach?

In project/tests I create a test_modh.c source file that should be
linked against modh.o (the original production code) and
project/tests/modl.o, the mock object for modl.

One approach could be to re-compile modh.c during the test build.
However, it's difficult to replace the production modl.h with the modl.h
of the mock object in the tests directory.
modh.c has a simple

#include "modl.h"

directive, and this will point to the modl.h in the *same* directory. I
wasn't able to instruct the compiler to use the modl.h from the tests
directory.

Moreover, it could be useful to test the same object code generated for
production. I found a good approach: the production code is compiled all
into a static library, libproduct.a, and the tests are linked against
that static library.
The following command, run in the project/tests/ folder

gcc test_modh.o modl.o libproduct.a -o test_modh.exe

should generate a test_modh.exe with the mock object for modl and the
*same* modh object code as in production.
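
As make rules, the same trick might look like this sketch (paths are
illustrative; the mock object is listed before the archive, so the
linker resolves modh's calls with the mock and never needs to pull the
production modl.o out of libproduct.a):

# Archive the production objects into a static library.
libproduct.a: src/main.o src/modh.o src/modl.o
	ar rcs $@ $^

# Mock object first, then the archive: same modh.o as production.
test_modh.exe: tests/test_modh.o tests/modl.o libproduct.a
	gcc $^ -o $@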
