%%% Local Variables:
%%% TeX-master: "developer.tex"
%%% ispell-local-dictionary: "english"
%%% End:

\chapter{Advanced Plug-in Development}\label{chap:advance}
\lstset{language=Ocaml} %% makes Ocaml the default language for listings, eg. \lstinline.

This chapter details how to use the services provided by \framac in order to be
fully operational with plug-in development. Each section describes
technical points a developer should be aware of. Otherwise,
one could find oneself in one or more of the following situations
\footnote{It
  is fortunately quite difficult (but not impossible) to fall into the worst
  situation by mistake if you are not a kernel developer.}
(from bad to worse):
\begin{enumerate}
\item reinventing the (\framac) wheel;
\item being unable to do some specific things (\eg saving
  results\index{Saving} of an analysis on disk, see
  Section~\ref{proj:states});
\item introducing bugs in his/her code;
\item introducing bugs in other plug-ins using his/her code;
\item breaking the kernel consistency\index{Consistency} and so potentially
  breaking all \framac plug-ins (\eg if s/he modifies the
  AST\index{AST!Modification} without changing project\index{Project}, see
  Section~\ref{proj:use}).
\end{enumerate}

In this chapter, we suppose that the reader is able to write a minimal plug-in
like \texttt{hello}\index{Hello}, described in Chapter~\ref{chap:tutorial}, and
knows about the software architecture of \framac (Chapter~\ref{chap:archi}).
Moreover, plug-in development requires the use of advanced features of
\caml (module system, classes and objects, \emph{etc}). Static plug-in
development requires some knowledge of \autoconf and \make.
Each section summarizes its own prerequisites at its beginning (if any).

Note that the following sections can be read in no particular
order: their contents are quite independent of one another, even though
there are cross-references between them. Pointers to the
reference manual (Chapter~\ref{chap:refman}) are also provided for readers who
want full details about specific parts.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%% \section{File Tree Overview}\label{adv:files}

%% \begin{target}beginners.\end{target}

%% The \framac main directory is split in several sub-directories. The \framac source
%% code is mostly provided in directories \texttt{cil}\codeidx{cil} and
%% \texttt{src}\codeidx{src}. The first one contains the source code of
%% \cil~\cite{cil}\index{Cil} extended with an \acsl~\cite{acsl}
%% implementation\index{ACSL}. The second one is the core implementation of
%% \framac. This last directory contains directories of the \framac
%% kernel\index{Kernel} and directories of the provided \framac plug-in.

%% A pretty complete description of the \framac file tree is provided in
%% Section~\ref{refman:files}.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\section{\framac Configure.in}\label{adv:configure}
\codeidxdef{configure.in}

\begin{target}not for standard plug-ins developers.\end{target}

\begin{prereq}
  Knowledge of \autoconf and shell programming.
\end{prereq}

In this section, we detail how to modify the file \texttt{configure.in} in
order to configure plug-ins (\framac configuration has been introduced in
Sections~\ref{tut2:configure} and~\ref{tut2:makefile}).

First, Section~\ref{conf:principle} introduces the general principle and
organisation of \texttt{configure.in}. Then, Section~\ref{conf:add} explains how
to configure a new simple plug-in without any dependency. Next, we show how to
declare dependencies on external libraries and tools
(Section~\ref{conf:dep-lib}) and on other plug-ins
(Section~\ref{conf:dep-plug}). Finally, Section~\ref{conf:lib} presents the
configuration of external libraries and tools needed by a new plug-in but not
used anywhere else in \framac.

\subsection{Principle}\label{conf:principle}

When you execute \texttt{autoconf}, file \texttt{configure.in} is used to
generate the \texttt{configure} script. Each \framac user executes this script
to check his/her system and determine the most appropriate configuration: at the
end of this configuration (if successful), the script summarizes the
status of each plug-in, which can be:\index{Plug-in!Status}
\begin{itemize}
\item \emph{available} (everything is fine with this plug-in);
\item \emph{partially available}: either an optional dependency of the plug-in
  is not fully available, or a mandatory dependency of the plug-in is only
  partially available; or\index{Plug-in!Dependency}
\item \emph{not available}: either the plug-in itself is not provided by
  default, or a mandatory dependency of the plug-in is not available.
\end{itemize}

The important notion in the above definitions is that of a
\emph{dependency}\index{Plug-in!Dependency|bfit}. A dependency of a plug-in $p$
is either an external library/tool\index{Library}\index{Tool} or another
\framac plug-in. It is either \emph{mandatory} or \emph{optional}. A mandatory
dependency must be present in order to build $p$, whereas an optional
dependency provides additional features to $p$ that are not strictly required
(in particular, $p$ must be compilable without any of its optional dependencies).

Hence, for the plug-in developer, the main role of \texttt{configure.in} is to
define the optional and mandatory dependencies of each plug-in. Another standard
job of \texttt{configure.in} is the addition of the options \texttt{--enable-$p$}
and \texttt{--disable-$p$} to \texttt{configure} for each plug-in $p$. These
options respectively force $p$ to be available and disable $p$ (in the latter
case, its status is automatically ``not available'').
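For instance, for a plug-in named \texttt{occurrence} (the running example of
the next section), a user who does not want this plug-in to be built can run:
\begin{shell}
\$ ./configure --disable-occurrence
\end{shell}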

The file \texttt{configure.in} is organised in several sections, each dedicated
to a specific kind of configuration check. Each of them begins with a title
delimited by comments, which is highlighted when \texttt{configure} is
executed. These sections are described in Section~\ref{refman:configure}. We now
focus on the modifications to perform in order to integrate a new plug-in into
\framac.

\subsection{Addition of a Simple Plug-in}\label{conf:add}

In order to add a new plug-in, you have to add a new subsection for this
plug-in to Section \emph{Wished Plug-in}. This is usually very easy to
do by copying/pasting from an existing plug-in (\eg
\texttt{occurrence}\index{Plug-in!Occurrence|see{Occurrence}}\index{Occurrence})
and by replacing the plug-in name (here \texttt{occurrence}) with the new plug-in
name in the pasted part. In this section, plug-ins are sorted in
lexicographic order.

For instance, Section \emph{Wished Plug-in} introduces a new sub-section for
the plug-in \texttt{occurrence} in the following way.
\codeidxdef{check\_plugin}
\begin{configurecode}
# occurrence
############
check_plugin(occurrence,src/plugins/occurrence,
             [support for occurrence analysis],yes)
\end{configurecode}
\begin{itemize}
\item the first argument is the plug-in name;
\item the second one is the name of the directory containing the source files 
  of the plug-in (usually a sub-directory of \texttt{src/plugins});
\item the third one is a help message for the \texttt{--enable-occurrence} 
  option of \texttt{configure};
\item the last one indicates whether the plug-in is enabled by default
  (\texttt{yes/no}).
\end{itemize}

\begin{important}
The plug-in name must contain only alphanumeric characters and
underscores. It must be the same as the \texttt{name} value given as
argument to the functor \texttt{Plugin.Register} of
Section~\ref{adv:plugin-services} (with spaces replaced by underscores). It must
also be the same (modulo upper/lower case) as the \texttt{PLUGIN\_NAME} variable
given in the plug-in's Makefile presented in Section~\ref{adv:dynamic-make}.
\end{important}

\begin{important}
  The macro \texttt{check\_plugin}\scodeidx{configure.in}{check\_plugin} sets
  the following variables:
  \texttt{FORCE\_OCCURRENCE}\scodeidxdef{configure.in}{FORCE\_$plugin$},
  \texttt{REQUIRE\_OCCURRENCE}\scodeidx{configure.in}{REQUIRE\_$plugin$},
  \texttt{USE\_OCCURRENCE}\scodeidx{configure.in}{USE\_$plugin$}, and
  \texttt{ENABLE\_OCCURRENCE}\scodeidxdef{configure.in}{ENABLE\_$plugin$}.
\end{important}

The first variable indicates whether the user has explicitly required the
availability of \texttt{occurrence} by setting the option
\texttt{--enable-occurrence}. The second and third variables are used by other
plug-ins in order to handle their dependencies (see
Section~\ref{conf:dep-plug}). The fourth variable, \texttt{ENABLE\_OCCURRENCE},
indicates the plug-in status (available, partially available or not available).
If \texttt{--enable-occurrence} is set, then \texttt{ENABLE\_OCCURRENCE} is \yes
(plug-in available); if \texttt{--disable-occurrence} is set, then its value is
\texttt{no} (plug-in not available). If no option is specified on the command
line of \texttt{configure}, its value is set to the default one (according to
the value of the fourth argument of \texttt{check\_plugin}).

%%%%
\subsection{Configuration of New Libraries or Tools}
\label{conf:lib}
\index{Library!Configuration}
\index{Tool!Configuration}

Some plug-ins need additional tools or libraries to be fully
functional. The \texttt{configure} script takes care of these in two
steps. First, it checks that an appropriate version of each external
dependency exists on the system. Second, it verifies for each plug-in
that its dependencies are met. Section~\ref{conf:dep-lib} explains how
to make a plug-in depend on a given library (or tool). The present
section deals with the first part, that is how to check for a given
library or tool on a system. Configuration of new libraries and configuration of
new tools are similar.  In this section, we therefore choose to focus
on the configuration of new libraries. This is done by calling a
predefined macro called
\texttt{configure\_library}\scodeidxdef{configure.in}{configure\_library}\scodeidxdef{configure.in}{configure\_tools}\footnote{For
  tools, there is a macro \texttt{configure\_tool} which works in the
  same way as \texttt{configure\_library}.}.
The \texttt{configure\_library} macro takes three arguments. The first one is
the (uppercase) name of the library; the second one is a filename
which is used by the script to check the availability of the library.
In case there are several possible locations for the library, this
argument can be a list of filenames. In this case, the argument must
be properly quoted ({\it i.e.} enclosed in a {\tt [}, {\tt ]} pair).
Each name is checked in turn; the first one which corresponds to an
existing file is selected. If no name in the list corresponds to an
existing file, the library is considered to be unavailable. The last
argument is a warning message to display if a configuration problem
appears (usually because the library does not exist).

Results of this macro are made available through two variables which are
substituted in the files generated by \texttt{configure}:
\begin{itemize}
\item \texttt{HAS\_$library$}\scodeidxdef{configure.in}{HAS\_$library$}
  is set to \texttt{yes} or \texttt{no} depending on the availability
  of the library;
\item \texttt{SELECTED\_$library$}\scodeidxdef{configure.in}{SELECTED\_$library$}
  contains the name of the file that was selected, as described above.
\end{itemize}
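As a purely illustrative sketch (the library name \texttt{FOO}, the module name
and the way the variables are used are hypothetical), a file processed by
\texttt{configure}, such as a Makefile template, could exploit these substituted
variables as follows.
\begin{makefilecode}
# hedged sketch: FOO is a hypothetical library checked with configure_library
HAS_FOO=@HAS_FOO@
SELECTED_FOO=@SELECTED_FOO@

ifeq ($(HAS_FOO),yes)
# compile the optional support module only if the library is available
PLUGIN_CMO+= foo_support
endif
\end{makefilecode}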

When checking for \ocaml{} libraries and object files, remember that
they come in two flavors: bytecode and native
code, which have distinct suffixes. Therefore, you should use the
variables \texttt{LIB\_SUFFIX}\scodeidx{configure.in}{LIB\_SUFFIX}
(for libraries) and
\texttt{OBJ\_SUFFIX}\scodeidx{configure.in}{OBJ\_SUFFIX}
(for object files) to check the presence of a
given file. These variables are initialized at the beginning of the
\texttt{configure} script depending on the availability of a
native-code compiler on the current installation.

\begin{example}
  The library \lablgtksourceview\index{Lablgtksourceview2} (used to have a
  better rendering of C sources in the GUI) is part of
  \lablgtk\index{Lablgtk}~\cite{lablgtk}.
  This is checked through the following command,
  where \texttt{LABLGTKPATH\_FOR\_CONFIGURE} is the path where
  \texttt{configure} has found \lablgtk itself.

\begin{configurecode}
configure_library(
  [GTKSOURCEVIEW],
  [\$LABLGTKPATH_FOR_CONFIGURE/lablgtksourceview2.\$LIB_SUFFIX],
  [lablgtksourceview not found])
\end{configurecode}
\end{example}

\subsection{Addition of Library/Tool Dependencies}\label{conf:dep-lib}
\index{Library!Dependency}\index{Tool!Dependency}

Dependencies upon external tools and libraries are governed by two macros:
\begin{itemize}
\item \texttt{plugin\_require\_external($plugin$,$library$)}%
\scodeidxdef{configure.in}{plugin\_require\_external} indicates that
$plugin$ requires $library$ in order to be compiled.
\item \texttt{plugin\_use\_external($plugin$,$library$)}%
\scodeidxdef{configure.in}{plugin\_use\_external} indicates that
$plugin$ uses $library$, but can nevertheless be compiled if $library$
is not installed (potentially offering reduced functionality).
\end{itemize}

\begin{convention}
  The best place to declare such dependencies is just after the addition of
  plug-in $p$, which sets the value of \texttt{ENABLE\_$p$}.
\end{convention}

\begin{example}
  Plug-in \texttt{gui}\index{Plug-in!GUI} requires
  \lablgtk\index{Lablgtk} and \gnomecanvas\index{GnomeCanvas}. It
  also optionally uses \dottool\index{Dot} for displaying graphs (graphs cannot
  be displayed without this tool). So, just after its declaration,
  \texttt{configure.in} contains the following lines.
\begin{configurecode}
plugin_require_external(gui,lablgtk)
plugin_require_external(gui,gnomecanvas)
plugin_use_external(gui,dot)
\end{configurecode}
\end{example}

\subsection{Addition of Plug-in Dependencies}\label{conf:dep-plug}
\index{Plug-in!Dependency|bfit}

Adding a dependency on another plug-in is done in much the same way as adding a
dependency on an external library or tool (see
Section~\ref{conf:dep-lib}). For this purpose, \texttt{configure.in}
provides two macros:
\begin{itemize}
\item \texttt{plugin\_require($plugin1$,$plugin2$)}%
\scodeidxdef{configure.in}{plugin\_require} states that
$plugin1$ needs $plugin2$.
\item \texttt{plugin\_use($plugin1$,$plugin2$)}%
\scodeidxdef{configure.in}{plugin\_use} states that
$plugin1$ can be used in the absence of $plugin2$, but requires $plugin2$
for full functionality.
\end{itemize}

There can be mutual dependencies between plug-ins. This is for instance
the case for plug-ins \texttt{value} and \texttt{from}.
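As a purely illustrative sketch (the actual declarations in \framac's
\texttt{configure.in} may differ), such a mutual dependency could be declared
as follows, once both plug-ins have been added with \texttt{check\_plugin}.
\begin{configurecode}
# hedged sketch: each plug-in requires the other one
plugin_require(value,from)
plugin_require(from,value)
\end{configurecode}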

\section{Plug-in Specific Configure.ac}
\codeidxdef{configure.ac}
\label{sec:config-external}

\begin{target}standard plug-ins developers.\end{target}

\begin{prereq}
  Knowledge of \autoconf and shell programming.
\end{prereq}

External plug-ins can have their own configuration file and can rely on the
macros defined for \framac. In addition, as mentioned in
Section~\ref{dynamic-make:comp-same-time}, these plug-ins can be
compiled directly from \framac's own Makefile. In order to
integrate well in this setting, they should follow a particular layout,
described below.
First, they need to be able to refer to the auxiliary \texttt{configure.ac}
file defining \framac-specific macros when they are used as stand-alone plug-ins.
This can be done with the following code:
\begin{configurecode}
m4_define([plugin_file],Makefile)

m4_define([FRAMAC_SHARE_ENV],
          [m4_normalize(m4_esyscmd([echo $FRAMAC_SHARE]))])

m4_define([FRAMAC_SHARE],
	  [m4_ifval(FRAMAC_SHARE_ENV,[FRAMAC_SHARE_ENV],
                                     [m4_esyscmd(frama-c -print-path)])])

m4_ifndef([FRAMAC_M4_MACROS],
         [m4_include(FRAMAC_SHARE/configure.ac)]
        )
\end{configurecode}
%$ emacs-highlighting is a bit lost otherwise...

\texttt{plugin\_file} is the file which must be present to ensure that
\texttt{autoconf} is called in the appropriate directory (see documentation for
the \texttt{AC\_INIT} macro of autoconf). \texttt{configure.ac} can be found in
two ways: either by relying on the \texttt{FRAMAC\_SHARE} shell variable (when
\framac{} is not installed, {\it i.e.} when configuring the plug-in together
with the main \framac), or by calling an installed \framac (when installing
the plug-in separately). The inclusion of \texttt{configure.ac} needs to be
guarded to prevent multiple inclusions, as the configuration file of the plug-in
might itself be included by \texttt{configure.in}
(see section~\ref{dynamic-make:comp-same-time} for more details).

The configuration of the plug-in itself, or of related libraries and tools, can
then proceed as described in Sections~\ref{conf:add} and~\ref{conf:lib}. References
to specific files in the plug-in source directory should be guarded with the
following macro:
\begin{configurecode}
PLUGIN_RELATIVE_PATH(file)
\end{configurecode}
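For instance, a plug-in could check for the presence of one of its own source
files as in the hedged sketch below (the file name \texttt{src/my\_analysis.ml}
is hypothetical).
\begin{configurecode}
# hedged sketch: refer to a plug-in file through PLUGIN_RELATIVE_PATH
AC_CHECK_FILE(PLUGIN_RELATIVE_PATH(src/my_analysis.ml),
              [],
              [AC_MSG_ERROR([src/my_analysis.ml is missing])])
\end{configurecode}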

If the external plug-in has some dependencies, as described in
Sections~\ref{conf:dep-lib} and~\ref{conf:dep-plug},
the \texttt{configure} script must check that all dependencies
are met. This is done with the following macro:\scodeidxdef{configure.in}{check\_plugin\_dependencies}
\begin{configurecode}
check_plugin_dependencies
\end{configurecode}
An external plug-in can
have dependencies upon previously installed plug-ins.
However, two separately installed plug-ins cannot be
mutually dependent on each other. Nevertheless, they can be compiled together
with the main \framac sources using the \texttt{--enable-external} option
of \texttt{configure} (see Section~\ref{dynamic-make:comp-same-time}
for more details).

Finally, the configuration must end with the following command:
\begin{configurecode}
write_plugin_config(files)
\end{configurecode}
where \texttt{files} are the files that must be processed by \texttt{configure}
(as in the \texttt{AC\_CONFIG\_FILES} macro). \texttt{PLUGIN\_RELATIVE\_PATH} is
not needed here.
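For instance, a plug-in whose only file to generate is its \texttt{Makefile}
(as in the \texttt{plugin\_file} definition shown above) would typically end its
\texttt{configure.ac} with the following call.
\begin{configurecode}
write_plugin_config(Makefile)
\end{configurecode}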

\begin{important}
For technical reasons, the macros
\texttt{configure\_library}, \texttt{configure\_tool}, 
\texttt{check\_plugin\_dependencies}, and \texttt{write\_plugin\_config}
must not be used inside a conditional part of the \texttt{configure}
script. More precisely,
they use the diversion mechanism of \texttt{autoconf} in order to
ensure that the tests are performed after all dependency information
has been gathered from all existing plug-ins. Diversion is a very primitive
text-to-text transformation. Using these macros within a
conditional (or anything else that alters the control flow of the script) is
likely to result in putting some unrelated part of the script in the same
branch of the conditional.
\end{important}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\section{\framac Makefile}\label{adv:make}\codeidxdef{Makefile}

\begin{target}not for standard plug-in developers.\end{target}

\begin{prereq}
  Knowledge of \make.
\end{prereq}

In this section, we detail the \texttt{Makefile} dedicated to the compilation
of \framac. This file is split into several sections, which are described in
Section~\ref{make:sections}. By default, executing \texttt{make} only displays
an overview of the commands. For example, here is the output of the compilation
of the source file \texttt{src/kernel\_services/plugin\_entry\_points/db.cmo}.
\begin{shell}
\$ make src/kernel_services/plugin_entry_points/db.cmo
Ocamlc      src/kernel_services/plugin_entry_points/db.cmo
\end{shell}
If you want to see the exact command lines, you have to set the variable
\texttt{VERBOSEMAKE}\codeidxdef{VERBOSEMAKE} to \texttt{yes}, as shown below.
\begin{shell}
\$ make VERBOSEMAKE=yes src/kernel_services/plugin_entry_points/db.cmo
\end{shell}


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\section{Plug-in Specific Makefile}\label{adv:dynamic-make}
\codeidxdef{Makefile.dynamic}

\begin{prereq}
  Knowledge of \make.
\end{prereq}

\subsection{Using \texttt{Makefile.dynamic}}
\label{dynamic-make:dynamic}

In this section, we detail how to write a Makefile for a given plug-in. Even
though it is still possible to write such a Makefile from scratch, \framac
provides a generic Makefile, called
\texttt{Makefile.dynamic}\codeidx{Makefile.dynamic},
which helps the plug-in developer in this task. This file is installed in the
\framac share directory. So, to write your plug-in specific Makefile, you
have to:
\begin{enumerate}
\item set some variables for customizing your plug-in;
\item include \texttt{Makefile.dynamic}.
\end{enumerate}

\begin{example}
A minimal \texttt{Makefile} is shown below. It is the Makefile of the plug-in
\texttt{Hello World} presented in the tutorial (see
Section~\ref{tut2:hello}). Each variable set in this example has to be set
by any plug-in.
\codeidx{Makefile.dynamic}
\codeidx{FRAMAC\_SHARE}
\codeidx{FRAMAC\_LIBDIR}
\codeidx{PLUGIN\_CMO}
\codeidx{PLUGIN\_NAME}
\makefileinput{./tutorial/hello/generated/makefile_multiple/Makefile}
\texttt{FRAMAC\_SHARE} must be set to the \framac share directory, while
\texttt{FRAMAC\_LIBDIR} must be set to the \framac lib directory.
\texttt{PLUGIN\_NAME} is the capitalized name of your plug-in, while
\texttt{PLUGIN\_CMO} is the list of \texttt{.cmo} files generated from your \caml
sources.
Note that, by default, all warnings are disabled during the compilation of your
plug-in; it is recommended to define the environment variable
\texttt{DEVELOPMENT=yes} to enable the standard set of compilation warnings.
\end{example}
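For readers who do not have the tutorial files at hand, the following hedged
sketch shows the general shape of such a minimal Makefile (the plug-in name and
module name are hypothetical; the paths are queried from an installed \framac).
\begin{makefilecode}
# hedged sketch of a minimal plug-in Makefile
FRAMAC_SHARE  := $(shell frama-c -print-share-path)
FRAMAC_LIBDIR := $(shell frama-c -print-libpath)
PLUGIN_NAME   := Hello
PLUGIN_CMO    := hello_world
include $(FRAMAC_SHARE)/Makefile.dynamic
\end{makefilecode}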

\begin{important}
To run your specific Makefile, you must either have properly installed \framac
beforehand, or compile your plug-in together with \framac, as described in
Section~\ref{dynamic-make:comp-same-time}.
\end{important}

You may need to run \texttt{make depend} before running
\texttt{make}.

Which variables can be set and how they are used is explained in
Section~\ref{make:plugin}. Furthermore, Section~\ref{make:dynamic} explains
the specific features of \texttt{Makefile.dynamic}.

%% \subsection{Calling a Plug-in Specific Makefile from the \framac Makefile}
%% \label{dynamic-make:integrating}
%% [JS 2012/05/09] nobody does this from a while.

%% \begin{target}
%%   kernel-integrated plug-in developers using the \svn repository of \framac.
%% \end{target}

%% Even if you are writing a kernel-integrated plug-in, it is useful to have your
%% plug-in specific Makefile. For instance, it allows you to easily release your
%% plug-in independently of \framac. However, if your version of \framac is
%% changing frequently, it is useful to compile together \framac and your plug-in
%% without installing \framac each time. To reach this goal, you have to mix the
%% integration in \texttt{Makefile} described in Section~\ref{adv:make} and the
%% solution presented in this section: in the section ``Plug-ins'' of
%% \texttt{Makefile} you just have to set few variables before including your
%% plug-in specific Makefile

%% \begin{example} For compiling the plug-in
%%   \texttt{Ltl\_to\_acsl}\index{Ltl\_to\_acsl}, the following lines are added
%%   into \texttt{Makefile}.
%% \codeidx{PLUGIN\_ENABLE}
%% \codeidx{PLUGIN\_DIR}
%% \codeidx{PLUGIN\_DYNAMIC}
%% \codeidx{DISTRIB\_FILES}
%% \codeidx{@ENABLE\_$plugin$}
%% \begin{makefilecode}
%% PLUGIN_ENABLE	:=@ENABLE_LTL_TO_ACSL@
%% PLUGIN_DIR	:=src/ltl_to_acsl
%% DISTRIB_FILES += $(PLUGIN_DIR)/Makefile
%% include $(PLUGIN_DIR)/Makefile
%% \end{makefilecode}
%% \end{example}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\subsection{Compiling \framac and external plug-ins at the same time}
\label{dynamic-make:comp-same-time}
\begin{target}
  plug-in developers using the git repository of \framac.
\end{target}
It is also possible to have a completely independent plug-in
recompiled and tested together with \framac's kernel. For that,
\framac must be aware of the existence of the plug-in. This can be
done in two ways:
\begin{itemize}
\item all sub-directories of the \verb+src/plugins+ directory in \framac sources
  which are not known to \framac's kernel are assumed to be external
  plug-ins;
\item one can use the \verb+--enable-external+ option of \texttt{configure},
  which takes as argument the path to the plug-in.
\end{itemize}

In the first case, the plug-in behaves as any other built-in plug-in:
running \verb+autoconf+ in \framac's main directory will take care of it,
and it can be enabled or disabled in the same way as the
others. If the plug-in has its own \verb+configure.in+ or
\verb+configure.ac+ file, the configuration instructions it contains
(in particular additional dependencies) will be read as well.

In the second case, the plug-in is added to the list of external
plug-ins at configure time. If the plug-in has its own \texttt{configure}
script, it is run as well.

Provided it properly uses the variables set by \texttt{Makefile.dynamic},
the plug-in's \texttt{Makefile} does not require
specific adaptations depending on whether it is compiled together with
the kernel or with respect to an already-existing \framac installation.
It is however possible to check the compilation mode with the
\codeidx{FRAMAC\_INTERNAL}\texttt{FRAMAC\_INTERNAL} variable, which is set to
\texttt{yes} when compiling together with \framac kernel and to \texttt{no}
otherwise.

\codeidx{clean-install}
\codeidx{install}
In addition, if a plug-in wishes to install custom files to
\texttt{FRAMAC\_LIBDIR} through the \texttt{install::} target,
this target must depend on \texttt{clean-install}. 
Indeed, \framac's main \texttt{Makefile} removes
all existing files in this directory before performing a fresh installation,
in order to avoid potential interference with an obsolete (and usually
incompatible) module from a previous installation. Adding the dependency thus
ensures that the removal will take place before any new file has been installed
in the directory.

\begin{example} If a plug-in wants to install \texttt{external/my\_lib.cm*}
in addition to the normal plug-in files, it should use the following code:
\begin{makefilecode}
install:: clean-install
        $(PRINT_INSTALL) "My beautiful library"
        $(MKDIR) $(FRAMAC_LIBDIR)
        $(CP) external/my_lib.cm* $(FRAMAC_LIBDIR)
\end{makefilecode}
\end{example} %$ %Fix emacs highlighting


\section{Testing}\label{adv:ptests}\index{Test|bfit}

In this section, we present \ptests, a tool provided by \framac in order to
perform non-regression and unit tests.

\ptests\index{Ptests|bfit} runs the \framac toplevel on each specified test
(usually a \C file). Specific directives can be used for each
test. Each result of the execution is compared to the previously saved result
(called the \emph{oracle}\index{Oracle|bfit}). A test is successful if and only
if there is no difference. Actually, the number of results is twice the
number of tests, because standard and error outputs are compared separately.

Section~\ref{ptests:use} first shows how to use \ptests. Then,
Section~\ref{ptests:configuration} introduces the predefined directives
that can be used to configure tests, while Section~\ref{ptests:alternative}
explains how to set up various testing goals for the same test base. Finally,
Section~\ref{ptests:options} details \ptests' options, while
Section~\ref{ptests:directives} describes \ptests' directives.

\subsection{Using \ptests}\label{ptests:use}

If you're using a \texttt{Makefile} written following the principles given in
section~\ref{adv:dynamic-make}, the simplest way of using \ptests is through 
\texttt{make tests} which is
roughly equivalent to
\begin{shell}
\$ time ./bin/ptests.opt
\end{shell}
or
\begin{shell}
\$ time ptests.opt
\end{shell}
depending on whether you're inside \framac's sources or compiling a plug-in 
against an already installed \framac distribution.

The target \texttt{\$(PLUGIN\_NAME)\_TESTS} runs the tests of the plug-in.
One can add new tests as dependencies of this target.
The default tests are run by the target \texttt{\$(PLUGIN\_NAME)\_DEFAULT\_TESTS}.
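For instance (hedged sketch, assuming \texttt{PLUGIN\_NAME} is \texttt{Hello}
and a hand-written target \texttt{Hello\_extra\_tests}), an additional test
target can be hooked in the plug-in Makefile as follows.
\begin{makefilecode}
# hedged sketch: run a hand-written target in addition to the default tests
Hello_TESTS: Hello_extra_tests

Hello_extra_tests:
	echo "running additional Hello tests"
\end{makefilecode}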

\texttt{ptests.opt} runs the tests contained in the sub-directories 
of directory \texttt{tests}\codeidx{tests} that are specified in
the \ptests configuration file. This configuration file,
\texttt{tests/ptests\_config}, is automatically generated by \framac's
\texttt{Makefile} from the options you set in your plug-in's \texttt{Makefile}.
\ptests also accepts specific \emph{test suites}
as arguments. A test suite\index{Test!Suite|bfit} is either the name of a
sub-directory of directory \texttt{tests} or a filename (with its path relative
to the current directory).

\begin{example}
  If you want to test plug-in
  \texttt{sparecode}\index{Plug-in!Sparecode|see{Sparecode}}\index{Sparecode}
  and specific test \texttt{tests/pdg/variadic.c}, just run
\begin{shell}
\$ ./bin/ptests.opt sparecode tests/pdg/variadic.c
\end{shell}
which should display (if there are 7 tests in directory
\texttt{tests/sparecode})
\begin{shell}
% Dispatch finished, waiting for workers to complete
% Comparisons finished, waiting for diffs to complete
% Diffs finished. Summary:
Run = 8
Ok  = 16 of 16
\end{shell}
\end{example}

\ptests accepts different options which are used to customize test
sequences. These options are detailed in Section~\ref{ptests:options}.

\begin{example}
If the code of plug-in \texttt{plug-in} has changed, a typical sequence of tests
is the following one.
\begin{shell}
\$ ./bin/ptests.opt plug-in
\$ ./bin/ptests.opt -update plug-in
\$ make tests
\end{shell}
We first run the test suite corresponding to \texttt{plug-in} in order to
display which tests have been affected by the changes. After checking the
displayed differences, we validate the changes by updating the
oracles\index{Oracle}. Finally, we run all the test suites\index{Test!Suite} in
order to ensure that the changes do not break anything else in \framac.
\end{example}
\begin{example}
For adding a new test, the typical sequence of commands is the following.
\begin{shell}
\$ ./bin/ptests.opt -show tests/plug-in/new_test.c
\$ ./bin/ptests.opt -update tests/plug-in/new_test.c
\$ make tests
\end{shell}
We first ask \ptests to print the output of the test on the command line, check
that it corresponds to what we expect, and then take it as the initial oracle.
If some changes have been made to the code in order to let \texttt{new\_test.c}
pass, we must of course launch the whole test suite and check that all
existing tests are alright.
\end{example}

\begin{important}
If you're creating a whole new test suite \texttt{suite}, 
don't forget to create the sub-directories \texttt{suite/result} and
\texttt{suite/oracle}, where \ptests will store the current results and
the oracles for all the tests in \texttt{suite}.
\end{important}

\subsection{Configuration}\label{ptests:configuration}
\index{Test!Configuration|bfit}

In order to perform exactly the tests that you wish, directives can be set
in three different places. We first indicate these places, and then describe
the possible directives.

The places are:
\begin{itemize}
\item inside file \texttt{tests/test\_config};\codeidx{test\_config}
\item inside file \texttt{tests/$subdir$/test\_config} (for each sub-directory
  $subdir$ of \texttt{tests}); or
\item inside each test file, in a special comment of the
  form\index{Test!Header}
\begin{listing-nonumber}
/* run.config
   ... directives ...
*/
\end{listing-nonumber}
\end{itemize}

In each of the above cases, the configuration is given by a list of
directives\index{Test!Directive}. Each directive has to be on one line and to
have the form
\begin{listing-nonumber}
CONFIG_OPTION:value
\end{listing-nonumber}
There is exactly one directive per line. The different directives (\emph{i.e.}
the possibilities for \texttt{CONFIG\_OPTION}) are detailed in
Section~\ref{ptests:directives}.

\begin{important}
Note that some specific configurations require dynamic linking, which
is not available on all platforms for native code. {\tt ptests} takes
care of reverting to bytecode when it detects that the {\tt OPT},
{\tt EXECNOW}, or \texttt{EXEC} directives of a test require dynamic linking.
This currently occurs in the following cases:
\begin{itemize}
\item {\tt OPT} contains the option {\tt -load-script}
\item {\tt OPT} contains the option {\tt -load-module}
\item {\tt EXECNOW} and \texttt{EXEC} use {\tt make} to create a {\tt .cmxs}
\end{itemize}
\end{important}

\begin{important}
  \textbf{Concurrency issues:}
  tests using compiled modules ({\tt -load-script} or {\tt -load-module}) may
  lead to concurrency issues when the same module is used in different test
  files, or in different test cases within the same file. One way to avoid
  issues is to serialize tests via \texttt{EXECNOW} directives, e.g. by using
  \texttt{make} to compile a \texttt{.cmxs} from the \texttt{.ml} file, and
  then loading the \texttt{.cmxs} in the test cases, as in the example below.

  \begin{listing-nonumber}
    EXECNOW: make -s @PTEST_DIR@/@PTEST_NAME@.cmxs
    STDOPT: #"-load-module @PTEST_DIR@/@PTEST_NAME@.cmxs" ...
    STDOPT: #"-load-module @PTEST_DIR@/@PTEST_NAME@.cmxs" ...
  \end{listing-nonumber}

  In addition, if the same script {\tt tests/suite/script.ml}
  is shared by several test files, the {\tt EXECNOW} directive should be put
  into {\tt tests/suite/test\_config}.

\end{important}

\begin{example}
  Test \texttt{tests/sparecode/calls.c} declares the following directives.
\begin{listing-nonumber}
/* run.config
   OPT: -sparecode-analysis
   OPT: -slicing-level 2 -slice-return main -slice-print
*/
\end{listing-nonumber}
These directives state that we want to test sparecode and slicing analyses on
this file. Thus running the following instruction executes two test cases.
\begin{shell}
\$ ./bin/ptests.opt tests/sparecode/calls.c
% Dispatch finished, waiting for workers to complete
% Comparisons finished, waiting for diffs to complete
% Diffs finished. Summary:
Run = 2
Ok  = 4 of 4
\end{shell}
\end{example}

\subsection{Alternative Testing}\label{ptests:alternative}

You may want to set up different testing goals for the same test
base. Common cases include:
\begin{itemize}
\item checking the result of an analysis with or without an option;
\item checking a preliminary result of an analysis, in particular if the
      complete analysis is costly;
\item checking separately different results of an analysis.
\end{itemize}

This is possible with option \texttt{-config} of \ptests, which takes as
argument the name of a special test configuration, as in
\begin{shell}
\$ ./bin/ptests.opt -config <special_name> plug-in
\end{shell}

Then, the directives for this test can be found:
\begin{itemize}
\item inside file \texttt{tests/test\_config\_<special\_name>};
\item inside file \texttt{tests/$subdir$/test\_config\_<special\_name>} (for
  each sub-directory $subdir$ of \texttt{tests}); or
\item inside each test file, in a special comment of the
  form\index{Test!Header}
\begin{listing-nonumber}
/* run.config_<special_name>
   ... directives ...
*/
\end{listing-nonumber}
Multiple configurations may share the same set of directives:
\begin{listing-nonumber}
/* run.config, run.config_<special_name> [, ...]
   ... common directives ...
*/
\end{listing-nonumber}
The following wildcard is also supported, and accepts any
configuration: \lstinline+/* run.config* +.
\end{itemize}

All operations for this test configuration should be given the option
\texttt{-config}, as in
\begin{shell}
\$ ./bin/ptests.opt -update -config <special_name> plug-in
\end{shell}

\begin{important}
In addition, option \texttt{-config <special\_name>} requires subdirectories
\texttt{result\_<special\_name>} and \texttt{oracle\_<special\_name>} to store
the results and oracles of the specific configuration.
\end{important}
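For instance, with a hypothetical configuration name \texttt{debug} and a test
suite \texttt{suite}, these directories can be created with:
\begin{shell}
\$ mkdir tests/suite/result_debug tests/suite/oracle_debug
\end{shell}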

\subsection{Detailed options}\label{ptests:options}

Figure~\ref{fig:ptests-options} details the options of \ptests.
\begin{figure}[ht]
\begin{center}
\begin{tabular}{|c|c|p{6cm}|c|}
\hline
\textbf{kind} & \textbf{Name} & \textbf{Specification} & \textbf{Default}\\
\hline
\hline \multirow{5}{16mm}{\centering{Toplevel}}
& \texttt{-add-options} & 
  Additional options appended to the normal toplevel command-line & \\
& \texttt{-add-options-pre} &
   Additional options prepended to the normal toplevel command line & \\
& \texttt{-byte} & Use bytecode toplevel & no \\
& \texttt{-opt} & Use native toplevel & yes \\
& \texttt{-gui} & Use GUI instead of console-based toplevel & no \\
\hline \multirow{4}{16mm}{\centering{Behavior}}
& \texttt{-run} & Delete current results; run tests and examine results & yes
\\
& \texttt{-examine} & Only examine current results; do not run tests & no \\
& \texttt{-show} & Run tests and show results, but do not examine
                   them; implies \texttt{-byte} &
no \\
& \texttt{-update} & Take current results as new oracles\index{Oracle}; do not
run tests & no \\
\hline \multirow{7}{16mm}{\centering{Misc.}}
& \texttt{-exclude suite} & Do not consider the given \texttt{suite} &
\\
& \texttt{-diff cmd} & Use \texttt{cmd} to show differences between
results and oracles when examining results
& \texttt{diff -u} \\
& \texttt{-cmp cmd} & Use \texttt{cmd} to compare results against
oracles when examining results
& \texttt{cmp -s} \\
& \texttt{-use-diff-as-cmp} & Use the same command for diff and cmp &
no \\
& \texttt{-j n} & Set level of parallelism to \texttt{n} & 4 \\
& \texttt{-v} & Increase verbosity (up to twice) & 0 \\
& \texttt{-help} & Display help & no \\
\hline
\end{tabular}
\end{center}
\caption{\ptests options.}\label{fig:ptests-options}
\end{figure}

The commands provided through the \texttt{-diff} and \texttt{-cmp}
options play two related but distinct roles. The \texttt{cmp} command is always
used for each test (in fact, it is used twice: once for the standard output
and once for the error output). Only its exit code is taken into
account by \ptests, and its output is discarded.
An exit code of \texttt{1} means that the two files
differ. The two files will then be analyzed by \texttt{diff},
whose role is to show the differences between the files. An exit code
of \texttt{0} means that the two files are identical; thus, they won't
be processed by \texttt{diff}. An exit code of \texttt{2} indicates an
error during the comparison (for instance because the corresponding
oracle does not exist). Any other exit code results in a fatal
error. It is possible to use the same command for both \texttt{cmp}
and \texttt{diff} with the \texttt{-use-diff-as-cmp} option, which
uses the \texttt{diff} command as the \texttt{cmp} command.
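For instance, the following (purely illustrative) invocation re-examines
previously computed results and displays the differences with an external tool,
here \texttt{meld}, chosen as an arbitrary example:
\begin{shell}
\$ ./bin/ptests.opt -examine -diff meld plug-in
\end{shell}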

The \texttt{-exclude} option can take as argument a whole suite or an
individual test. It can be used with any behavior.

The \texttt{-gui} option will launch Frama-C's GUI instead of the console-based
toplevel. It can be combined with \texttt{-byte} to launch the bytecode GUI. 
In this mode, the default level of parallelism is reduced to 1.

\subsection{Detailed directives}\label{ptests:directives}

Figure~\ref{fig:test-directives} shows all the directives that can be used in
the configuration header of a test (or a test suite).
\begin{figure}[ht]
\begin{center}
\begin{tabular}{|c|c|p{5cm}|l|}
\hline
\textbf{Kind} & \textbf{Name} & \textbf{Specification} & \textbf{default}\\
\hline
\hline \multirow{9}{23mm}{\centering{Command}}
& \texttt{CMD}\nscodeidxdef{Test!Directive}{CMD}
& Program to run
& \texttt{./bin/toplevel.opt}
\\
& \texttt{OPT}\nscodeidxdef{Test!Directive}{OPT}
& Options given to the program
& \texttt{-val -out -input -deps}
\\
& \texttt{STDOPT}\nscodeidxdef{Test!Directive}{STDOPT}
& Add and remove options from the default set
& \textit{None}
\\
& \texttt{LOG}\nscodeidxdef{Test!Directive}{LOG}
& Add an additional file to compare against an oracle
& \textit{None}
\\ 
& \texttt{EXECNOW}\nscodeidxdef{Test!Directive}{EXECNOW}
& Run a command before the following commands. When specified in a configuration
file, run it only once.
& \textit{None}
\\
& \texttt{EXEC}\nscodeidxdef{Test!Directive}{EXEC}
& Similar to \texttt{EXECNOW}, but the command is run once per test file.
& \textit{None}
\\
& \texttt{MACRO}\nscodeidxdef{Test!Directive}{MACRO}
& Define a new macro
& \textit{None}
\\
& \texttt{FILTER}\nscodeidxdef{Test!Directive}{FILTER}
& Command used to filter results
& \textit{None}
\\
& \texttt{MODULE}\nscodeidxdef{Test!Directive}{MODULE}
& Register a dynamic module to be built and to be loaded with each subsequent
test
& \textit{None}
\\
\hline \multirow{2}{23mm}{\centering{Test suite}}
& \texttt{DONTRUN}\nscodeidxdef{Test!Directive}{DONTRUN}
& Do not execute this test
& \textit{None}
\\
& \texttt{FILEREG}\nscodeidxdef{Test!Directive}{FILEREG}
& Selects the files to test
& \texttt{.*\bss.\bss(c|i\bss)}
\\
\hline \multirow{2}{23mm}{\centering{Miscellaneous}}
& \texttt{COMMENT}\nscodeidxdef{Test!Directive}{COMMENT}
& Comment in the configuration
& \textit{None}
\\
& \texttt{GCC}\nscodeidxdef{Test!Directive}{GCC}
& Unused (compatibility only)
& \textit{None}
\\
\hline
\end{tabular}
\end{center}
\caption{Directives in configuration headers of test
  files.}\label{fig:test-directives}
\end{figure}
Any directive can identify a file using a relative path.
The default directory considered for \texttt{.} is always the parent
directory of directory \texttt{tests}\codeidx{tests}. The
\texttt{DONTRUN} directive does not need to have any content, but it
is useful to provide an explanation of why the test should not be run
({\it e.g.} a test of a feature that is currently under development and not
fully operational yet). If a test file is explicitly given on the command
line of \ptests, it is always executed, regardless of the presence of
a \texttt{DONTRUN} directive.

As said in Section~\ref{ptests:configuration}, these directives can be
found in different places:
\begin{enumerate}
\item default value of the directive (as specified in
  Fig.~\ref{fig:test-directives});
\item inside file \texttt{tests/test\_config};\codeidx{test\_config}
\item inside file \texttt{tests/$subdir$/test\_config} (for each sub-directory
  $subdir$ of \texttt{tests}); or
\item inside each test file
\end{enumerate}
As presented in Section~\ref{ptests:alternative}, alternative directives for a
test configuration \texttt{<special\_name>} can be found in slightly different places:
\begin{itemize}
\item default value of the directive (as specified in
  Fig.~\ref{fig:test-directives});
\item inside file \texttt{tests/test\_config\_<special\_name>};
\item inside file \texttt{tests/$subdir$/test\_config\_<special\_name>} (for each sub-directory
  $subdir$ of \texttt{tests}); or
\item inside each test file.
\end{itemize}
For a given test \texttt{tests/suite/test.c}, each existing file in
the sequence above is read in order and defines a configuration level
(the default configuration level always exists).

\begin{itemize}
\item
  \texttt{CMD} allows changing the command that is used for the
  following \texttt{OPT} directives (until a new \texttt{CMD}
  directive is found). No new test case is generated
  if there is no further \texttt{OPT} directive. At a given
  configuration level, the default value for directive