@@ -63,37 +63,38 @@ There are several functional properties one may want to verify on this system, f
* guarantee that the system will never output an advisory that may result in a collision
* guarantee that the system will not output a Strong advisory where a Weak variant would be enough
Authors of [Katz2017]_ propose ten properties to verify. We will reproduce the
first two here, and try to check whether a given neural network respects them.
**property**
:math:`\phi_1`
**definition**
If the intruder is distant and is significantly slower than the ownship, the
score of a COC advisory will always be below a certain fixed threshold.
**input constraints**
* :math:`\rho \geq 55947.691`
* :math:`v_{own} \geq 1145`
* :math:`v_{int} \leq 60`
**desired output property**
* :math:`COC \leq 1500`
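Reading the constraints together, :math:`\phi_1` amounts to a single
implication over the network inputs and outputs (the remaining inputs are left
unconstrained):

.. math::

   \rho \geq 55947.691 \;\land\; v_{own} \geq 1145 \;\land\; v_{int} \leq 60
   \;\Longrightarrow\; COC \leq 1500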
**property**
:math:`\phi_2`
**definition**
If the intruder is distant and is significantly slower than the ownship, the
score of a COC advisory will never be maximal.
**input constraints**
* :math:`\rho \geq 55947.691`
* :math:`v_{own} \geq 1145`
* :math:`v_{int} \leq 60`
**desired output property**
...
@@ -116,15 +117,13 @@ earlier. We will call our theory `ACAS_XU_P1`.
theory ACAS_XU_P1
end
We will need to write down some numerical values. As of now, CAISAR supports
writing values using floating-point arithmetic. Why3 defines a float type and
the relevant arithmetic operations according to the IEEE floating-point
standard in a theory, astutely called ``ieee_float``. Specifically, we will
import the ``Float64`` subtheory, which defines everything we need for 64-bit
precision floating-point numbers. We thus import it in our theory using the
``use`` keyword.
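As a quick illustration of what this import gives us, here is how one of the
numerical bounds of :math:`\phi_1` could be written as a 64-bit float constant
once ``Float64`` is in scope. The theory and constant names below are purely
illustrative and not part of the tutorial file; the ``(1145.0:t)`` cast is the
usual Why3 way of giving a real literal the float type ``t``::

    theory FloatLiteralExample
      use ieee_float.Float64

      (* Illustrative constant: the ownship speed bound appearing in phi_1.
         The cast gives the real literal the 64-bit float type t. *)
      constant v_own_min : t = (1145.0:t)
    end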
Our file looks like this so far:
...
@@ -134,13 +133,15 @@ Our file looks like this so far:
use ieee_float.Float64
end
We would like to verify our property given a certain neural network. To do
this, we can write the filename in our theory. Internally, when provided with a
neural network file, CAISAR will perform some basic sanity checks (mainly,
checks that the network is in a supported format). If those checks pass, then
CAISAR will internally build a theory, named after the original neural network.
This theory will contain two subtheories that provide logical symbols we can
use in our theory: ``AsTuple`` and ``AsArray``. We will only consider the
``AsTuple`` subtheory for this tutorial. CAISAR currently supports neural
networks in ONNX and NNet formats.
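To make this mechanism concrete, here is a sketch of what our theory could look
like once it refers to a network, assuming a file named ``ACASXU_1_1.onnx``
(the filename, and therefore the generated theory name, is purely
hypothetical)::

    theory ACAS_XU_P1
      use ieee_float.Float64

      (* Hypothetical import: CAISAR builds a theory named after the network
         file (here ACASXU_1_1.onnx), and its AsTuple subtheory provides the
         logical symbols describing the network's inputs and outputs. *)
      use ACASXU_1_1.AsTuple
    end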
The theory built by CAISAR is equivalent to the following WhyML file: