diff --git a/doc/acas_xu.rst b/doc/acas_xu.rst
index cb836f48af6787f6e94ff6593d3490a5a4013155..606827e4734b10ccc4985808db0204f641136207 100644
--- a/doc/acas_xu.rst
+++ b/doc/acas_xu.rst
@@ -1,7 +1,7 @@
 .. _acas_xu:
 
 Functional properties of ACAS-Xu
-********************************
+********************************
 
 ACAS-Xu stands for Aircraft Collision Avoidance System. Introduced for instance
 in [Manfredi2016]_, it is a specification of a program which aim to output
@@ -47,7 +47,7 @@ only while maintaining the same number of outputs. We will consider five-inputs
 networks in the remaining of this example.
 
 Properties
-----------
+==========
 
 There are several functional properties one may want to verify on this system,
 for instance:
@@ -99,7 +99,7 @@ whether a given neural network respects them.
    * :math:`COC` is not the minimal score.
 
 Modelling the problem using WhyML
----------------------------------
+=================================
 
 The first step for verifying anything with CAISAR is to write a specification
 file that describe the problem to verify as a so-called *theory*. A theory can
@@ -195,10 +195,33 @@ We can write this as a predicate for clarity:
       /\ (-0.5:t) .<= i[4] .<= (-0.4500000000000000111022302462515654042363166809082031250000000000:t)
   end
 
-Note that there is an additional normalization step on the inputs, according
-to the original authors. For this specific benchmark, we adapt the values
-they provide in their `repository
-<https://github.com/NeuralNetworkVerification/Marabou/tree/master/resources/properties>`_, hence the diverging values from the specification. I would 
+.. warning::
+  You will notice that the input values in the example above differ from
+  those stated in the specification. Recall that
+  :math:`1500 \leq \rho \leq 1800` and :math:`-0.06 \leq \theta \leq 0.06`.
+  Training networks on inputs spanning such wide ranges leads to numerical
+  instability, so the inputs are normalized beforehand.
+  For this specific benchmark, we reuse the normalized values the original
+  authors provide in their
+  `repository <https://github.com/NeuralNetworkVerification/Marabou/tree/master/resources/properties>`_.
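+
+As an illustration of this normalization step, each raw input is typically
+rescaled with a per-input mean and range before being fed to the network. The
+sketch below only illustrates the idea: the ``MEANS`` and ``RANGES`` constants
+are placeholders, not the benchmark's actual values, which are listed in the
+repository linked above.
+
+.. code-block:: python
+
+   # Illustrative sketch only: rescale raw ACAS-Xu inputs to the ranges the
+   # networks were trained on. The constants below are placeholders; use the
+   # per-input means and ranges published by the original authors.
+   MEANS = [0.0, 0.0, 0.0, 0.0, 0.0]    # placeholder per-input means
+   RANGES = [1.0, 1.0, 1.0, 1.0, 1.0]   # placeholder per-input ranges
+
+   def normalize(raw):
+       """Map raw inputs (rho, theta, psi, v_own, v_int) to normalized ones."""
+       return [(x - m) / r for x, m, r in zip(raw, MEANS, RANGES)]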
 
 We must then define the result of the application of ``nn_1_1`` on the inputs.
 The built-in function ``@@`` serves this purpose. Its type, ``nn -> vector 'a -> vector 'a``, describes what it does: given a neural network ``nn`` and an input vector ``x``, return the vector that is the result of the application of ``nn`` on ``x``.
@@ -250,7 +256,7 @@ The corresponding neural network in ONNX format is available under the
 
 
 Verifying the property with CAISAR
-----------------------------------
+==================================
 
 Once formalized, the specified property can be assessed by using CAISAR. We will
 use the *open-source* provers CAISAR supports for verifying properties of neural
@@ -288,7 +294,7 @@ either ``Valid``, ``Invalid``, ``Unknown`` or ``Timeout``. CAISAR may output
 (typically, a prover internal failure).
 
 Using more advanced WhyML constructs
-------------------------------------
+====================================
 
 Let us model the ACAS-Xu property :math:`\phi_3`, and verify it for the neural
 networks :math:`N_{1,1}` and :math:`N_{2,7}` [Katz2017]_.
@@ -377,6 +383,9 @@ sub-results to provide the final result corresponding to the original goal
 formula.
 
 
+References
+==========
+
 .. [Manfredi2016] G. Manfredi and Y. Jestin, *An introduction to ACAS Xu and the
    challenges ahead*, 2016 IEEE/AIAA 35th Digital Avionics Systems Conference
    (DASC), 2016, pp. 1-9, doi: 10.1109/DASC.2016.7778055