To solve this problem, we can use the following kinematic equation:
v_f^2 = v_i^2 + 2a d
where v_f is the final velocity (which is 0 m/s since the car stops), v_i is the initial velocity, a is the acceleration, and d is the distance traveled.
First, we need to convert the initial velocity from km/h to m/s. Since 1 km = 1000 m and 1 h = 3600 s, 1 km/h = 1000/3600 m/s ≈ 0.2778 m/s, so the initial velocity (v_i) is:
v_i = 100 \times \frac{1000}{3600} \approx 27.78 \, \text{m/s}
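As a quick numerical sanity check of the conversion (a minimal Python snippet, not part of the original solution):

```python
# 1 km = 1000 m and 1 h = 3600 s, so the conversion factor is 1000/3600.
v_kmh = 100
v_ms = v_kmh * 1000 / 3600
print(round(v_ms, 2))  # 27.78
```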
Next, we account for the driver's reaction time. The car travels at constant speed during the 0.7 s between observing the signal and applying the brakes, so this distance is not negligible:
d_r = v_i t = 27.78 \times 0.7 \approx 19.44 \, \text{m}
Once the brakes are applied, we can rearrange the kinematic equation to solve for the braking distance (d_b):
d_b = \frac{{v_f^2 - v_i^2}}{{2a}}
Plugging in the given values, with a = -1.5 \, \text{m/s}^2 since the car decelerates:
d_b = \frac{{0 - (27.78)^2}}{{2 \times (-1.5)}} = \frac{{771.7}}{{3}} \approx 257.2 \, \text{m}
The distance the car travels from the moment the signal is observed until it stops is the sum of the reaction distance and the braking distance:
d = d_r + d_b \approx 19.44 + 257.2 \approx 276.6 \, \text{m}
Answer: \boxed{\approx 276.6 \, \text{m}}
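The full calculation above can be sketched in Python as a sanity check (the helper function name and signature are illustrative assumptions, not part of the problem):

```python
# Stopping distance = reaction distance + braking distance,
# with braking distance from v_f^2 = v_i^2 + 2 a d and v_f = 0.

def stopping_distance(v_kmh: float, reaction_s: float, decel: float):
    """Return (reaction, braking, total) distances in metres."""
    v = v_kmh * 1000 / 3600          # convert km/h to m/s
    d_reaction = v * reaction_s      # constant speed before braking
    d_braking = v ** 2 / (2 * decel) # magnitude of braking distance
    return d_reaction, d_braking, d_reaction + d_braking

d_r, d_b, d_total = stopping_distance(100, 0.7, 1.5)
print(f"reaction {d_r:.1f} m, braking {d_b:.1f} m, total {d_total:.1f} m")
```

Carrying full precision (100/3.6 m/s rather than the rounded 27.78) gives a total of about 276.6 m, matching the hand calculation to within rounding.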