We define time-jitter, δt, as the difference between when a periodic task is supposed to run and when it actually runs. The goal of a real-time DAS is to start the ADC at a periodic rate, Δt. Let tn be the nth time the ADC is started. In particular, the goal is to make tn - tn-1 = Δt. The jitter is defined as the smallest constant, δt, such that Δt - δt ≤ tn - tn-1 ≤ Δt + δt for all n.
Assume the input to the ADC can be described as V(t) = A + Bsin(2πft), where A, B, f are constants.
a) Derive an estimate of the maximum voltage error, δV, caused by time-jitter. That is, solve for the largest possible value of δV as a function of δt, A, B, and f.
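The slope-based estimate for part (a) can be sanity-checked numerically. A minimal sketch, assuming illustrative values A = 1.65 V, B = 1 V, f = 1 kHz, and a jitter of 1 µs (none of these numbers are given in the problem; the bound δV ≈ 2πfB·δt follows from the maximum slope of V(t) and is one common way to work the estimate):

```python
import math

# Hypothetical numbers for illustration (not given in the problem):
A, B, f = 1.65, 1.0, 1000.0   # offset (V), amplitude (V), frequency (Hz)
dt = 1e-6                     # assumed time-jitter of 1 us

def V(t):
    """ADC input V(t) = A + B*sin(2*pi*f*t)."""
    return A + B * math.sin(2 * math.pi * f * t)

# Slope-based bound: |dV/dt| = 2*pi*f*B*|cos(2*pi*f*t)| <= 2*pi*f*B,
# so sampling dt late (or early) perturbs the reading by at most
# about dV_est = 2*pi*f*B*dt.
dV_est = 2 * math.pi * f * B * dt

# Brute-force check: largest |V(t + dt) - V(t)| over one period.
n = 200000
period = 1.0 / f
dV_num = max(abs(V(k * period / n + dt) - V(k * period / n))
             for k in range(n))

print(dV_est, dV_num)  # the numeric worst case sits just under the bound
```

For small f·δt the two numbers agree to well under 1%, which supports using the first-order (slope) approximation rather than the exact difference 2B·sin(πf·δt).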
b) Consider the situation where this time-jitter is unacceptably large. Which modification to the system will reduce the error the most? Justify your selection.
A) Run the ADC in continuous mode
B) Convert from spinlock semaphores to blocking semaphores
C) Change from round robin to priority thread scheduling
D) Reduce the amount of time the system runs with interrupts disabled
E) Increase the size of the DataFifo