We consider a multiagent linear time-invariant system whose dynamical model may change from one disturbance event to another. The system is monitored by a control center that collects output measurements from the agents after every event and estimates the eigenvalues of the model to track any adverse impact of the disturbance on its spectral characteristics. Sharing measurements in this way, however, is susceptible to privacy breaches: if an intruder gains access to these measurements, she may estimate the values of sensitive model parameters and launch more severe attacks. To prevent this, we employ a differential privacy framework in which agents add synthetic noise to their measurements before sending them to the control center. The noise is designed carefully by characterizing the sensitivity of the system so that it prevents the intruder from inferring any incremental change in the sensitive parameters, thereby protecting their privacy. Our numerical results show that the proposed design causes only marginal degradation in eigenvalue estimation compared to the error incurred by the intruder in identifying the sensitive parameters.
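The mechanism described above can be sketched with a standard Gaussian mechanism: each agent perturbs its output trajectory with noise whose scale is calibrated to a sensitivity bound before releasing it. This is only an illustrative sketch, not the paper's actual design; the function name, the scalar `sensitivity` bound, and the choice of the Gaussian mechanism with parameters `epsilon` and `delta` are assumptions for illustration.

```python
import numpy as np

def privatize_measurements(y, sensitivity, epsilon, delta, rng=None):
    """Illustrative sketch: add Gaussian noise calibrated to an assumed
    sensitivity bound so that releasing the noisy measurements satisfies
    (epsilon, delta)-differential privacy (standard Gaussian mechanism)."""
    rng = np.random.default_rng() if rng is None else rng
    # Classical Gaussian-mechanism scale:
    # sigma >= sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return y + rng.normal(0.0, sigma, size=np.shape(y))

# Example: an agent perturbs its measured output samples before
# sending them to the control center.
y = np.sin(np.linspace(0.0, 1.0, 50))  # raw output trajectory (synthetic)
y_private = privatize_measurements(y, sensitivity=0.1, epsilon=1.0, delta=1e-5)
```

In this sketch, a larger sensitivity bound or a smaller privacy budget `epsilon` yields larger noise, trading eigenvalue-estimation accuracy at the control center for stronger protection of the sensitive parameters.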