From:    "Junhee Lee" <>
Subject: microsecond event scheduling in an application
Date:    Tue, 8 Sep 2009 23:27:40 +0900
I am working on an event scheduler that handles events at microsecond resolution. The program is actually a network emulator built from simulation code, and I would like the emulator to reproduce the timing behavior of the simulation. That requires a high-resolution timer interrupt. However, getting a high-resolution interrupt by raising the tick frequency (the jiffies clock) would hurt overall system performance. Are there any comments on, or ways of, supporting microsecond event scheduling without that performance degradation?
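For illustration, here is a minimal sketch of the kind of microsecond-level event loop in question, using clock_nanosleep() with absolute deadlines; on kernels built with high-resolution timers (CONFIG_HIGH_RES_TIMERS) these sleeps are driven by hrtimers rather than the jiffies tick. The 50 us period and the per-event hook are placeholder assumptions, not part of the emulator described above:

/*
 * Sketch: schedule events on an absolute CLOCK_MONOTONIC timeline.
 * Compile with something like: gcc -O2 -o sched sched.c -lrt
 */
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

#define NSEC_PER_SEC 1000000000L

static void timespec_add_ns(struct timespec *t, long ns)
{
	t->tv_nsec += ns;
	while (t->tv_nsec >= NSEC_PER_SEC) {
		t->tv_nsec -= NSEC_PER_SEC;
		t->tv_sec++;
	}
}

int main(void)
{
	struct timespec next;
	long period_ns = 50 * 1000;	/* 50 us between events (example value) */
	int i;

	clock_gettime(CLOCK_MONOTONIC, &next);

	for (i = 0; i < 1000; i++) {
		timespec_add_ns(&next, period_ns);

		/* Sleep until the absolute deadline of the next event. */
		clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);

		/* Placeholder for the emulator's per-event work. */
		/* handle_event(i); */
	}

	return 0;
}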
Regards