Subject: Re: CFS review

On Wed, 1 Aug 2007, Linus Torvalds wrote:

> So I think it would be entirely appropriate to
> - do something that *approximates* microseconds.
> Using microseconds instead of nanoseconds would likely allow us to do
> 32-bit arithmetic in more areas, without any real overflow.

The basic problem is that one needs a number of bits (at least 16) for
normalization, which limits the time range one can work with. This means
that 32 bits leave only room for 1 millisecond resolution; the remainder
could maybe be saved and reused later.
So AFAICT using micro- or nanosecond resolution doesn't make much
computational difference.
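
A minimal sketch of the arithmetic in question, assuming a fixed-point
normalization factor scaled to 16 bits (the names and the helper are
hypothetical, not actual CFS code):

#include <stdint.h>

#define WEIGHT_SHIFT 16	/* ~16 bits reserved for the normalization factor */

static uint32_t scale_delta(uint32_t delta, uint32_t weight)
{
	/*
	 * delta * weight must fit in 32 bits, so delta itself can only
	 * use about 32 - WEIGHT_SHIFT = 16 bits:
	 *   - at 1 us resolution that is only ~65 ms of range,
	 *   - at 1 ms resolution it is ~65 s, which is workable.
	 */
	return (delta * weight) >> WEIGHT_SHIFT;
}

So whether the input unit is micro- or nanoseconds, the 32-bit product
forces the same coarse effective resolution once the normalization bits
are taken out.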

bye, Roman
