Implement a function getbits that returns the (right-adjusted) n bits beginning at position p of an integer. Assume bit position 0 is at the right end and that n and p are sensible positive values.
You are given n real numbers in an array. A number in the array is called a decimal dominant if it occurs more than n/10 times in the array. Give an O(n) time algorithm to determine if the given array has a decimal dominant.
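One way to meet the O(n) bound is a sketch using the Misra-Gries frequent-items idea with 9 counters: at most 9 distinct values can each occur more than n/10 times, and any such value must survive a one-pass candidate-elimination; a second pass verifies exact counts. (The function name has_decimal_dominant is chosen here for illustration.)

```c
#define K 9  /* at most 9 values can each occur more than n/10 times */

/* Returns 1 if some value occurs more than n/10 times, else 0.
   Pass 1 keeps at most K candidate/counter pairs (Misra-Gries);
   pass 2 counts each surviving candidate exactly. Both passes are
   O(n) since K is a constant, and extra space is O(1). */
int has_decimal_dominant(const double *a, int n)
{
    double cand[K];
    int cnt[K] = {0};
    int i, j;

    for (i = 0; i < n; i++) {
        int placed = 0;
        for (j = 0; j < K; j++)            /* already a candidate? */
            if (cnt[j] > 0 && cand[j] == a[i]) { cnt[j]++; placed = 1; break; }
        if (placed) continue;
        for (j = 0; j < K; j++)            /* free slot? */
            if (cnt[j] == 0) { cand[j] = a[i]; cnt[j] = 1; placed = 1; break; }
        if (!placed)                       /* all slots full: decrement all */
            for (j = 0; j < K; j++) cnt[j]--;
    }

    for (j = 0; j < K; j++) {
        if (cnt[j] > 0) {
            int exact = 0;
            for (i = 0; i < n; i++)
                if (a[i] == cand[j]) exact++;
            if (exact * 10 > n)            /* strictly more than n/10 */
                return 1;
        }
    }
    return 0;
}
```

The key invariant: every decrement step removes K+1 = 10 element "votes" at once, so a value with more than n/10 occurrences can never be eliminated entirely and must remain a candidate for the verification pass.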
int myatoi(const char *string)
{
    int i = 0;

    /* accumulate digits; (i << 3) + (i << 1) == i * 10 */
    while (*string >= '0' && *string <= '9') {
        i = (i << 3) + (i << 1) + (*string - '0');
        string++;  /* advance the pointer, not i */
    }
    return i;
}
Does (i<<3)+(i<<1) have any advantage over multiplying i by 10, or is it just alternative logic?
Yes, shift-and-add can be faster than a general multiply on some processors. Note, though, that modern compilers perform this strength reduction automatically, so i * 10 usually compiles to the same code and is clearer to read.