Problem

This simple, very fast sieve quickly finds 31-bit primes. It uses a memory-efficient `BitArray` covering only the odd numbers.

**How a 32-bit `int` yields 31-bit primes**

Since a prime number must be > 1, the sign bit has no bearing, leaving 31 bits to store the positive numbers. This differs from `uint`, which truly would give 32-bit primes.

See this Microsoft link on Magic Numbers.

If you find the value `2147483647` on that page, you will see this text:

> The maximum signed 32 bit value. Largest 31 bit prime.

On an unrelated note, the Magic Numbers link has some pretty interesting things in it.
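That quoted fact is easy to verify directly. Here is a quick hedged sketch (the class and method names are mine) using trial division, which confirms that `int.MaxValue` itself is prime — it is the Mersenne prime 2^31 - 1:

```
using System;

public static class MaxValuePrimeCheck
{
    // Trial division is fine here: we only test a single number,
    // so checking odd divisors up to sqrt(n) costs roughly 23,000 iterations.
    public static bool IsPrime(int n)
    {
        if (n < 2) return false;
        if (n % 2 == 0) return n == 2;
        for (int d = 3; (long)d * d <= n; d += 2)
        {
            if (n % d == 0) return false;
        }
        return true;
    }

    public static void Main()
    {
        // 2147483647 = 2^31 - 1, the largest 31-bit prime.
        Console.WriteLine(IsPrime(int.MaxValue)); // True
    }
}
```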

```
public static class Sieve31
{
    public static IEnumerable<int> Primes(int upperLimit)
    {
        if (upperLimit < 2)
        {
            throw new ArgumentException("Upper limit must be greater than or equal to 2.");
        }
        yield return 2;
        if (upperLimit == 2)
        {
            yield break;
        }
        // Check odd numbers for primality
        const int offset = 3;
        Func<int, int> ToNumber = delegate(int index) { return (2 * index) + offset; };
        Func<int, int> ToIndex = delegate(int number) { return (number - offset) / 2; };
        var bits = new BitArray(ToIndex(upperLimit) + 1, defaultValue: true);
        var upperSqrtIndex = ToIndex((int)Math.Sqrt(upperLimit));
        for (var i = 0; i <= upperSqrtIndex; i++)
        {
            // If this bit has already been turned off, then its associated number is composite.
            if (!bits[i]) continue;
            var number = ToNumber(i);
            // The instant we have a known prime, immediately yield its value.
            yield return number;
            // Any multiples of number are composite and their respective bits should be turned off.
            for (var j = ToIndex(number * number); (j > i) && (j < bits.Length); j += number)
            {
                bits[j] = false;
            }
        }
        // Output remaining primes once the bit array is fully resolved:
        for (var i = upperSqrtIndex + 1; i < bits.Length; i++)
        {
            if (bits[i])
            {
                yield return ToNumber(i);
            }
        }
    }
}
```

**Example Usage**

Here’s a simple example that counts the number of primes found and finds the largest one.

```
var count = 0;
var largest = 0;
var primes = Sieve31.Primes(int.MaxValue);
foreach (var prime in primes)
{
    count++;
    largest = prime;
}
```

**Worst Case Scenario: int.MaxValue**

The `BitArray` will have a `Length` of 1,073,741,823 bits, which is 128 megabytes. This will yield 105,097,565 primes.

If you want to store the primes in a `List<int>`, the code is quite easy:

```
var primeList = Sieve31.Primes(int.MaxValue).ToList();
```

The resulting list would require 409 megabytes, in addition to the 128 for the `BitArray`. On my laptop this takes about 40 seconds. Since `BitArray` is not thread-safe, this is close to the best I can hope for on a single thread.

**An indexed list is strongly recommended**

When processing primes it is often, but not always, preferable to use an indexed list, assuming of course that you have sufficient memory to do so. Worst case you need 530 megabytes to generate the list, and once it's done, you get the 128 MB back but now have a fast `Count` and fast indexing at your fingertips.

If you don't use a list and need to loop over the primes more than once, the `BitArray` is completely regenerated on every pass, which could degrade performance.
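To see why re-enumeration is costly, consider this small sketch (the counter and the toy `Primes` iterator are mine, standing in for `Sieve31.Primes`): every `foreach` over the raw `IEnumerable` re-executes the iterator body from scratch, while a materialised list pays that cost exactly once.

```
using System;
using System.Collections.Generic;
using System.Linq;

public static class ReEnumerationDemo
{
    // Counts how many times the iterator body starts executing.
    public static int SieveRuns;

    // Toy stand-in for Sieve31.Primes, just to make the effect visible.
    public static IEnumerable<int> Primes()
    {
        SieveRuns++; // the whole body re-runs on every fresh enumeration
        yield return 2;
        yield return 3;
        yield return 5;
    }

    public static void Main()
    {
        var lazy = Primes();
        var first = lazy.Count();     // enumerates: the "sieve" runs
        var second = lazy.Count();    // enumerates again: it runs AGAIN
        Console.WriteLine(SieveRuns); // 2

        var cached = Primes().ToList(); // one final run to materialise
        var count = cached.Count;       // O(1), no re-enumeration
        var largest = cached[cached.Count - 1];
        Console.WriteLine(SieveRuns);   // 3, no matter how often we read the list
    }
}
```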

**A Challenge Problem**

Given an extremely large `int` upper limit, perhaps anything over 1 billion, build a random list of 100 primes. You cannot hardcode any known prime counts.

**Short, Easy Solution**

The easiest solution is to use the recommended indexed list, if memory allows.

```
private IList<int> GetRandom100Easy(int upperLimit)
{
    var answer = new List<int>();
    var primeList = Sieve31.Primes(upperLimit).ToList();
    var random = new Random();
    for (var i = 0; i < 100; i++)
    {
        var index = random.Next(primeList.Count);
        answer.Add(primeList[index]);
    }
    return answer;
}
```

**Longer, Low Memory Solution**

If you don't have sufficient memory to produce the largest possible list of primes, the solution is a lot longer and slower, as it requires two passes over the enumerable collection. On the second pass, you can exit early once the full answer is known.

```
private IList<int> GetRandom100LowMemory(int upperLimit)
{
    // Producing this answer without a list of primes requires two passes.
    // The 2nd pass can exit early.
    var primes = Sieve31.Primes(upperLimit);
    // Fully loop over the primes to get the count.
    var primeCount = 0;
    foreach (var prime in primes)
    {
        primeCount++;
    }
    // Initialize a dictionary of 100 random sequence numbers.
    // A while loop avoids the ArgumentException that Dictionary.Add throws
    // on a duplicate random draw, and guarantees 100 distinct entries.
    IDictionary<int, int> dict = new Dictionary<int, int>();
    var random = new Random();
    while (dict.Count < 100)
    {
        dict[random.Next(primeCount) + 1] = 0;
    }
    // For early loop termination, find the max sequence number.
    var maxSequence = dict.Keys.Max();
    // Loop again to assign primes to the dictionary.
    var sequence = 0;
    foreach (var prime in primes)
    {
        sequence++;
        if (dict.ContainsKey(sequence))
        {
            dict[sequence] = prime;
            if (sequence == maxSequence)
            {
                break;
            }
        }
    }
    return dict.Values.ToList();
}
```

As you review both solutions, I’m sure you understand why working with an indexed list is preferred.

**Questions**

This being Code Review, there is always the implied question of "Do you have any constructive comments?"

Are there better ways to address the challenge problem?

Solution

About the only huge, performance-impacting change I can see is replacing the `BitArray` with a plain `bool` array.

This actually has a **huge** speed impact.

Making that change, and inverting the related conditions, gives us the following method:

```
public static IEnumerable<int> Primes(int upperLimit)
{
    if (upperLimit < 2)
    {
        throw new ArgumentException("Upper limit must be greater than or equal to 2.");
    }
    yield return 2;
    if (upperLimit == 2)
    {
        yield break;
    }
    // Check odd numbers for primality
    const int offset = 3;
    Func<int, int> ToNumber = delegate (int index) { return (2 * index) + offset; };
    Func<int, int> ToIndex = delegate (int number) { return (number - offset) / 2; };
    var bits = new bool[ToIndex(upperLimit) + 1];
    var upperSqrtIndex = ToIndex((int)Math.Sqrt(upperLimit));
    for (var i = 0; i <= upperSqrtIndex; i++)
    {
        // If this bit has already been set, then its associated number is composite.
        if (bits[i]) continue;
        var number = ToNumber(i);
        // The instant we have a known prime, immediately yield its value.
        yield return number;
        // Any multiples of number are composite; set their respective bits.
        for (var j = ToIndex(number * number); (j > i) && (j < bits.Length); j += number)
        {
            bits[j] = true;
        }
    }
    // Output remaining primes once the bit array is fully resolved:
    for (var i = upperSqrtIndex + 1; i < bits.Length; i++)
    {
        if (!bits[i])
        {
            yield return ToNumber(i);
        }
    }
}
```

You'll notice that I inverted all the booleans on the `bits` array, so that its default of `false` (instead of `true`) would not create issues.

As far as speed goes, the modified version runs in about 63.3% of the time of your original method. (For a limit of 100,000,000 the modified version here takes about 2486 ms; yours takes about 3917 ms.)

Other than that, there is really *not much* you can do about performance overall.

As far as readability goes, I would never one-line an `if` statement without braces. I'm speaking about:

```
if (bits[i]) continue;
```

Break that down into two lines; it won't hurt anything. 😉

```
if (bits[i])
    continue;
```

You could also use a little whitespace to break out the logical portions of your algorithm.

```
public static IEnumerable<int> Primes(int upperLimit)
{
    if (upperLimit < 2)
    {
        throw new ArgumentException("Upper limit must be greater than or equal to 2.");
    }

    yield return 2;
    if (upperLimit == 2)
    {
        yield break;
    }

    // Check odd numbers for primality
    const int offset = 3;
    Func<int, int> ToNumber = delegate (int index) { return (2 * index) + offset; };
    Func<int, int> ToIndex = delegate (int number) { return (number - offset) / 2; };

    var bits = new bool[ToIndex(upperLimit) + 1];
    var upperSqrtIndex = ToIndex((int)Math.Sqrt(upperLimit));

    for (var i = 0; i <= upperSqrtIndex; i++)
    {
        // If this bit has already been set, then its associated number is composite.
        if (bits[i])
            continue;

        // The instant we have a known prime, immediately yield its value.
        var number = ToNumber(i);
        yield return number;

        // Any multiples of number are composite; set their respective bits.
        for (var j = ToIndex(number * number); (j > i) && (j < bits.Length); j += number)
        {
            bits[j] = true;
        }
    }

    // Output remaining primes once the bit array is fully resolved:
    for (var i = upperSqrtIndex + 1; i < bits.Length; i++)
    {
        if (!bits[i])
        {
            yield return ToNumber(i);
        }
    }
}
```

I know, old question, but I felt it deserved a little attention.

You can still make this function faster and gain another second if you don't use `IEnumerable` with an enumerator but instead return, for example, a `List<int>`.

Like this (based on @EBrown’s code):

```
public static List<int> Primes(int upperLimit)
{
    var result = new List<int>();
    if (upperLimit < 2)
    {
        throw new ArgumentException("Upper limit must be greater than or equal to 2.");
    }
    result.Add(2);
    if (upperLimit == 2)
    {
        return result;
    }
    // Check odd numbers for primality
    const int offset = 3;
    Func<int, int> toNumber = index => (2 * index) + offset;
    Func<int, int> toIndex = number => (int)((number - offset) * 0.5);
    var bits = new bool[toIndex(upperLimit) + 1];
    var upperSqrtIndex = toIndex((int)Math.Sqrt(upperLimit));
    for (var i = 0; i <= upperSqrtIndex; i++)
    {
        // If this bit has already been set, then its associated number is composite.
        if (bits[i]) { continue; }
        var number = toNumber(i);
        // The instant we have a known prime, immediately add it to the result.
        result.Add(number);
        // Any multiples of number are composite; set their respective bits.
        for (var j = toIndex(number * number); (j > i) && (j < bits.Length); j += number)
        {
            bits[j] = true;
        }
    }
    // Collect remaining primes once the bit array is fully resolved:
    for (var i = upperSqrtIndex + 1; i < bits.Length; i++)
    {
        if (!bits[i])
        {
            result.Add(toNumber(i));
        }
    }
    return result;
}
```

Here are two profiler screenshots:

Tested code:

```
for (int i = 0; i < 3; i++)
{
    //var primes1 = Primes(100000000);
    var primes1 = Primes1(100000000).ToList();
    //var primes1 = PrimesOriginal(100000000).ToList();
}
```

> **A Challenge Problem:** Given an extremely large `int` upper limit, perhaps anything over 1 billion, build a random list of 100 primes. You cannot hardcode any known prime counts.

As always with challenges of this type, the trick lies in doing only the work necessary instead of lots more. That’s why the solutions proposed here need a **thousand times** as much CPU and memory as necessary, **despite the brilliant micro-optimisations** – a fact that calls to mind The Sieve of Smith (I hope that little gem stays around for a while, as a warning about what happens when you start micro-optimising before you’re done with algorithmical and architectural optimisation).

The key here is this: the range from which the random primes are to be drawn is huge but the number of random primes to be drawn is minuscule.

Sieving all numbers up to 2^32 requires on the order of several CPU seconds even when done in C/C++ (2 seconds on a single core for an optimised implementation), and the particulars of CLR and C# compiler slow things down further by about one order of magnitude. However, the challenge doesn’t ask for the primality classification of four billion numbers, and it doesn’t ask for 203,280,221 primes either. It asks only for a hundred random primes.

Hence a quick, easy and brutally simple solution is to implement a segmented (or rather windowed) sieve. The primes up to 64K – the square root of the upper limit – need to be sieved out in the classic manner and kept around, preferably as an array of integers. But that range is tiny and the list needs to hold only a few thousand primes; generating it should take only a few milliseconds even with the most simple and straightforward (unoptimised) code.

To draw a random prime, draw a random number k between 0 and 2^32-1 and sieve the window `[k-335, k]`. Then scan the window backwards from k and return the first prime you encounter. The window size is based on the fact that the gaps between primes up to 2^32 are no wider than 336; the value needs to be adjusted for processing ranges beyond 4,302,407,359. See The Gaps Between Primes and First Occurrence Prime Gaps, for example.

In this way, drawing a hundred primes requires sieving only 2^16 + 100 * 336 = 99,136 numbers instead of 2^32, about **1/43,323** of the work (four orders of magnitude less). The windows are so small compared to the total range that you could even scan them using trial division and still be lots faster than with a full-range sieve.

Here’s a Delphi implementation (not my weapon of choice but I gotta learn it somehow) that draws 100 random primes up to 1,000,000,000 in less than a millisecond:

```
function HighestPrimeNotAfter (n: Cardinal): Cardinal;
const
  WINDOW_SIZE = 336;
var
  is_composite: array of Boolean;
  max_factor, window_start: Cardinal;
  i, prime, start, stride, j, rest: Cardinal;
begin
  SetLength(is_composite, WINDOW_SIZE);
  max_factor := Trunc(sqrt(n));
  if n > WINDOW_SIZE then
    window_start := n - WINDOW_SIZE + 1
  else
    window_start := 2;
  for i := Low(primes_up_to_64K) to High(primes_up_to_64K) do begin
    prime := primes_up_to_64K[i];
    if prime > max_factor then
      BREAK;
    start := prime * prime;
    stride := prime shl (prime and 1); // no double-stepping for p == 2, because of the squaring
    if start >= window_start then
      j := start - window_start
    else begin
      rest := (window_start - start) mod stride;
      if rest = 0 then j := 0 else j := stride - rest;
    end;
    while j < WINDOW_SIZE do begin
      is_composite[j] := true;
      Inc(j, stride);
    end;
  end;
  for j := High(is_composite) downto Low(is_composite) do
    if not is_composite[j] then begin
      result := window_start + j;
      EXIT;
    end;
  assert(false);
  result := 0; // suppress compiler warning
end;
```

The timing includes 0.2 ms for sieving the primes up to 64K into an `array of Cardinal`; the code for that is trivial and not shown here.
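For reference in the question's own language, here is a hedged C# sketch of the same windowed approach, including that trivial base sieve. The names (`WindowedSieve`, `BasePrimes`, `HighestPrimeNotAfter`) are mine, and the marking loop uses simple single-stride stepping rather than the Delphi version's odd-stride optimisation:

```
using System;
using System.Collections.Generic;

public static class WindowedSieve
{
    // Max prime gap below 2^32, per the window-size discussion above.
    const int WindowSize = 336;

    // The "trivial" base sieve: all primes up to 64K (square root of 2^32).
    public static int[] BasePrimes(int limit = 65536)
    {
        var composite = new bool[limit + 1];
        var primes = new List<int>();
        for (int n = 2; n <= limit; n++)
        {
            if (composite[n]) continue;
            primes.Add(n);
            for (long m = (long)n * n; m <= limit; m += n)
                composite[m] = true;
        }
        return primes.ToArray();
    }

    // Sieve the window [n - WindowSize + 1, n] with the base primes,
    // then scan backwards for the first number left unmarked.
    public static long HighestPrimeNotAfter(long n, int[] basePrimes)
    {
        long windowStart = Math.Max(2, n - WindowSize + 1);
        var isComposite = new bool[n - windowStart + 1];
        long maxFactor = (long)Math.Sqrt(n);
        foreach (var p in basePrimes)
        {
            if (p > maxFactor) break;
            // First multiple of p inside the window, starting no lower than p*p
            // so that p itself is never marked composite.
            long start = Math.Max((long)p * p, ((windowStart + p - 1) / p) * p);
            for (long m = start; m <= n; m += p)
                isComposite[m - windowStart] = true;
        }
        for (long m = n; m >= windowStart; m--)
            if (!isComposite[m - windowStart])
                return m;
        throw new InvalidOperationException("window too small for this range");
    }

    public static void Main()
    {
        var basePrimes = BasePrimes();
        // The challenge: draw 100 random primes up to 1,000,000,000.
        var rng = new Random();
        var picks = new List<long>();
        for (int i = 0; i < 100; i++)
            picks.Add(HighestPrimeNotAfter(rng.Next(2, 1_000_000_000), basePrimes));
        Console.WriteLine(picks.Count); // 100
        Console.WriteLine(HighestPrimeNotAfter(1_000_000_000, basePrimes)); // 999999937
    }
}
```

Like the Delphi original, this draws the highest prime at or below a random k, so each draw sieves only a 336-number window instead of the full range.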