# Find non-duplicated integer in a list where every integer occurs three times except for one integer

## Problem

Given an array of integers where every integer occurs three times
except for one integer, which only occurs once, find and return the
non-duplicated integer.

For example, given [6, 1, 3, 3, 3, 6, 6], return 1. Given [13, 19, 13,
13], return 19.

Do this in O(N) time and O(1) space.

## My solution

```javascript
const arr = [6, 1, 3, 3, 3, 6, 6];
const findUnique = arr => {
  const set = new Set();
  return arr.reduce((sum, n) => set.has(n) ?
    sum - n :
    (set.add(n), sum + 2 * n), 0) / 2;
};
```


Because of the `Set`, it has O(n) space, I guess. I currently don’t know how to solve it with O(1) space.
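For what it's worth, the same set-based idea can be written arithmetically: every triplicated value cancels out of "three times the sum of the distinct values, minus the sum of the array," leaving twice the unique value. This is a common variant of the trick, sketched here in Python (it still needs O(n) extra space for the set, just like the `Set` in the reduce):

```python
def find_unique_with_set(arr):
    # Each triplicated value v contributes 3*v to both terms and cancels;
    # the unique value contributes 3*v - v = 2*v, hence the division by 2.
    # Still O(n) extra space because of the set.
    return (3 * sum(set(arr)) - sum(arr)) // 2

print(find_unique_with_set([6, 1, 3, 3, 3, 6, 6]))  # → 1
print(find_unique_with_set([13, 19, 13, 13]))       # → 19
```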

## Solution

I’m going to take @greybeard’s internet points, since he apparently doesn’t want them enough to type this ;-).

You need to keep in mind that O(x) really means O(C × x + ...) for some constant C. As long as you keep C a constant, you can do whatever you want. Thus, O(n) can really mean O(2000 × n), and O(1) can really mean O(10^6), etc.

In this case, what @greybeard has suggested is that you count the individual set bits in all the numbers, and keep the counts in separate positions. Thus, for an array of 32-bit numbers, you would keep 32 separate counts.

Since you have to perform in $$O(n)O(n)O(n)$$, you could spend one loop through your input array determining how many bits you need to keep, so even very large numbers won’t break the solution.

So let’s pretend all the integers are 8-bit, just because. That doesn’t mean there aren’t a large number of integers, just that all the values in the array are in the range [0, 256). What does that give you?

(If you don’t know, & and << are “bitwise operators.”)

```
for each element in array:
    for each bit_offset in [0..8):
        if element & (1 << bit_offset):
            count[bit_offset] += 1
```


What does that get you? It gets you a set of 8 separate counts, one for each bit in the 8-bit numbers. (Feel free to replace 8 with 64 if you like…)

Each count represents the number of times that bit appeared “set” in the input array. Now, according to your premise, all the numbers but one appear 3 times. So all the set bits will appear a multiple of 3 times. (Since set bits might be duplicated among numbers, it won’t be “exactly 3 times” but instead “a multiple”).

On the other hand, the one value that appears one time will have its set bits added to the counts one time. So there are two possibilities for each of the count[i] values (where each k[i] is some integer whose exact value doesn’t matter):

• Either the unique value does not have this bit set: count[i] = 3 * k[i]

• Or the unique value does have this bit set: count[i] = 3 * k[i] + 1

So you have to evaluate each count[i] in order, and determine which form it takes:

```
result = 0
for bit_offset in [0..8):
    if count[bit_offset] % 3 == 1:
        result = set_bit(result, bit_offset)
```


Where set_bit is spelled |= 1 << bit_offset in most C-derived languages.
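Putting the two loops together, a runnable Python sketch of this bit-counting approach might look like the following (the name `find_unique` and the hard-coded 32-bit width are my choices, not from the answer):

```python
N_BITS = 32  # enough for these examples; use 64 (or measure the input) for larger values

def find_unique(arr):
    # Count, for each bit position, how often that bit is set across the array.
    count = [0] * N_BITS
    for element in arr:
        for bit_offset in range(N_BITS):
            if element & (1 << bit_offset):
                count[bit_offset] += 1
    # Triplicated values contribute multiples of 3 to every count,
    # so count[i] % 3 == 1 exactly where the unique value has a bit set.
    result = 0
    for bit_offset in range(N_BITS):
        if count[bit_offset] % 3 == 1:
            result |= 1 << bit_offset  # i.e. set_bit(result, bit_offset)
    return result

print(find_unique([6, 1, 3, 3, 3, 6, 6]))      # → 1
print(find_unique([13, 12, 12, 3, 3, 3, 12]))  # → 13
```

Note this sketch assumes non-negative integers; negative numbers would need a two's-complement treatment of the sign bit.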

In terms of efficiency, what happens?

• You might process the array once by iterating over it to discover the largest number of bits required to be counted.

• You create an array of N_BITS counts, initialized to zero. Since N_BITS is not related to the size of the input array, this is considered O(1). (Actually O(32) most likely, or maybe 64… But that’s still 1 in big-O-hio!)

• You iterate over the array one time, iterating over N_BITS bit values within each element. (So effectively O(64 × n), or less.) You compute the count values here.

• You iterate over the N_BITS counts, determining the bits that will be set in the result, and setting them.

• You return the result.

So your runtime is O(2 × n), or just n if you hard-code N_BITS. And your memory is O(64) or less, which is just 1.

Let’s look at the example you gave in the comments:

```
[13, 12, 12, 3, 3, 3, 12]
```


First, rewrite those to binary (8 + 4 + 2 + 1):

```
[ 0b1101, 0b1100, 0b1100, 0b0011, 0b0011, 0b0011, 0b1100 ]
```


Now, count the set bits, just adding all the 1’s in each column (no carry!):

```
[ 0b1101,
  0b1100,
  0b1100,
  0b0011,
  0b0011,
  0b0011,
  0b1100 ]
---------
  4434  -> counts = [ 4, 4, 3, 4 ]
```


Next, our output will be 1 if the count is 1 (mod 3) and 0 if the count is 0 (mod 3). So [4,4,3,4] mod 3 are [1,1,0,1]:

```
result = 0b1101
```


Which is 13.
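The column sums in that walkthrough can be checked mechanically. This snippet (mine, not part of the answer) reproduces the counts MSB-first to match the columns, for a 4-bit width:

```python
arr = [13, 12, 12, 3, 3, 3, 12]
# counts[i] = number of elements with the given bit set, listed MSB-first
counts = [sum((v >> bit) & 1 for v in arr) for bit in (3, 2, 1, 0)]
print(counts)  # → [4, 4, 3, 4]
mods = [c % 3 for c in counts]
print(mods)    # → [1, 1, 0, 1]
# Read the mod-3 residues as binary digits, MSB-first.
result = int(''.join(str(m) for m in mods), 2)
print(result)  # → 13
```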

## Feedback

Good job using `const` in your code for values and functions that don’t get re-assigned. I don’t often see multi-line ternary operators (especially within arrow functions), and the one in your code is a bit on the complex side.

You are correct: your code does not have O(1) space complexity.

### Alternate approach

Originally I thought about suggesting an approach that used .findIndex(), but you pointed out that would no longer be O(n) complexity.

I also thought about suggesting you sort the array, then iterate over it with Array.find(), looking for the first item that doesn’t match the next item in the list. However, I forgot that sorting the array is O(n log n), not linear time.
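For completeness, that sort-based idea looks roughly like the sketch below (my code, not from the review); it fails the stated bound because of the O(n log n) sort, but it is a useful sanity check:

```python
def find_unique_sorted(arr):
    # After sorting, the triplicates occupy consecutive runs of three,
    # so the singleton must start at an index that is a multiple of 3.
    s = sorted(arr)
    for i in range(0, len(s), 3):
        # If the "run" starting at i is shorter than three equal values
        # (or i is the last index), s[i] is the unique element.
        if i + 1 >= len(s) or s[i] != s[i + 1]:
            return s[i]

print(find_unique_sorted([6, 1, 3, 3, 3, 6, 6]))  # → 1
```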

Kudos to Austin Hastings for spelling out how counting, for each bit, the number of times it is set solves the problem.
I was thinking of bit-bashing code, in Python for lack of ECMAScript prowess:

```python
''' bit-wise modular arithmetic: mod 3 '''

def bits_mod_3(a):
    ''' Return the bits in A "mod 3". '''
    one, two = bits_0_mod_3(a)
    return one & ~two, two & ~one

def bits_0_mod_3(a):
    ''' Return the bits in A "mod 3":
        0 for no bit set, 3 for natural multiple of 3. '''
    one, two = 0, 0
    for v in a:
        one, carry = one ^ v, one & v
        two, carry = two ^ carry, two & carry
        one |= carry  # carry out of two means 4: congruent 1 mod 3
        # above is the way I think about the approach, "simplified" below
        # carry = one & v
        # one = (one ^ v) | two & carry
        # two ^= carry
        # alternatively, one could code "a mod-3 counter" directly:
        # one = one & ~v | ~(one | two) & v
        # two = two & ~v | carry
    return one, two

def once(a):
    """ Return the bits in A with a count of 1 mod 3. """
    return bits_mod_3(a)[0]

if __name__ == '__main__':
    print(once((13, 12, 12, 3, 3, 3, 12)))
```