Problem

Would this take $O(n)$ in the worst case? Can it be improved?

```
var tree = {
  val: 5,
  left: {
    val: 2, left: null, right: null
  },
  right: {
    val: 6,
    left: null,
    right: {
      val: 8,
      left: null,
      right: null
    }
  }
};

function inorderReverse(node, cb) {
  if (node.right !== null) inorderReverse(node.right, cb);
  cb(node);
  if (node.left !== null) inorderReverse(node.left, cb);
}

// Find the kth largest element
var k = 1;
var count = 1;
inorderReverse(tree, function(node) {
  if (count++ === k) console.log(node.val);
});
```

Solution

Assuming that you are forced for some reason to use that particular kind of data structure, I think you will end up with an O(k) algorithm.

To deal with the edge case, note that your code arguably works fine if k is too large; it just doesn't print anything. You could also store the size of the tree and check k against that, but then you have to be careful to keep the size updated whenever you manipulate the tree.
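To illustrate the size-tracking idea, here is a minimal sketch; `makeTree` and `insert` are hypothetical helpers of my own, not part of the code above, and the point is just that every mutation must touch the counter:

```
// Hypothetical sketch: wrap the root in an object that also carries a
// size counter, and bump the counter on every insertion so that k can
// be validated up front.
function makeTree() {
  return { root: null, size: 0 };
}

function insert(tree, val) {
  var node = { val: val, left: null, right: null };
  if (tree.root === null) {
    tree.root = node;
  } else {
    var cur = tree.root;
    while (true) {
      if (val < cur.val) {
        if (cur.left === null) { cur.left = node; break; }
        cur = cur.left;
      } else {
        if (cur.right === null) { cur.right = node; break; }
        cur = cur.right;
      }
    }
  }
  tree.size++; // the bookkeeping that must not be forgotten
}
```

With this in place, `k > tree.size` can be rejected before walking the tree at all; the cost is that deletions and any other structural changes must decrement or recompute `size` as well.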

Alternately, you could change your tree-walking code to return a value as soon as it finds one:

```
function reverseFind(node, test) {
  if (node == null) return undefined;
  // Note the parentheses around the ternary: || binds more tightly
  // than ?:, so without them a match found in the right subtree
  // would be discarded in favor of the current node.
  return reverseFind(node.right, test)
      || (test(node) ? node : undefined)
      || reverseFind(node.left, test);
}

function kthLargest(tree, k) {
  var count = 1;
  var node = reverseFind(tree, function(node) {
    if (count++ === k) return true;
  });
  if (node) {
    console.log(node.val);
  } else {
    console.log('no value');
  }
}
```
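As a quick sanity check, here is a self-contained version repeating the tree from the question; `kthLargestVal` is my own value-returning variant (easier to check than one that logs), and the ternary is parenthesized because `||` binds more tightly than `?:`:

```
var tree = {
  val: 5,
  left: { val: 2, left: null, right: null },
  right: {
    val: 6,
    left: null,
    right: { val: 8, left: null, right: null }
  }
};

function reverseFind(node, test) {
  if (node == null) return undefined;
  return reverseFind(node.right, test)
      || (test(node) ? node : undefined)
      || reverseFind(node.left, test);
}

// Hypothetical value-returning variant for easy checking.
function kthLargestVal(tree, k) {
  var count = 1;
  var node = reverseFind(tree, function(node) {
    if (count++ === k) return true;
  });
  return node ? node.val : undefined;
}

kthLargestVal(tree, 1); // 8
kthLargestVal(tree, 2); // 6
kthLargestVal(tree, 5); // undefined -- k exceeds the tree size
```

Because `reverseFind` returns as soon as `test` succeeds, `k = 2` never visits the nodes holding 5 and 2.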

This, at least, stops after visiting roughly k nodes (plus the initial descent down the right spine), so it is O(k) rather than O(n).

If your data is reasonably static, it is probably better to just store it in a sorted array. Then you can get the kth largest element in constant time with much less code:

```
var sortedArray = [2, 5, 6, 8];

function kthLargest(sortedArray, k) {
  return sortedArray[sortedArray.length - k];
}
```

Note that this code assumes that the array has no values stored at negative indices (i.e., that sortedArray[-5] === undefined), which holds for ordinary arrays. So you can deal with the edge case of invalid k by checking whether kthLargest returned a value.
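For example, repeating the tiny helper so the snippet runs on its own, an out-of-range k simply falls off the array:

```
var sortedArray = [2, 5, 6, 8];

function kthLargest(sortedArray, k) {
  return sortedArray[sortedArray.length - k];
}

kthLargest(sortedArray, 1);  // 8
kthLargest(sortedArray, 4);  // 2
kthLargest(sortedArray, 10); // undefined -- index -6 is not an element
```

A caller can therefore treat `undefined` as the "invalid k" signal without any explicit bounds check in the helper itself.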