## Problem

We are given two arrays `a` and `b`, both of length `n`. We build a third array `c` by rearranging the values in `b`. The goal is to find the optimal `c` that maximizes

```
result = (a(0) ^ c(0)) & (a(1) ^ c(1)) & ... & (a(n - 1) ^ c(n - 1))
```

where `^` is XOR and `&` is AND.
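For concreteness, here is a small sketch of the objective (the helper name `result` is my own):

```python
def result(a, c):
    """AND together the pairwise XORs of a and c."""
    acc = 0xFFFFFFFF  # all-ones identity for AND over 32-bit values
    for x, y in zip(a, c):
        acc &= x ^ y
    return acc
```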

**Is it possible to do this efficiently?** It’s straightforward to iterate through all possible permutations of `b`, but this is infeasible for large `n`.
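The brute-force approach could be sketched like this (names are my own); it tries all `n!` permutations, so it is only practical for very small `n`:

```python
from itertools import permutations

def brute_force(a, b):
    """Return an optimal rearrangement c of b and its result value."""
    best_value, best_c = -1, None
    for c in permutations(b):
        value = 0xFFFFFFFF  # all-ones identity for AND over 32-bit values
        for x, y in zip(a, c):
            value &= x ^ y
        if value > best_value:
            best_value, best_c = value, list(c)
    return best_c, best_value
```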

## More details

- The order of the values in `a` is fixed.
- The order of the values in `b` may be rearranged to form `c`. That is, starting with `b = (1, 2, 3)`, it may be that the maximum result is obtained when the values are rearranged to `c = (2, 1, 3)`. `b` may be rearranged in-place if needed.
- Since the optimal `c` is not necessarily unique, any optimal `c` may be returned.
- Assume all values are 32-bit unsigned integers and `1 <= n <= 10,000`.

## Test cases

```
Input:
a = (3, 4, 5)
b = (6, 7, 8)
Output:
c = (8, 7, 6) (result = 3)
```

```
Input:
a = (1, 11, 7, 4, 10, 11)
b = (6, 20, 8, 9, 10, 7)
Output:
c = (8, 6, 10, 9, 7, 20) (result = 9)
```

```
Input:
a = (0, 1, 2, 4, 8, 16)
b = (512, 256, 128, 64, 32, 16)
Output:
c = (16, 32, 64, 128, 256, 512) (result = 0)
```