performance tuning – Why is Mathematica so slow when tables reach a certain length?

Here is an example.
I have a table full of numbers:

foo = Table[Table[RandomComplex[], {i, 1000}], {j, 8192}];

Now I want to do things with some elements of this table:

Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {249}];

This runs fine and takes essentially no time to evaluate. But I want to do more:

Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {250}];
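(To compare the two cases reproducibly, I measured them with `AbsoluteTiming`; the exact numbers will of course vary by machine:)

    First@AbsoluteTiming[Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {249}];]
    First@AbsoluteTiming[Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {250}];]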

And suddenly it takes about 3 seconds. Why would that be, when I only increased the iteration count by one? But I want to do even more:

Table[RandomComplex[]*Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {250}], {249}];

And this still takes about 3 seconds. But now I want to take it one step further:

Table[RandomComplex[]*Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {250}], {250}];

And this takes forever, with a CPU load of 60% on my quad-core, as if something were being evaluated in parallel even though I did not ask for it. It takes 1001 seconds to evaluate. How is this possible, and why is there a magic number 250 involved? Is there a way to solve this issue without chopping my data into chunks of size < 250 before processing?
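(By "chopping into chunks" I mean a workaround along these lines, which avoids any single `Table` reaching 250 iterations; this is the kind of restructuring I would like to avoid:)

    (* hypothetical workaround: build 250 results as two sub-tables of 125 *)
    result = Join[
       Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {125}],
       Table[Abs[foo[[1, 21]] - foo[[1, -20]]], {125}]
       ];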

Does anyone have the same issue, or can anyone reproduce this problem on their system? Is there a way to solve it?