One integer typically takes 32 bits in memory, and 1 byte = 8 bits, so one integer takes 4 bytes.
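If you want to verify this on your own machine, here is a minimal C sketch (assuming a typical platform where int is 32 bits; sizeof reports whatever size your compiler actually uses):

    #include <stdio.h>

    int main(void) {
        /* On most modern platforms an int is 32 bits = 4 bytes. */
        printf("sizeof(int) = %zu bytes\n", sizeof(int));
        return 0;
    }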

Now let's assume we have an array: [1,2,3]

     4 bytes         4 bytes         4 bytes
|   |   |   |   |   |   |   |   |   |   |   |   |
        1               2               3
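To see this contiguous layout directly, here is a small C sketch that prints the address of each element; on a typical 32-bit-int platform the addresses come out exactly 4 bytes apart (the concrete addresses vary per run):

    #include <stdio.h>

    int main(void) {
        int a[3] = {1, 2, 3};

        /* The three integers sit next to each other in memory,
           so consecutive addresses differ by exactly sizeof(int). */
        for (int i = 0; i < 3; i++) {
            printf("a[%d] = %d at address %p\n", i, a[i], (void *)&a[i]);
        }
        return 0;
    }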

It takes 4 * 3 = 12 bytes to store three integers in an array. What if we want to add two more integers to the array? How is the memory allocated?

   A. It appends another 8 bytes at the end of the existing block.

   B. It creates a new array of 4 * 5 = 20 bytes.


B is the correct answer. The runtime always creates a new, larger array, copies the old elements into it, and deletes the old array. The simple reason is that we never know what sits in memory right after the array; other code may already be using it:

var a = [1,2,3]
var c = 4

     4 bytes         4 bytes         4 bytes         4 bytes
|   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
        1               2               3              c=4

Four bytes right after the array are assigned to the variable c, so we cannot simply append more memory to the end of the old array.
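As a rough illustration of option B, here is a minimal C sketch that grows a heap-allocated array by hand: allocate a fresh 5 * 4 = 20-byte block somewhere else, copy the three old values over, then free the old block (error checks omitted for brevity):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        /* Old array: 3 integers = 12 bytes. */
        int *old = malloc(3 * sizeof(int));
        old[0] = 1; old[1] = 2; old[2] = 3;

        /* Grow to 5 integers: allocate a new 20-byte block elsewhere,
           copy the existing values, then release the old block. */
        int *bigger = malloc(5 * sizeof(int));
        memcpy(bigger, old, 3 * sizeof(int));
        free(old);

        bigger[3] = 4;
        bigger[4] = 5;

        for (int i = 0; i < 5; i++) {
            printf("%d ", bigger[i]);
        }
        printf("\n");

        free(bigger);
        return 0;
    }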