Mac symbols_in_program gives unexpected values

I would expect this program to show the run-time addresses of the __Cons and __Nil descriptors (computed by descriptor below from a node's arity and descriptor pointer), followed by the same values as collected by read_symbols:

module test

import StdEnv
import symbols_in_program

Start w
# (syms,w) = accFiles (read_symbols "./a.out") w
= (descriptor [1,2,3], descriptor [], {#s \\ s <-: syms | isMember s.symbol_name ["__Cons","__Nil"]})

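// Compute the address of a node's descriptor from the descriptor pointer
// stored in the node and the node's arity.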
descriptor :: !a -> Int
descriptor _ = code {
	get_node_arity 0
	pushI 8
	mulI
	pushI 2
	addI
	pushD_a 0
	subI
	pop_a 1
}

This works on 64-bit Linux:

(6596068,6596040,{Symbol "__Cons" 6596068,Symbol "__Nil" 6596040})

But not on Mac:

(4444321560,4444321608,{Symbol "__Cons" 4295288632,Symbol "__Nil" 4295288600})

The problem can be worked around by taking the difference between the symbol value that read_symbols returns for __ARRAY__ and the actual run-time address of __ARRAY__, and adding that offset to each symbol value. But it seems that correction should be done in symbols_in_program rather than elsewhere; as it stands, the numbers returned by read_symbols on Mac are meaningless.
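For illustration, here is a minimal sketch of that offset correction. It assumes the Symbol record exposes a symbol_value :: Int field (the field name is a guess based on the output above) and that the actual address of __ARRAY__ is obtained separately, e.g. with a code trick similar to descriptor; find_symbol_value and correct_symbols are hypothetical helpers, not part of symbols_in_program:

// Hypothetical helper: the value read_symbols reported for a symbol name.
find_symbol_value :: !String !{#Symbol} -> Int
find_symbol_value name syms = hd [s.symbol_value \\ s <-: syms | s.symbol_name == name]

// Shift every reported symbol value by the difference between the actual
// address of __ARRAY__ and the value read_symbols reported for it.
correct_symbols :: !Int !{#Symbol} -> {#Symbol}
correct_symbols array_address syms
# offset = array_address - find_symbol_value "__ARRAY__" syms
= {# {s & symbol_value = s.symbol_value + offset} \\ s <-: syms}

Applying correct_symbols to syms directly after read_symbols would make the values comparable again, but every program using the library would have to repeat this correction.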
