We just finished the Korean Lunar New Year holiday called Solnal (or 설날). I was not feeling physically adventurous, so I didn't do any traveling over this time, but rather relaxed at home with some nice whisky, read "Antifragile" by Nassim Nicholas Taleb, finished watching the first series of "Westworld", and then, just for a bit of fun, did some hunting (or practice hunting) for deviations from Einstein's gravity.

Martin White recently wrote a paper (arXiv:1609.08632) on the possible advantages of using marked (weighted) correlation functions as a tool for investigating modified gravity. So I (with Yi Zheng) thought we would have a go at this, and I'll present here what we find.

To do this analysis we need:

1) Some code for calculating the correlation functions. I have already written a fast MPI+tree code for measuring these kinds of statistics (KSTAT).

2) Cosmological N-body simulations of different gravity models. I contacted Baojiu Li to ask if we can use his simulations of modified gravity. He was happy to help and collaborate on this project 🙂

## Data

Baojiu provided us with dark matter halo snapshots at z=0. The box is 1024 Mpc on a side and the sample is split into a high-mass (M > 10^13 Msun) and a low-mass (M < 10^13 Msun) sample. There is one GR (general relativity), or rather Newtonian, simulation and two simulations of f(R) gravity with different values of the |fR0| parameter: |fR0| = 10^-5 and 10^-6.

## Density Estimation

To estimate the density at every galaxy (or DM halo) I opted to use the Voronoi tessellation scheme. Two reasons for this: 1) I had never tried it before, so I thought it would be good to learn; 2) I know that the Voronoi approach should have good resolution on small scales owing to its intrinsic lack of a smoothing scale (although this is not exactly true, as we shall see later).

In mathematics a Voronoi diagram is a way of partitioning N-dimensional data into individual cells, one per data point. The nice property is that the inverse of the volume of each cell gives an estimate of the density at that point, with no need to work on fixed grid cells when computing the density.

```python
import matplotlib.pyplot as plt
import numpy as np
from scipy.spatial import Voronoi, voronoi_plot_2d

infile = 'galaxy_file.ascii'  # filename with 3 columns (x, y, z) coords
points = np.genfromtxt(infile)
galvor = Voronoi(points)
voronoi_plot_2d(galvor, show_points=True, show_vertices=False, line_alpha=0.5)
```

##### This is a Voronoi diagram of a subsample of the galaxy data (blue points)

To construct the Voronoi diagram from the data I used the publicly available code in the `scipy.spatial` package. The density is then computed from the inverse of the volume of each cell. However, I could not find an easy way to access the Voronoi volume elements; thankfully someone had already thought of this -> here.

```python
import numpy as np
from scipy.spatial import Voronoi, Delaunay

def tetravol(a, b, c, d):
    '''Calculate the volume of a tetrahedron, given vertices a, b, c, d.'''
    return abs(np.dot((a - d), np.cross((b - d), (c - d)))) / 6

def vol(vor, p):
    '''Calculate the volume of the 3d Voronoi cell around point p.'''
    dpoints = []
    vol = 0
    for v in vor.regions[vor.point_region[p]]:
        dpoints.append(list(vor.vertices[v]))
    # triangulate the cell into tetrahedra and sum their volumes
    tri = Delaunay(np.array(dpoints))
    for simplex in tri.simplices:
        vol += tetravol(np.array(dpoints[simplex[0]]), np.array(dpoints[simplex[1]]),
                        np.array(dpoints[simplex[2]]), np.array(dpoints[simplex[3]]))
    return vol

mass = ("lt13", "gt13")
sim = ("gr", "f5", "f6")
for simi in sim:
    for massi in mass:
        infile = "zpos_rstar_b1024_" + simi + "_z0.0_m" + massi + ".txt_wgt"
        xx = np.genfromtxt(infile)
        print(infile)
        points = xx[:, 0:3]
        galvor = Voronoi(points)
        ngal = np.shape(xx)[0]
        galden = np.zeros((ngal,))
        for i in range(ngal):
            galden[i] = 1. / vol(galvor, i)
        rhobar = float(ngal) / 1024.**3   # mean number density of the box
        rhonorm = galden / rhobar         # density in units of the mean
        out = np.concatenate((points, rhonorm[:, np.newaxis]), axis=1)
        outfile = "zpos_rstar_b1024_" + simi + "_z0.0_m" + massi + ".txt_rho"
        np.savetxt(outfile, out)
```

Now the function `vol` will compute the volume of the cell associated with each galaxy, and the density is obtained by simply inverting this. In this python script we loop over the 2 mass bins (mlt13, mgt13) and the 3 simulations (gr, f5, f6).
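As an aside, the explicit Delaunay step can be skipped: a bounded Voronoi cell is convex, so `scipy.spatial.ConvexHull` of its vertices returns the same volume directly via its `volume` attribute. A minimal sketch, assuming 3D points (the `voronoi_volumes` helper and the random test points are mine, not from the original script):

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_volumes(points):
    """Volume of each bounded Voronoi cell; np.inf for cells that extend to infinity."""
    vor = Voronoi(points)
    volumes = np.empty(len(points))
    for i, reg_idx in enumerate(vor.point_region):
        region = vor.regions[reg_idx]
        if -1 in region or len(region) == 0:
            volumes[i] = np.inf  # unbounded cell (point near the edge of the sample)
        else:
            volumes[i] = ConvexHull(vor.vertices[region]).volume
    return volumes

# toy example: 200 random points in a unit box
rng = np.random.default_rng(0)
pts = rng.random((200, 3))
vols = voronoi_volumes(pts)
dens = 1.0 / vols[np.isfinite(vols)]  # density estimates for the bounded cells
```

The unbounded cells (infinite volume) are exactly the boundary cells discussed below, which is another way to see why the edges of the box need special treatment.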

The one problem with using the Voronoi tessellation for density estimation is that it performs badly near the boundary. To avoid this I trimmed 20 Mpc/h (a little larger than the mean inter-particle spacing) from all edges – throwing away data but preserving a clean sample.
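The trimming itself is a simple mask on the positions. A minimal sketch, assuming positions in the first three columns and the 1024 Mpc/h box from above (the `trim_box` helper and the random test data are illustrative, not the actual pipeline):

```python
import numpy as np

def trim_box(data, box=1024.0, cut=20.0):
    """Keep rows whose (x, y, z) lie at least `cut` from every face of the box."""
    pos = data[:, 0:3]
    inner = np.all((pos > cut) & (pos < box - cut), axis=1)
    return data[inner]

# toy example: random (x, y, z, rho) rows in a 1024^3 box
rng = np.random.default_rng(1)
data = rng.random((1000, 4)) * np.array([1024.0, 1024.0, 1024.0, 1.0])
trimmed = trim_box(data)  # keeps roughly (984/1024)^3 ~ 89% of the rows
```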

## Computing the weights

Following White (2017), we compute a weight (mark) for each halo as a function of its density \rho (in units of the mean density), with two free parameters \rho_star and p:

m = ( (\rho_star + 1) / (\rho_star + \rho) )^p

```python
import numpy as np

mass = ("lt13", "gt13")
sim = ("gr", "f5", "f6")
for simi in sim:
    for massi in mass:
        infile = "zpos_rstar_b1024_" + simi + "_z0.0_m" + massi + ".txt_rho_trim"
        xx = np.genfromtxt(infile)
        ngal = np.shape(xx)[0]
        points = xx[:, 0:3]   # x, y, z positions
        rhonorm = xx[:, 3]    # normalised density
        for pstari in range(1, 4):
            for rhostari in range(1, 4):
                # White (2017) mark with rho_star = rhostari and p = pstari
                m = ((rhostari + 1.0) / (rhostari + rhonorm)) ** pstari
                out = np.concatenate((points, m[:, np.newaxis]), axis=1)
                outfile = ("zpos_rstar_b1024_" + simi + "_z0.0_m" + massi
                           + ".txt_trim_p" + str(pstari) + "r" + str(rhostari))
                print(outfile)
                np.savetxt(outfile, out)
```

Above we can see how the distributions of the weights compare to each other. On the left we show just the distribution of density, with its characteristic log-normal shape. The weighted functions can be seen to give more importance to under-dense regions, as is clear from the above equation when p > 0.
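That behaviour – the mark shrinking as the density grows – is easy to check numerically. A small sketch of the White (2017) mark function (the `mark` helper and the toy densities are mine, not from the original post):

```python
import numpy as np

def mark(rho, rho_star=1.0, p=1.0):
    """White (2017) mark: m = ((rho_star + 1) / (rho_star + rho))**p, rho in units of the mean."""
    return ((rho_star + 1.0) / (rho_star + rho)) ** p

rho = np.array([0.1, 1.0, 10.0])  # under-dense, mean density, over-dense
m = mark(rho, rho_star=1.0, p=2.0)
# m is monotonically decreasing in rho: under-dense regions receive the largest marks,
# and a halo sitting exactly at the mean density (rho = 1) always gets m = 1.
```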

## The two-point correlation function

First, let's look at just the usual correlation function. We plot the modified gravity simulation results normalised to the GR simulation. We see that there is a relative bias between the samples. The weaker F6 model shows a roughly constant 5% bias difference with GR in both mass bins. For the stronger F5 model, the higher-mass halos again have a relatively constant bias difference with GR; however, the lower-mass halos exhibit a much larger deviation from GR, which seems to increase on larger scales. The error bars are from 125 jackknife subsamples.
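For completeness, the error bars quoted above come from a delete-one jackknife over the 125 (= 5 x 5 x 5) spatial subsamples. A minimal sketch of how such an error bar is assembled from per-subsample estimates (the `jackknife_error` helper and the toy numbers are illustrative, not the actual KSTAT pipeline):

```python
import numpy as np

def jackknife_error(values):
    """Delete-one jackknife error on the mean of per-subsample statistics."""
    values = np.asarray(values)
    n = len(values)
    loo = (values.sum() - values) / (n - 1)           # leave-one-out means
    return np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))

# e.g. 125 sub-volume estimates of xi(r) at a single separation (toy numbers)
xi_sub = np.random.default_rng(3).normal(loc=1.0, scale=0.1, size=125)
err = jackknife_error(xi_sub)
```

For the mean of independent subsamples this reduces to the usual standard error, but the same machinery carries over to ratios like xi_f(R)/xi_GR, where no simple closed form exists.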

## The weighted correlation functions…

I continue this work over at overleaf…