Neutron Star Equation of State

The nature and interactions of matter at high densities and low temperatures are among the great unsolved problems in modern science, with profound implications for particle physics and astrophysics. Neutron star cores contain the densest matter known in the Universe, so determining their properties provides a direct way of measuring the equation of state of ultradense matter. I work on the surfaces of neutron stars and on the development of spectroscopic methods to measure neutron star radii with increasing precision. I also focus on optimal ways to infer the neutron star equation of state from these measurements, and I develop new statistical tools to extract the measurements and their uncertainties from large data sets.
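As a minimal sketch of the kind of statistical inference involved, the toy example below recovers a radius posterior from a set of noisy simulated measurements with a flat prior and a Gaussian likelihood. The "true" radius, noise level, and prior range are illustrative assumptions, not values from any real data set or analysis.

```python
import numpy as np

# Toy Bayesian inference of a neutron star radius from noisy measurements.
# All numbers here (true radius, noise, prior range) are assumed for
# illustration only.
rng = np.random.default_rng(0)
true_radius_km = 11.5
noise_km = 0.8
observations = true_radius_km + noise_km * rng.standard_normal(20)

# Flat prior over a plausible radius range, Gaussian likelihood per point.
radius_grid = np.linspace(8.0, 16.0, 801)
log_like = -0.5 * np.sum(
    ((observations[:, None] - radius_grid[None, :]) / noise_km) ** 2, axis=0
)
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()               # normalize on the grid

mean = np.sum(radius_grid * posterior)
std = np.sqrt(np.sum((radius_grid - mean) ** 2 * posterior))
print(f"posterior radius = {mean:.2f} +/- {std:.2f} km")
```

With 20 measurements, the posterior width shrinks roughly as the single-measurement noise divided by the square root of the sample size, which is the basic reason larger data sets tighten equation-of-state constraints.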
NICER

One of the most significant advances in the measurement of the dense matter equation of state will come from the NICER detector, built as an astrophysics payload that will go on the International Space Station in 2016. Instead of relying on spectroscopy, NICER will take a very different approach to measuring neutron star radii, based on the shapes and amplitudes of the pulsed emission observed from neutron star surfaces in multiple wavebands. Because of light-bending effects in general relativity, these waveforms encode information about the neutron star spacetime and, therefore, its radius and mass. I work on the most sophisticated calculations of these waveforms and on methods, similar to image reconstruction and Doppler tomography, for extracting radius information from the upcoming data. I will be applying these techniques to the pulsars that NICER will observe over the next two years.
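To illustrate how light bending shapes these waveforms, the sketch below computes a bolometric pulse profile from a single hotspot using the approximate Schwarzschild bending relation of Beloborodov (2002), 1 - cos(alpha) = (1 - cos(psi))(1 - r_s/R). The stellar mass and radius, the observer inclination, and the spot colatitude are all assumed values for illustration, not parameters of any particular pulsar.

```python
import numpy as np

# Pulse profile from a point hotspot on a rotating neutron star, using the
# Beloborodov (2002) light-bending approximation:
#   1 - cos(alpha) = (1 - cos(psi)) * (1 - r_s / R),
# where psi is the angle between the spot normal and the line of sight, and
# alpha is the photon emission angle at the surface.
G = 6.674e-11
C = 2.998e8
M_SUN = 1.989e30

mass, radius = 1.4 * M_SUN, 12e3           # 1.4 M_sun, 12 km (assumed)
u = 2 * G * mass / (radius * C**2)         # compactness r_s / R

incl = np.radians(60.0)                    # observer inclination (assumed)
colat = np.radians(45.0)                   # spot colatitude (assumed)
phase = np.linspace(0.0, 1.0, 256)         # one rotation period

cos_psi = (np.cos(incl) * np.cos(colat)
           + np.sin(incl) * np.sin(colat) * np.cos(2 * np.pi * phase))
cos_alpha = 1.0 - (1.0 - cos_psi) * (1.0 - u)
flux = np.clip(cos_alpha, 0.0, None)       # spot hidden when cos_alpha < 0

# Light bending raises the pulse minimum, so the pulsed fraction shrinks
# as the star becomes more compact (larger u) -- this is what couples the
# waveform amplitude to the mass and radius.
amplitude = (flux.max() - flux.min()) / (flux.max() + flux.min())
print(f"compactness u = {u:.3f}, pulsed fraction = {amplitude:.3f}")
```

For this geometry the spot never fully disappears behind the star: bending lets the observer see emission from slightly beyond the geometric limb, which is the basic effect that makes the pulsed fraction a compactness diagnostic.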
High Performance Computing

Rendering time-dependent images of black holes in order to study their horizons and accretion flows, and computing radiation transport in spinning neutron star spacetimes, are highly complex problems that require fast, efficient, large-scale algorithms for their solutions. My research group specializes in the development and application of such algorithms on the GPU platform. GPUs were first developed for highly parallelizable applications, such as computer games, that require fast, high-resolution image rendering. We are one of the very few groups that pioneered their use for problems in astrophysics, and the efficiency they enable is bringing about a revolution. For example, we can render the full image of a spinning black hole and its accretion flow, from its horizon out to several thousand Schwarzschild radii, in a matter of seconds on a single GPU. With support from the National Science Foundation, we built a supercluster of GPUs at the University of Arizona, called El Gato, which we use for our simulations.
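The reason this problem maps so well onto GPUs is that every pixel corresponds to an independent light ray, so the whole image can be computed in one data-parallel pass. The sketch below uses vectorized NumPy as a CPU stand-in for per-thread GPU work; rather than integrating full geodesics, it applies the exact Schwarzschild capture condition, under which a photon arriving from infinity is captured when its impact parameter b is below 3*sqrt(3)*GM/c^2. The image size and field of view are arbitrary choices.

```python
import numpy as np

# Data-parallel silhouette of a (non-spinning) black hole: one independent
# ray per pixel, classified by the Schwarzschild photon capture condition
#   b < b_crit = 3 * sqrt(3) * GM / c^2.
npix = 512
half_width = 10.0                          # image half-width in units of GM/c^2
x = np.linspace(-half_width, half_width, npix)
xx, yy = np.meshgrid(x, x)                 # image-plane coordinates
b = np.hypot(xx, yy)                       # impact parameter of each ray

b_crit = 3.0 * np.sqrt(3.0)                # capture radius in GM/c^2 units
shadow = b < b_crit                        # True where the ray is captured

shadow_fraction = shadow.mean()
print(f"shadow covers {shadow_fraction:.1%} of the image")
```

In a real calculation each pixel instead integrates a null geodesic through a spinning (Kerr) spacetime and a radiative-transfer equation along it, but the pixels remain independent, which is exactly the structure GPUs exploit.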
The Event Horizon Telescope

The Event Horizon Telescope (EHT) is an experiment being performed on a large and ever-growing array of radio telescopes that span the Earth, from Hawaii to Chile and from the South Pole to Arizona. When data are taken with the full array, it will image the event horizons of the supermassive black hole at the center of our Galaxy, Sagittarius A*, and of the black hole at the center of M87, with an unprecedented 10-microarcsecond resolution! This will allow us to take the first-ever picture of a black hole at 1.3 and 0.85 mm wavelengths and to look for the shadow that is direct evidence for a black hole, as predicted by the theory of general relativity. In addition to allowing us to test this theory of gravity in regimes where it has never been tested before, the EHT will enable us to study the process by which black holes accrete matter and grow in mass. Arizona is a member of the EHT collaboration, with several members who work on large-scale black hole simulations in general relativity, develop detectors for the new telescopes in the array, and run the Arizona Radio Observatory.
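The back-of-the-envelope numbers behind this claim can be checked directly: the angular diameter of the Sgr A* shadow compared with the diffraction-limited resolution of an Earth-sized array at 1.3 mm. The mass and distance of Sgr A* below are rounded literature values used here as assumptions.

```python
import numpy as np

# Shadow size of Sgr A* versus the diffraction limit of an Earth-sized
# array at 1.3 mm wavelength. Mass and distance are assumed round numbers.
G = 6.674e-11
C = 2.998e8
M_SUN = 1.989e30
PC = 3.086e16

mass = 4.1e6 * M_SUN                       # Sgr A* mass (assumed)
dist = 8.1e3 * PC                          # Sgr A* distance (assumed)

# Shadow angular diameter for a Schwarzschild hole: 2 * 3*sqrt(3) * GM/c^2
# divided by the distance.
shadow_rad = 2.0 * 3.0 * np.sqrt(3.0) * G * mass / C**2 / dist
shadow_uas = np.degrees(shadow_rad) * 3600e6   # microarcseconds

# Diffraction limit lambda / D for baselines up to Earth's diameter.
res_rad = 1.3e-3 / 12.742e6
res_uas = np.degrees(res_rad) * 3600e6

print(f"shadow ~ {shadow_uas:.0f} uas, array resolution ~ {res_uas:.0f} uas")
```

The shadow comes out at roughly 50 microarcseconds, a few resolution elements across for an Earth-sized baseline, which is why Sgr A* and M87 are the only two black holes whose shadows the EHT can resolve.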