Book

New worlds, new horizons in astronomy and astrophysics


Abstract

Driven by discoveries, and enabled by leaps in technology and imagination, our understanding of the universe has changed dramatically during the course of the last few decades. The fields of astronomy and astrophysics are making new connections to physics, chemistry, biology, and computer science. Based on a broad and comprehensive survey of scientific opportunities, infrastructure, and organization in a national and international context, New Worlds, New Horizons in Astronomy and Astrophysics outlines a plan for ground- and space-based astronomy and astrophysics for the decade of the 2010s. Realizing these scientific opportunities is contingent upon maintaining and strengthening the foundations of the research enterprise including technological development, theory, computation and data handling, laboratory experiments, and human resources. New Worlds, New Horizons in Astronomy and Astrophysics proposes enhancing innovative but moderate-cost programs in space and on the ground that will enable the community to respond rapidly and flexibly to new scientific discoveries. The book recommends beginning construction on survey telescopes in space and on the ground to investigate the nature of dark energy, as well as the next generation of large ground-based giant optical telescopes and a new class of space-based gravitational observatory to observe the merging of distant black holes and precisely test theories of gravity. New Worlds, New Horizons in Astronomy and Astrophysics recommends a balanced and executable program that will support research surrounding the most profound questions about the cosmos. The discoveries ahead will facilitate the search for habitable planets, shed light on dark energy and dark matter, and aid our understanding of the history of the universe and how the earliest stars and galaxies formed. The book is a useful resource for agencies supporting the field of astronomy and astrophysics, the Congressional committees with jurisdiction over those agencies, the scientific community, and the public. © 2010 by the National Academy of Sciences. All rights reserved.
... Weak gravitational lensing is a powerful statistical tool for probing the growth of cosmic structure and measuring cosmological parameters. However, as shown by studies such as Ménard et al. (2010), dust in the circumgalactic region of haloes dims and reddens background sources. In a weak lensing analysis, this selects against sources behind overdense regions; since there is more structure in overdense regions, we will underestimate the amplitude of density perturbations σ8 if we do not correct for the effects of circumgalactic dust. ...
... Ferreras et al. 1997; Scranton et al. 2005). Ménard et al. (2010) used the well-calibrated photometry from the Sloan Digital Sky Survey, covering u through z bands, to fit both the magnification and dust reddening contributions to the correlation function of z ∼ 0.3 galaxies and z ≳ 1 quasars. They find a power law-like dust signal extending from the inner halo (r_p ∼ 20 kpc) out to the large-scale clustering regime (several Mpc). ...
... This means we will need to define a dust density profile and the total amount of dust within each halo. We calculate the dust density profile using the extinction model measured by Ménard et al. (2010), given in Eq. (30). First, we need to understand the geometry of the measurement. ...
Preprint
Weak gravitational lensing is a powerful statistical tool for probing the growth of cosmic structure and measuring cosmological parameters. However, as shown by studies such as Ménard et al. (2010), dust in the circumgalactic region of haloes dims and reddens background sources. In a weak lensing analysis, this selects against sources behind overdense regions; since there is more structure in overdense regions, we will underestimate the amplitude of density perturbations $\sigma_8$ if we do not correct for the effects of circumgalactic dust. To model the dust distribution we employ the halo model. Assuming a fiducial dust mass profile based on measurements from Ménard et al. (2010), we compute the ratio $Z$ of the systematic error to the statistical error for a survey similar to the Nancy Grace Roman Space Telescope reference survey (2000 deg$^2$ area, single-filter effective source density 30 galaxies arcmin$^{-2}$). For a waveband centered at $1580$ nm ($H$-band), we find that $Z_{H} = 0.47$. For a similar survey with waveband centered at $620$ nm ($r$-band), we also computed $Z_{r} = 3.6$. Within our fiducial dust model, since $Z_{r} > 1$, the systematic effect of dust will be significant on weak lensing image surveys. We also computed the dust bias on the amplitude of the power spectrum, $\sigma_{8}$, and found it to be for each waveband $\Delta \sigma_8/\sigma_8 = -3.9\times 10^{-4}$ ($H$ band) or $-2.9\times 10^{-3}$ ($r$ band) if all other parameters are held fixed (the forecast Roman statistical-only error $\sigma(\sigma_8)/\sigma_8$ is $9\times 10^{-4}$).
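A minimal toy sketch of the selection effect described above, assuming a placeholder power-law mean extinction profile and a placeholder faint-end count slope (illustrative values, not the paper's fitted dust model): dimming by A magnitudes removes a fraction 1 − 10^(−sA) of sources from a magnitude-limited sample behind the halo.

```python
import numpy as np

# Toy illustration (not the paper's pipeline): extinction by circumgalactic
# dust suppresses detectable source counts behind halos. All numbers below
# are placeholders chosen only to show the scaling.

# Assumed power-law mean extinction profile around a halo,
# A(r_p) = A0 * (r_p / r0)**(-alpha)   [magnitudes]
A0, r0, alpha = 0.01, 0.1, 0.8        # assumed: 0.01 mag at 100 kpc, slope 0.8
r_p = np.logspace(-2, 0.5, 6)         # projected radius in Mpc
A = A0 * (r_p / r0) ** (-alpha)

# For cumulative counts N(<m) ~ 10**(s*m), dimming by A mag removes a
# fraction 1 - 10**(-s*A) of sources behind the halo.
s = 0.4                               # assumed faint-end slope d(log10 N)/dm
lost_fraction = 1.0 - 10.0 ** (-s * A)

for r, f in zip(r_p, lost_fraction):
    print(f"r_p = {r:6.3f} Mpc : fraction of sources lost ~ {f:.4f}")
```

Because overdense regions host more halos (and more structure behind them), this deficit of sources preferentially removes the most strongly lensed lines of sight, which is the origin of the downward bias on σ8 quoted in the abstract.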
... In addition, the sum of the neutrino masses [1] remains unknown. It is expected that the operational weak lensing surveys, including the Subaru Hyper Suprime-Cam Survey (HSC) [2], the Dark Energy Survey (DES) [3], the Dark Energy Spectroscopic Instrument (DESI) [4], the Prime Focus Spectrograph [5], and the Kilo-Degree Survey (KiDS) [6], as well as near-future Stage-IV large-scale structure (LSS) surveys such as Euclid [7], the Vera C. Rubin Observatory [8], and the Roman Space Telescope [9,10], will improve our understanding of many of the questions that cosmology is facing through high-precision measurements of the intervening mass distribution of the universe. ...
Article
Full-text available
We introduce two kurt-spectra to probe fourth-order statistics of weak lensing convergence maps. Using state-of-the-art numerical simulations, we study the shapes of these kurt-spectra as a function of source redshifts and smoothing angular scales. We employ a pseudo-Cℓ approach to estimate the spectra from realistic convergence maps in the presence of an observational mask and noise for stage-IV large-scale structure surveys. We compare these results against theoretical predictions calculated using the FFTLog formalism, and find that a simple nonlinear clustering model, the hierarchical ansatz, can reproduce the numerical trends for the kurt-spectra in the nonlinear regime. In addition, we provide estimators for beyond fourth-order spectra where no definitive analytical results are available, and present corresponding results from numerical simulations.
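One common construction of such fourth-order spectra is sketched below under the assumption that the two kurt-spectra can be taken as the cross-spectra of κ³ with κ and of κ² with κ² of the mean-subtracted field; the paper's pseudo-Cℓ estimator additionally handles mask deconvolution and noise debiasing. The sketch uses healpy on a toy Gaussian map with an assumed input spectrum.

```python
import numpy as np
import healpy as hp

# Illustrative fourth-order "kurt-spectra" from a toy Gaussian map.
# Assumed construction: cross-spectra of kappa^3 with kappa and of
# kappa^2 with kappa^2 (mean-subtracted fields). Not the paper's full
# masked, noise-debiased pseudo-C_ell estimator.

nside, lmax = 256, 512
ells = np.arange(lmax + 1)
cl_in = 1e-9 * (ells + 1.0) ** -2.0            # assumed toy input spectrum
kappa = hp.synfast(cl_in, nside, lmax=lmax, new=True)

k = kappa - kappa.mean()
k2 = k**2 - np.mean(k**2)                      # mean-subtracted squared field
k3 = k**3 - np.mean(k**3)

kurt_31 = hp.anafast(k3, map2=k, lmax=lmax)    # <kappa^3 kappa> spectrum
kurt_22 = hp.anafast(k2, map2=k2, lmax=lmax)   # <kappa^2 kappa^2> spectrum

# For a Gaussian map the connected (trispectrum) contribution vanishes,
# so these estimates are dominated by disconnected Gaussian terms.
print(kurt_31[2:10])
print(kurt_22[2:10])
```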
... The ten-year LSST survey will begin once Rubin Observatory commissioning is completed, expected to be in 2024. The facility was originally proposed as the Dark Matter Telescope in 1996; it was identified as a priority for funding in both the 2008 P5 report [29] and the 2010 Astronomy and Astrophysics Decadal Survey [30]. The community has repeatedly recognized the opportunities presented by a large collecting-area telescope with a wide field of view and large focal plane, which enables rapid surveys of the sky that are at the same time deep (due to the large mirror area and total survey time over ten years), wide (covering large fractions of the total available sky due to the high field of view), and fast (relying on short single-visit exposures combined with repeatedly returning to the same parts of the sky to enable time-domain science, including studies of supernovae and strong lens systems of import for cosmology). ...
Preprint
Cosmological observations in the new millennium have dramatically increased our understanding of the Universe, but several fundamental questions remain unanswered. This topical group report describes the best opportunities to address these questions over the coming decades by extending observations to the $z<6$ universe. The greatest opportunity to revolutionize our understanding of cosmic acceleration both in the modern universe and the inflationary epoch would be provided by a new Stage V Spectroscopic Facility (Spec-S5) which would combine a large telescope aperture, wide field of view, and high multiplexing. Such a facility could simultaneously provide a dense sample of galaxies at lower redshifts to provide robust measurements of the growth of structure at small scales, as well as a sample at redshifts $2<z<5$ to measure cosmic structure at the largest scales, spanning a sufficient volume to probe primordial non-Gaussianity from inflation, to search for features in the inflationary power spectrum on a broad range of scales, to test dark energy models in poorly-explored regimes, and to determine the total neutrino mass and effective number of light relics. A number of compelling opportunities at smaller scales should also be pursued alongside Spec-S5. The science collaborations analyzing DESI and LSST data will need funding for a variety of activities, including cross-survey simulations and combined analyses. The results from these experiments can be greatly improved by smaller programs to obtain complementary data, including follow-up studies of supernovae and spectroscopy to improve photometric redshift measurements. The best future use of the Vera C. Rubin Observatory should be evaluated later this decade after the first LSST analyses have been done. Finally, investments in pathfinder projects could enable powerful new probes of cosmology to come online in future decades.
... These include the well-known real-space one-point statistics such as the cumulants [50] or two-point cumulant correlators, as well as the associated probability distribution function [51], the peak-count statistics [52], and morphological estimators [14]. We begin with a short review of n-point correlation functions in harmonic space in §2.1, focusing on the case n = 4. ...
Preprint
We introduce two kurt-spectra to probe fourth-order statistics of weak lensing convergence maps. Using state-of-the-art numerical simulations, we study the shapes of these kurt-spectra as a function of source redshifts and smoothing angular scales. We employ a pseudo-$C_{\ell}$ approach to estimate the spectra from realistic convergence maps in the presence of an observational mask and noise for stage-IV large-scale structure surveys. We compare these results against theoretical predictions calculated using the FFTLog formalism, and find that a simple nonlinear clustering model-the hierarchical ansatz-can reproduce the numerical trends for the kurt-spectra in the nonlinear regime. In addition, we provide estimators for beyond fourth-order spectra where no definitive analytical results are available, and present corresponding results from numerical simulations.
... A primary goal of the next generation of telescopes is to find the very first galaxies to form in our observable Universe (Rieke et al. 2019; Council 2010). In particular, the James Webb Space Telescope (JWST) has infrared cameras designed to allow the detection and spectroscopic follow-up of galaxies at z > 11 (Gardner et al. 2006). ...
Preprint
One of the primary goals for the upcoming James Webb Space Telescope (JWST) is to observe the first galaxies. Predictions for planned and proposed surveys have typically focused on average galaxy counts, assuming a random distribution of galaxies across the observed field. The first and most massive galaxies, however, are expected to be tightly clustered, an effect known as cosmic variance. We show that cosmic variance is likely to be the dominant contribution to uncertainty for high-redshift mass and luminosity functions, and that median high-redshift and high-mass galaxy counts for planned observations lie significantly below average counts. Several different strategies are considered for improving our understanding of the first galaxies, including adding depth, area, and independent pointings. Adding independent pointings is shown to be the most efficient both for discovering the single highest-redshift galaxy and also for constraining mass and luminosity functions.
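A back-of-the-envelope sketch of the trade-off described above, under a simple model in which the relative variance of the count in one field is a Poisson term plus a cosmic-variance term; the per-field mean count and the cosmic-variance amplitude below are assumed placeholder values, not the paper's forecasts.

```python
import numpy as np

# Toy model: relative variance of the galaxy count in one field is
#   Var/N^2 = 1/N + sigma_cv^2,
# so independent pointings average down both terms, while extra depth in a
# single field mainly reduces the Poisson term (for a fixed field geometry).

def frac_err(n_per_field, sigma_cv, n_pointings=1):
    """Fractional error on the combined count from n_pointings identical fields."""
    rel_var_one = 1.0 / n_per_field + sigma_cv**2
    return np.sqrt(rel_var_one / n_pointings)

n_exp = 5.0       # assumed mean number of detectable very-high-z galaxies per field
sigma_cv = 0.6    # assumed relative cosmic variance for that field volume

print("1 pointing:             ", frac_err(n_exp, sigma_cv, 1))
print("4 independent pointings:", frac_err(n_exp, sigma_cv, 4))
print("1 pointing, 4x deeper:  ", frac_err(4 * n_exp, sigma_cv, 1))
```

When the cosmic-variance term dominates, the independent pointings win, which is the qualitative conclusion the abstract reaches.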
... Large surveys, particularly ground-based efforts like LSST, are a highly prioritized component of the current and future development landscape in astronomy (Council 2010). As such, great efforts are being made to develop survey tools and technologies to enable science from these missions, such as new database technologies (Juric 2012; Zečević et al. 2019), real-time event broadcasting (Patterson et al. 2019), and real-time data analysis frameworks (e.g. ...
Preprint
Traditional searches for extraterrestrial intelligence (SETI) or "technosignatures" focus on dedicated observations of single stars or regions in the sky to detect excess or transient emission from intelligent sources. The newest generation of synoptic time domain surveys enable an entirely new approach: spatio-temporal SETI, where technosignatures may be discovered from spatially resolved sources or multiple stars over time. Current optical time domain surveys such as ZTF and the Evryscope can probe 10-100 times more of the "Cosmic Haystack" parameter space volume than many radio SETI investigations. Small-aperture, high cadence surveys like Evryscope can be comparable in their Haystack volume completeness to deeper surveys including LSST. Investigations with these surveys can also be conducted at a fraction of the cost of dedicated SETI surveys, since they make use of data already being gathered. However, SETI methodology has not widely utilized such surveys, and the field is in need of new search algorithms that can account for signals in both the spatial and temporal domains. Here I describe the broad potential for modern wide-field time domain optical surveys to revolutionize our search for technosignatures, and illustrate some example SETI approaches using transiting exoplanets to form a distributed beacon.
... In addition, the sum of the neutrino masses [4] remains unknown. Fortunately, it is expected that the operational weak lensing surveys, including CFHTLS, the Dark Energy Survey (DES) [5], the Dark Energy Spectroscopic Instrument (DESI), the Prime Focus Spectrograph, and KiDS [6], and near-future Stage-IV large-scale structure (LSS) surveys such as Euclid [7], LSST [8], and WFIRST [9], will provide answers to many of the questions that cosmology is facing. ...
Preprint
We use a recently introduced statistic called the Integrated Bispectrum (IB) to probe the gravity-induced non-Gaussianity at the level of the bispectrum from weak lensing convergence or $\kappa$ maps. We generalize the concept of the IB to spherical coordinates. This result is next connected to the response function approach. We introduce the concept of squeezed three-point correlation functions (3PCF) for $\kappa$ maps and relate them to the IB defined in the Fourier domain. Finally, we use the Euclid Flagship simulations to compute the IB as a function of redshift and wave number. We outline how the IB can be computed using a variety of analytical approaches including ones based on Effective Field Theory (EFT), halo models and models based on the Separate Universe approach. Generalizations to include tomographic bins, external data sets and Bayesian estimators are discussed. Generalizations to shear maps and construction of squeezed limits of EEE, BBB, EEB and EBB bispectra are also discussed. We also show how external data sets, e.g. $y$-parameter maps from thermal Sunyaev-Zeldovich observations, can be used to construct the squeezed limits of mixed IB involving $y$ and $\kappa$ fields. We emphasize the role of the finite volume effect in numerical estimation of the IB.
... To describe their properties, the equation of state (EOS), namely, the relationship between energy density and pressure, of neutron-rich matter is needed. Great efforts have been devoted in both nuclear physics and astrophysics to understanding the nature of neutron stars [4, 8–13]. In fact, to better constrain the underlying EOS of neutron star matter, many research facilities are currently operating, updating, or under construction around the world [14,15], such as various advanced X-ray satellites and Earth-based large telescopes, the Neutron Star Interior Composition Explorer (NICER), various gravitational wave detectors, and advanced radioactive ion beam facilities. New observations and experiments at these facilities provide us with great opportunities to address some of the controversies regarding the EOS of neutron-rich matter, especially at densities significantly higher than the saturation density ρ0 of cold nuclear matter. ...
Article
Full-text available
Extracting the equation of state (EOS) and symmetry energy of dense neutron-rich matter from astrophysical observations is a long-standing goal of nuclear astrophysics. To facilitate the realization of this goal, the feasibility of using an explicitly isospin-dependent parametric EOS for neutron star matter was investigated recently in [1, 2, 3]. In this contribution, in addition to outlining the model framework and summarizing the most important findings from [1, 2, 3], we report a few new results regarding constraining parameters characterizing the high-density behavior of nuclear symmetry energy. In particular, the constraints on the pressure of neutron star matter extracted from combining the X-ray observations of the neutron star radius, the minimum maximum mass M = 2.01 M⊙, and the causality condition agree very well with those extracted from analyzing the tidal deformability data by the LIGO + Virgo Collaborations. The limitations of using the radius and/or tidal deformability of neutron stars to constrain the high-density nuclear symmetry energy are discussed.
... Magrathea will help to answer one of the biggest questions in astrophysics and a decadal survey priority [50]: how planets form from dust in protoplanetary disks. The mission will advance our understanding of the physics underlying dust grain growth from the microscopic to the macroscopic scale. ...
Article
Full-text available
One of the least understood processes in astrophysics is the formation of planetesimals from molecules and dust within protoplanetary disks. In fact, current methods have strong limitations when it comes to modelling the full dynamics in this phase of planet formation, where small dust aggregates collide and grow into bigger clusters. That is why microgravity experiments on the phenomena involved are important to reveal the underlying physics. Because previous experiments had some limitations, in particular short durations and constrained dimensions, a new mission to study the very first stages of planet formation is proposed here. This mission, called Magrathea, is focused on creating the best conditions for developing these experiments, using a satellite with a 6 m³ test chamber. During the mission, 28 experiments are performed using different dust compositions, sizes and shapes, to better understand under which conditions dust grains stick and aggregate. Each experiment should last up to one month, with relative collision velocities of up to 5 mm/s, and initial dust sizes between 1 μm and 1 mm. At least 10⁶ collisions per experiment should be recorded, to provide statistically significant results. Based on the scientific objectives and requirements, a preliminary analysis of the payload instrumentation is performed. From that, a conceptual mission and spacecraft design is developed, together with a first approach to mission programmatics and risk analysis. The solution reached is a 1000 kg spacecraft, set on an 800 km Sun-synchronous orbit, with a total mission cost of around 438 million euros.
... Most intriguingly, the presence of an odd-parity "B-mode" pattern of polarization at large angular scales would be an unambiguous signature that our universe began with an inflationary epoch of rapid expansion, thus providing information on fundamental physics at energy scales far beyond those achievable at accelerators [1]. A believable detection (or exclusion) of this faint signal will require polarimeters of enormous sensitivity, exquisite control of polarized instrumental systematics, and clean separation of the CMB from galactic and atmospheric foregrounds. ...
Article
Full-text available
SPIDER is a balloon-borne instrument designed to map the polarization of the millimeter-wave sky at large angular scales. SPIDER targets the B-mode signature of primordial gravitational waves in the cosmic microwave background (CMB), with a focus on mapping a large sky area with high fidelity at multiple frequencies. SPIDER's first long-duration balloon (LDB) flight in January 2015 deployed a total of 2400 antenna-coupled Transition Edge Sensors (TESs) at 90 GHz and 150 GHz. In this work we review the design and in-flight performance of the SPIDER instrument, with a particular focus on the measured performance of the detectors and instrument in a space-like loading and radiation environment. SPIDER's second flight in December 2018 will incorporate payload upgrades and new receivers to map the sky at 285 GHz, providing valuable information for cleaning polarized dust emission from CMB maps.
... Assuming a single-sample readout noise of σ₁ = 3.55 e⁻ rms/pix, the applications that will benefit most from reduced readout noise will be signal dominated (r ≈ r_sig), but possess a low signal-to-noise ratio (SNR ≲ 7). One exciting science case that will operate in the very low SNR regime is space-based imaging and spectroscopy of terrestrial exoplanets in the habitable zones of nearby stars [45]. The photon flux from exo-Earths is expected to be of order 1 per several minutes, necessitating the use of ultra-low noise detectors [46]. ...
Article
Full-text available
We have developed a non-destructive readout system that uses a floating-gate amplifier on a thick, fully depleted charge coupled device (CCD) to achieve ultra-low readout noise of 0.068 e- rms/pix. This is the first time that discrete sub-electron readout noise has been achieved reproducibly over millions of pixels on a stable, large-area detector. This allows the precise counting of the number of electrons in each pixel, ranging from pixels with 0 electrons to more than 1500 electrons. The resulting CCD detector is thus an ultra-sensitive calorimeter. It is also capable of counting single photons in the optical and near-infrared regime. Implementing this innovative non-destructive readout system has a negligible impact on CCD design and fabrication, and there are nearly immediate scientific applications. As a particle detector, this CCD will have unprecedented sensitivity to low-mass dark matter particles and coherent neutrino-nucleus scattering, while astronomical applications include future direct imaging and spectroscopy of exoplanets.
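A quick consistency check (not from the paper) of why a readout noise of 0.068 e⁻ rms/pix permits discrete electron counting: with Gaussian noise, rounding the measured charge to the nearest integer misassigns a pixel only when the noise exceeds 0.5 e⁻, so the error probability is erfc(0.5/(σ√2)).

```python
import math
from scipy.special import erfc

# Misclassification probability when rounding a Gaussian-noise charge
# measurement to the nearest integer number of electrons:
#   P(error) = P(|noise| > 0.5 e-) = erfc(0.5 / (sigma * sqrt(2)))
# The two noise values are the skipper-averaged and single-sample figures
# quoted in the surrounding text.

for sigma in (0.068, 3.55):
    p_err = erfc(0.5 / (sigma * math.sqrt(2.0)))
    print(f"sigma = {sigma:5.3f} e- rms/pix -> misclassification probability ~ {p_err:.1e}")
```

At 0.068 e⁻ rms the misassignment probability is vanishingly small, which is what makes the pixel-by-pixel electron counts effectively exact.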
Chapter
The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) was formed in 2002 with support from the US National Science Foundation to design and operate research facilities for the benefit of the academic water research community. Through a series of community meetings, workshops, and pilot projects, four facilities were advanced for possible funding: hydrologic observatories (highly instrumented field sites), a hydrologic measurement facility (to provide and to develop advanced field instrumentation), a data center (to publish, to access, and to provide data), and a national center for hydrologic synthesis (a facility to promote interdisciplinary scholarship in water). Only one of these facilities, the one supporting the hydrologic information system, succeeded in attracting funding. The mixture of success and failure indicates fundamental challenges for hydrologic science that must be overcome to achieve its potential as a science and to advance scientific management of water resources.
Article
We provide a systematic study of the position-dependent correlation function in weak-lensing convergence maps and its relation to the squeezed limit of the three-point correlation function (3PCF) using state-of-the-art numerical simulations. We relate the position-dependent correlation function to its harmonic counterpart, i.e., the position-dependent power spectrum or equivalently the integrated bispectrum (IB). We use a recently proposed improved fitting function, BiHalofit, for the bispectrum to compute the theoretical predictions as a function of source redshifts. In addition to low-redshift results (z_s = 1.0–2.0) we also provide results for maps inferred from lensing of the cosmic microwave background (CMB), i.e., z_s = 1100. We include a Euclid-type realistic survey mask and noise. In agreement with the recent studies on the position-dependent power spectrum, we find that the results from simulations are consistent with the theoretical expectations when appropriate corrections are included. Performing a rough estimate, we find that the signal-to-noise ratio (S/N) for the detection of the position-dependent correlation function with a Euclid-type mask with f_sky = 0.35 can range between 6 and 12, depending on the value of the intrinsic ellipticity distribution parameter σ_ε = 0.3–1.0. For reconstructed κ maps using an ideal CMB survey the S/N ≈ 1.8. We also found that a 10% deviation in σ8 can be detected using the IB for the optimistic case of σ_ε = 0.3 with S/N ≈ 5. The S/N for such a detection in the case of Ω_M is lower.
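The position-dependent idea can be sketched on a flat-sky toy map: split the map into patches, measure a small-scale statistic and the mean in each patch, and correlate the two across patches; for a Gaussian field that correlation vanishes, so a nonzero value signals the squeezed-limit coupling studied above. The sketch below uses the patch variance rather than the full two-point correlation function, and the map size and toy spectrum are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Flat-sky toy: position-dependent statistics on a periodic Gaussian map.
# (Assumed toy spectrum and map size; the paper works with curved-sky
# convergence maps and the full position-dependent 2PCF.)
n, npatch = 512, 8                       # map size and patches per side
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
kk = np.sqrt(kx**2 + ky**2)
pk = np.zeros_like(kk)
pk[kk > 0] = kk[kk > 0] ** -2.0          # toy power spectrum P(k) ~ k^-2

white = np.fft.fft2(rng.standard_normal((n, n)))
kappa = np.real(np.fft.ifft2(white * np.sqrt(pk)))
kappa -= kappa.mean()

# In each patch, measure the local mean and the small-scale variance,
# then correlate them across patches: an integrated-bispectrum-like
# statistic (response of small-scale power to the local mean).
m = n // npatch
means, variances = [], []
for i in range(npatch):
    for j in range(npatch):
        patch = kappa[i*m:(i+1)*m, j*m:(j+1)*m]
        means.append(patch.mean())
        variances.append(np.var(patch))
means, variances = np.array(means), np.array(variances)

ib_like = np.mean(means * (variances - variances.mean()))
print("patch-mean x patch-variance correlation:", ib_like)   # ~0 for a Gaussian map
```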
Article
Full-text available
Weak gravitational lensing is a powerful statistical tool for probing the growth of cosmic structure and measuring cosmological parameters. However, as shown by studies such as Ménard et al., dust in the circumgalactic region of halos dims and reddens background sources. In a weak lensing analysis, this selects against sources behind overdense regions; since there is more structure in overdense regions, we will underestimate the amplitude of density perturbations σ8 if we do not correct for the effects of circumgalactic dust. To model the dust distribution we employ the halo model. Assuming a fiducial dust mass profile based on measurements from Ménard et al., we compute the ratio Z of the systematic error to the statistical error for a survey similar to the Nancy Grace Roman Space Telescope reference survey (2000 deg² area, single-filter effective source density 30 galaxies arcmin⁻²). For a wave band centered at 1580 nm (H band), we find that Z_H = 0.37. For a similar survey with wave band centered at 620 nm (r band), we also computed Z_r = 2.8. Within our fiducial dust model, since Z_r > 1, the systematic effect of dust will be significant on weak lensing image surveys. We also computed the dust bias on the amplitude of the power spectrum, σ8, and found it to be, for each wave band, Δσ8/σ8 = −3.1 × 10⁻⁴ (H band) or −2.2 × 10⁻³ (r band) if all other parameters are held fixed (the forecast Roman statistical-only error σ(σ8)/σ8 is 9 × 10⁻⁴).
Article
Full-text available
We introduce the response function (RF) approach to model weak lensing statistics in the context of the separate universe formalism. Numerical results for the RFs are presented for various semi-analytical models that include perturbative modelling and variants of halo models. These results extend the recent studies of the Integrated Bispectrum and Trispectrum to arbitrary order. We find that, due to the line-of-sight projection effects, the expressions for the RFs are not identical to the squeezed correlation functions of the same order. We compute the RFs in three dimensions using the spherical Fourier-Bessel formalism, which provides a natural framework for incorporating photometric redshifts, and relate these expressions to tomographic and projected statistics. We generalise the concept of the k-cut power spectrum to k-cut response functions. In addition to response functions, we also define their counterparts in real space, since they are easier to estimate from surveys with low sky coverage and non-trivial survey boundaries.
Preprint
Full-text available
The Galileo Project is the first systematic scientific research program in search of potential astro-archaeological artifacts or remnants of extraterrestrial technological civilizations (ETCs), or of potentially active equipment near Earth. Taking a path not taken, it may conceivably pick some low-hanging fruit and, without asserting probabilities, make discoveries of ETC-related objects, which would have far-reaching implications for science and our worldview.
Article
Full-text available
Cosmic Explorer is a concept for a new laser interferometric observatory in the United States to extend ground-based gravitational-wave astrophysics into the coming decades. Aiming to begin operation in the 2030s, Cosmic Explorer will extend current and future detector technologies to a 40 km interferometric baseline—ten times larger than the LIGO observatories. Operating as part of a global gravitational-wave observatory network, Cosmic Explorer will have a cosmological reach, detecting black holes and neutron stars back to the times of earliest star formation. It will observe nearby binary collisions with enough precision to reveal details of the dynamics of the ultradense matter in neutron stars and to test the general-relativistic model of black holes.
Article
Full-text available
The CALorimetric Electron Telescope (CALET) on the International Space Station consists of a high-energy cosmic-ray CALorimeter (CAL) and a lower-energy CALET Gamma-ray Burst Monitor (CGBM). CAL is sensitive to electrons up to 20 TeV, cosmic-ray nuclei from Z = 1 through Z ∼ 40, and gamma rays over the range 1 GeV–10 TeV. CGBM observes gamma rays from 7 keV to 20 MeV. The combined CAL-CGBM instrument has conducted a search for gamma-ray bursts (GRBs) since 2015 October. We report here on the results of a search for X-ray/gamma-ray counterparts to gravitational-wave events reported during the LIGO/Virgo observing run O3. No events have been detected that pass all acceptance criteria. We describe the components, performance, and triggering algorithms of the CGBM—the two Hard X-ray Monitors consisting of LaBr3(Ce) scintillators sensitive to 7 keV–1 MeV gamma rays and a Soft Gamma-ray Monitor BGO scintillator sensitive to 40 keV–20 MeV—and the high-energy CAL consisting of a charge detection module, imaging calorimeter, and the fully active total absorption calorimeter. The analysis procedure is described and upper limits to the time-averaged fluxes are presented.
Chapter
New opportunities are being enabled by state‐of‐the‐art instrumentation to search for technosignatures (TS) in time‐series photometric observations. We review existing proposals for detectable TS around extrasolar planets or their parent stars and discuss them in the context of the “axes of merit”. We briefly describe the methods by which such signals could be detected and how challenging that would be with current or upcoming technology.
Preprint
The shear signal required for weak lensing analyses is small, so any detector-level effects which distort astronomical images can contaminate the inferred shear. The Nancy Grace Roman Space Telescope (Roman) will fly a focal plane with 18 Teledyne H4RG-10 near infrared (IR) detector arrays; these have never been used for weak lensing and they present unique instrument calibration challenges. A pair of previous investigations (Hirata & Choi 2020; Choi & Hirata 2020) demonstrated that spatiotemporal correlations of flat fields can effectively separate the brighter-fatter effect (BFE) and interpixel capacitance (IPC). Later work (Freudenburg et al. 2020) introduced a Fourier-space treatment of these correlations which allowed the authors to expand to higher orders in BFE, IPC, and classical nonlinearity (CNL). This work expands the previous formalism to include quantum yield and charge diffusion. We test the updated formalism on simulations and show that we can recover input characterization values. We then apply the formalism to visible and IR flat field data from three Roman flight candidate detectors. We fit a 2D Gaussian model to the charge diffusion at 0.5 $\mu$m wavelength, and find variances of $C_{11} = 0.1066\pm 0.0011$ pix$^2$ in the horizontal direction, $C_{22} = 0.1136\pm 0.0012$ pix$^2$ in the vertical direction, and a covariance of $C_{12} = 0.0001\pm 0.0007$ pix$^2$ (stat) for SCA 20829. Last, we convert the asymmetry of the charge diffusion into an equivalent shear signal, and find a contamination of the shear correlation function to be $\xi_+ \sim 10^{-6}$ for each detector. This exceeds Roman's allotted error budget for the measurement by a factor of $\mathcal{O}(10)$ in power (amplitude squared) but can likely be mitigated through standard methods for fitting the point spread function (PSF) since charge diffusion can be treated as a contribution to the PSF.
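Using the covariances quoted above, the asymmetry of the charge-diffusion kernel can be expressed as an ellipticity with the standard unweighted second-moment definition; this is only an illustrative conversion, since the paper's shear-contamination estimate depends on the full PSF model and its own conventions.

```python
# Illustrative conversion (standard unweighted-moment definition, not
# necessarily the paper's exact convention): ellipticity of the 2D Gaussian
# charge-diffusion kernel from the covariances quoted for SCA 20829.
C11, C22, C12 = 0.1066, 0.1136, 0.0001   # pix^2, values from the abstract

trace = C11 + C22
e1 = (C11 - C22) / trace                 # elongation along the pixel axes
e2 = 2.0 * C12 / trace                   # elongation along the diagonals

print(f"kernel ellipticity: e1 = {e1:+.4f}, e2 = {e2:+.4f}")
# The induced shear contamination is diluted by the rest of the PSF (the
# kernel contributes in proportion to its share of the total PSF second
# moments), which is why PSF modelling can absorb much of this effect.
```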
Article
We report studies on the mitigation of optical effects of bright low-Earth-orbit (LEO) satellites on Vera C. Rubin Observatory and its Legacy Survey of Space and Time (LSST). These include options for pointing the telescope to avoid satellites, laboratory investigations of bright trails on the Rubin Observatory LSST camera sensors, algorithms for correcting image artifacts caused by bright trails, experiments on darkening SpaceX Starlink satellites, and ground-based follow-up observations. The original Starlink v0.9 satellites are g ∼ 4.5 mag, and the initial experiment “DarkSat” is g ∼ 6.1 mag. Future Starlink darkening plans may reach g ∼ 7 mag, a brightness level that enables nonlinear image artifact correction to well below background noise. However, the satellite trails will still exist at a signal-to-noise ratio ∼ 100, generating systematic errors that may impact data analysis and limit some science. For the Rubin Observatory 8.4 m mirror and a satellite at 550 km, the full width at half maximum of the trail is about 3″ as the result of an out-of-focus effect, which helps avoid saturation by decreasing the peak surface brightness of the trail. For 48,000 LEOsats of apparent magnitude 4.5, about 1% of pixels in LSST nautical twilight images would need to be masked. © 2020. The American Astronomical Society. All rights reserved.
Article
Recent studies have demonstrated that secondary non-Gaussianity induced by gravity will be detected with a high signal-to-noise ratio (S/N) by future and even by on-going weak lensing surveys. One way to characterize such non-Gaussianity is through the detection of a non-zero three-point correlation function of the lensing convergence field, or of its harmonic transform, the bispectrum. A recent study analysed the properties of the squeezed configuration of the bispectrum, when two wavenumbers are much larger than the third one. We extend this work by estimating the amplitude of the (reduced) bispectrum in four generic configurations, i.e. squeezed, equilateral, isosceles and folded, and for four different source redshifts zs = 0.5, 1.0, 1.5, 2.0, by using an ensemble of all-sky high-resolution simulations. We compare these results against theoretical predictions. We find that, while the theoretical expectations based on widely used fitting functions can predict the general trends of the reduced bispectra, a more accurate theoretical modelling will be required to analyse the next generation of all-sky weak lensing surveys. The disagreement is particularly pronounced in the squeezed limit.
Article
Full-text available
We introduce the skew-spectrum statistic for weak lensing convergence κ maps and test it against state-of-the-art high-resolution all-sky numerical simulations. We perform the analysis as a function of source redshift and smoothing angular scale for individual tomographic bins. We also analyse the cross-correlation between different tomographic bins. We compare the numerical results to fitting-functions used to model the bispectrum of the underlying density field as a function of redshift and scale. We derive a closed form expression for the skew-spectrum for gravity-induced secondary non-Gaussianity. We also compute the skew-spectrum for the projected κ inferred from Cosmic Microwave Background (CMB) studies. As opposed to the low redshift case we find the post-Born corrections to be important in the modelling of the skew-spectrum for such studies. We show how the presence of a mask and noise can be incorporated in the estimation of a skew-spectrum.
Preprint
We compute the low-$\ell$ limit of the family of higher-order spectra for projected (2D) weak lensing convergence maps. In this limit, these spectra are computed to an arbitrary order using {\em tree-level} perturbative calculations. We use the flat-sky approximation and Eulerian perturbative results based on a generating function approach. We test these results for the lower-order members of this family, i.e. the skew- and kurt-spectra against state-of-the-art simulated all-sky weak lensing convergence maps and find our results to be in very good agreement. We also show how these spectra can be computed in the presence of a realistic sky-mask and Gaussian noise. We generalize these results to three-dimensions (3D) and compute the {\em equal-time} higher-order spectra. These results will be valuable in analyzing higher-order statistics from future all-sky weak lensing surveys such as the {\em Euclid} survey at low-$\ell$ modes. As illustrative examples, we compute these statistics in the context of the {\em Horndeski} and {\em Beyond Horndeski} theories of modified gravity. They will be especially useful in constraining theories such as the Gleyzes-Langlois-Piazza-Vernizzi (GLPV) theories and Degenerate Higher-Order Scalar-Tensor (DHOST) theories as well as the commonly used normal-branch of Dvali-Gabadadze-Porrati (nDGP) model, clustering quintessence models, and scenarios with massive neutrinos.
Article
Full-text available
Astronomy is considered by many to be a gateway science owing to its ability to inspire curiosity in everyone irrespective of age, culture, or general inclination towards science. At a time when there is a global push to get more students engaged in Science, Technology, Engineering, and Mathematics, astronomy provides an invaluable conduit to achieve this shift. This paper highlights the results of a study which has reviewed the presence and extent to which astronomy has been incorporated into the school curricula of the Organisation for Economic Co-operation and Development (OECD) member countries. In addition, two other countries strong in astronomy research, China and South Africa, are included, together with the International Baccalaureate Diploma science curriculum. A total of 52 curricula from 37 countries were reviewed. The results reveal that astronomy and its related topics are prevalent in at least one grade in all curricula. Of the 52 curricula, 44 of them had astronomy-related topics in grade 6, 40 introduced astronomy-related topics in grade 1, whilst 14 had astronomy-related topics explicitly mentioned in all grades. At all year levels, celestial motion is the dominant content area; however, topics such as stars, physics, cosmology, and planetary science become much more frequent as a proportion towards the higher year levels. The most common keywords employed in the curricula related to basic astronomy concepts were the Earth, Sun, Moon, and stars, all with a high frequency of use. There is hardly any focus on Indigenous Astronomy or the role of prominent women astronomers. Relational textual analysis using Leximancer revealed that all the major concepts could be encompassed within two broad themes: Earth and Physics. Astronomy and Physics are often seen as different domains, with astronomy content being more fact-based than concept-based.
Preprint
Full-text available
Recent studies have demonstrated that {\em secondary} non-Gaussianity induced by gravity will be detected with a high signal-to-noise (S/N) by future and even by on-going weak lensing surveys. One way to characterise such non-Gaussianity is through the detection of a non-zero three-point correlation function of the lensing convergence field, or of its harmonic transform, the bispectrum. A recent study analysed the properties of the squeezed configuration of the bispectrum, when two wavenumbers are much larger than the third one. We extend this work by estimating the amplitude of the (reduced) bispectrum in four generic configurations, i.e., {\em squeezed, equilateral, isosceles} and {\em folded}, and for four different source redshifts $z_s=0.5,1.0,1.5,2.0$, by using an ensemble of all-sky high-resolution simulations. We compare these results against theoretical predictions. We find that, while the theoretical expectations based on widely used fitting functions can predict the general trends of the reduced bispectra, a more accurate theoretical modelling will be required to analyse the next generation of all-sky weak lensing surveys. The disagreement is particularly pronounced in the squeezed limit.
Article
Astrophysical observations currently provide the only robust, empirical measurements of dark matter. In the coming decade, astrophysical observations will guide other experimental efforts, while simultaneously probing unique regions of dark matter parameter space. This white paper summarizes astrophysical observations that can constrain the fundamental physics of dark matter in the era of LSST. We describe how astrophysical observations will inform our understanding of the fundamental properties of dark matter, such as particle mass, self-interaction strength, non-gravitational interactions with the Standard Model, and compact object abundances. Additionally, we highlight theoretical work and experimental/observational facilities that will complement LSST to strengthen our understanding of the fundamental characteristics of dark matter.
Preprint
We have conducted a data study of leadership and participation in NASA's Astrophysics Explorer-class missions for the nine solicitations issued during the period 2008-2016, using gender as a marker of diversity. During this time, 102 Principal Investigators (PIs) submitted Explorer-class proposals; only four of these PIs were female. Among the 102 PIs, there were 61 unique PIs overall; of these, just three were female. The percentage of females in science teams in these proposals ranges from a low of 10% to a high of 19% across the various solicitations. Combining data from all these Explorer-class proposals, we find that the overall participation by females in science teams is 14%. Eighteen of the Explorer-class proposals had zero females in science roles, and this includes science teams with as many as 28 members. These results demonstrate that participation by women in the leadership of and, in many cases, on the science teams of proposals for Explorer-class missions is well below the representation of women in astronomy and astrophysics as a whole. In this white paper, we present our data and a discussion of our results, their context, and the ramifications for consideration by Astro2020 in its study of the state of the profession.
Preprint
The Large Synoptic Survey Telescope (LSST) can advance scientific frontiers beyond its groundbreaking 10-year survey. Here we explore opportunities for extended operations with proposal-based observing strategies, new filters, or transformed instrumentation. We recommend the development of a mid-decade community- and science-driven process to define next-generation LSST capabilities.
Preprint
Full-text available
Astrophysical observations currently provide the only robust, empirical measurements of dark matter. In the coming decade, astrophysical observations will guide other experimental efforts, while simultaneously probing unique regions of dark matter parameter space. This white paper summarizes astrophysical observations that can constrain the fundamental physics of dark matter in the era of LSST. We describe how astrophysical observations will inform our understanding of the fundamental properties of dark matter, such as particle mass, self-interaction strength, non-gravitational interactions with the Standard Model, and compact object abundances. Additionally, we highlight theoretical work and experimental/observational facilities that will complement LSST to strengthen our understanding of the fundamental characteristics of dark matter.
Article
In this dissertation, we first present an analysis of the effect of wind at the Blanco Telescope, the home of the Dark Energy Camera (DECam), on Dark Energy Survey (DES) image quality. We find it to have a likely negligible impact on the weak gravitational lensing measurements conducted with images taken during high wind. We then present the methods and validation of two new techniques in weak lensing shear and magnification measurement. We demonstrate highly accurate recovery of weak gravitational lensing shear using an implementation of the Bayesian Fourier Domain (BFD) method, proposed by Bernstein and Armstrong (2014), extended to correct for selection biases. The BFD formalism is rigorously correct for Nyquist-sampled, background-limited, uncrowded images of background galaxies. We conduct initial tests of this code on ≈10⁹ simulated lensed galaxy images and recover the simulated shear to a fractional accuracy of m = (2.1 ± 0.4) × 10⁻³, substantially more accurate than has been demonstrated previously for any generally applicable shear measurement method. We also introduce a new Bayesian method for selecting high-redshift galaxies and calculating their magnification around foreground lenses. We apply this method to galaxies from DES Science Verification (SV). Finally, we share the results of a survey conducted with DES collaborators on the collaboration itself, in which we find positive attitudes towards education and public outreach (EPO) in physics and astronomy. We also provide recommendations for current and future surveys on how to increase EPO engagement by scientists.
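The quoted fractional accuracy m corresponds to the multiplicative term of the standard shear-calibration model g_obs = (1 + m) g_true + c; a minimal sketch of recovering m and c from paired true/recovered shears by linear least squares is shown below (synthetic data with assumed values, not the BFD pipeline itself).

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard shear-calibration model:  g_obs = (1 + m) * g_true + c.
# Minimal least-squares estimate of m and c from synthetic shear pairs;
# the input m, c, and noise level are assumed toy values.
g_true = rng.uniform(-0.05, 0.05, size=10_000)
m_in, c_in, noise = 2e-3, 1e-4, 5e-4
g_obs = (1.0 + m_in) * g_true + c_in + noise * rng.standard_normal(g_true.size)

A = np.vstack([g_true, np.ones_like(g_true)]).T      # design matrix [g_true, 1]
(slope, c_hat), *_ = np.linalg.lstsq(A, g_obs, rcond=None)
m_hat = slope - 1.0

print(f"recovered m = {m_hat:.2e}, c = {c_hat:.2e}")
```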
Conference Paper
The coronagraphic instrument (CGI) on the Wide-Field Infrared Survey Telescope (WFIRST) will demonstrate technologies and methods for high-contrast direct imaging and spectroscopy of exoplanet systems in reflected light, including polarimetry of circumstellar disks. The WFIRST management and CGI engineering and science investigation teams have developed requirements for the instrument, motivated by the objectives and technology development needs of potential future flagship exoplanet characterization missions such as the NASA Habitable Exoplanet Imaging Mission (HabEx) and the Large UV/Optical/IR Surveyor (LUVOIR). The requirements have been refined to support recommendations from the WFIRST Independent External Technical/Management/Cost Review (WIETR) that the WFIRST CGI be classified as a technology demonstration instrument instead of a science instrument. This paper provides a description of how the CGI requirements flow from the top of the overall WFIRST mission structure through the Level 2 requirements, with a focus on capturing the detailed context and rationales for the CGI Level 2 requirements. The WFIRST requirements flow starts with the top Program Level Requirements Appendix (PLRA), which contains both high-level mission objectives and the CGI-specific baseline technical and data requirements (BTR and BDR, respectively)... We also present the process and collaborative tools used in the L2 requirements development and management, including the collection and organization of science inputs, an open-source approach to managing the requirements database, and automating documentation. The tools created for the CGI L2 requirements have the potential to improve the design and planning of other projects, streamlining requirement management and maintenance. [Abstract Abbreviated]
Article
Current time domain facilities are finding several hundred transient astronomical events a year. The discovery rate is expected to increase in the future as new surveys such as the Zwicky Transient Facility (ZTF) and the Large Synoptic Survey Telescope (LSST) come on line. At the present time, the rate at which transients are classified is approximately one order of magnitude lower than the discovery rate, leading to an increasing "follow-up drought". Existing telescopes with moderate aperture can help address this deficit when equipped with spectrographs optimized for spectral classification. Here, we provide an overview of the design, operations, and first results of the Spectral Energy Distribution Machine (SEDM), operating on the Palomar 60-inch telescope (P60). The instrument is optimized for classification and high observing efficiency. It combines a low-resolution (R ∼ 100) integral field unit (IFU) spectrograph with the "Rainbow Camera" (RC), a multi-band field-acquisition camera which also serves as a multi-band (ugri) photometer. The SEDM was commissioned during the operation of the intermediate Palomar Transient Factory (iPTF) and has already lived up to its promise. The success of the SEDM demonstrates the value of spectrographs optimized for spectral classification. Introduction of similar spectrographs on existing telescopes will help alleviate the follow-up drought and thereby accelerate the rate of discoveries.
Article
The U.S. astronomy decadal surveys have been models for advice to government on how to apportion resources to optimise the scientific return on national investments in facilities and manpower. The U.S. is now gearing up to conduct its 2020 survey and the results are likely to guide international astronomy far into the future. Here, I summarize the current strains in an otherwise world-leading program of ground- and space-based astronomical discovery and some of the issues that will be faced by the participants in this upcoming collective exercise.
Article
The Wide Field InfraRed Survey Telescope (WFIRST) was the highest ranked large space-based mission of the 2010 New Worlds, New Horizons decadal survey. It is now a NASA mission in formulation with a planned launch in the mid-2020s. A primary mission objective is to precisely constrain the nature of dark energy through multiple probes, including Type Ia supernovae (SNe Ia). Here, we present the first realistic simulations of the WFIRST SN survey based on current hardware specifications and using open-source tools. We simulate SN light curves and spectra as viewed by the WFIRST wide-field channel (WFC) imager and integral field channel (IFC) spectrometer, respectively. We examine 11 survey strategies with different time allocations between the WFC and IFC, two of which are based upon the strategy described by the WFIRST Science Definition Team, which measures SN distances exclusively from IFC data. We propagate statistical and, crucially, systematic uncertainties to predict the Dark Energy Task Force figure of merit (DETF FoM) for each strategy. The increase in FoM values with SN search area is limited by the overhead times for each exposure. For IFC-focused strategies the largest individual systematic uncertainty is the wavelength-dependent calibration uncertainty, whereas for WFC-focused strategies, it is the intrinsic scatter uncertainty. We find that the best IFC-focused and WFC-exclusive strategies have comparable FoM values. Even without improvements to other cosmological probes, the WFIRST SN survey has the potential to increase the FoM by more than an order of magnitude from the current values. Although the survey strategies presented here have not been fully optimized, these initial investigations are an important step in the development of the final hardware design and implementation of the WFIRST mission.
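The DETF figure of merit referenced above is conventionally the inverse area of the error ellipse in the (w0, wa) plane, often evaluated (up to a constant) as 1/sqrt(det Cov(w0, wa)); a small sketch with an assumed covariance matrix, not a WFIRST/Roman forecast, follows.

```python
import numpy as np

# DETF figure of merit, taken here in one common convention as the inverse
# square root of the determinant of the (w0, wa) covariance matrix, i.e.
# inversely proportional to the area of the error ellipse.
# The covariance below is an assumed example, not a mission forecast.
sig_w0, sig_wa, rho = 0.02, 0.30, -0.7
cov_w0_wa = np.array([[sig_w0**2,            rho * sig_w0 * sig_wa],
                      [rho * sig_w0 * sig_wa, sig_wa**2           ]])

fom = 1.0 / np.sqrt(np.linalg.det(cov_w0_wa))
print(f"DETF FoM ~ {fom:.0f}")
```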
Article
Infrastructures are not inherently durable or fragile, yet all are fragile over the long term. Durability requires care and maintenance of individual components and the links between them. Astronomy is an ideal domain in which to study knowledge infrastructures, due to its long history, transparency, and accumulation of observational data over a period of centuries. Research reported here draws upon a long-term study of scientific data practices to ask questions about the durability and fragility of infrastructures for data in astronomy. Methods include interviews, ethnography, and document analysis. As astronomy has become a digital science, the community has invested in shared instruments, data standards, digital archives, metadata and discovery services, and other relatively durable infrastructure components. Several features of data practices in astronomy contribute to the fragility of that infrastructure. These include different archiving practices between ground- and space-based missions, between sky surveys and investigator-led projects, and between observational and simulated data. Infrastructure components are tightly coupled, based on international agreements. However, the durability of these infrastructures relies on much invisible work: cataloging, metadata, and other labor conducted by information professionals. Continual investments in care and maintenance of the human and technical components of these infrastructures are necessary for sustainability.
Article
Transition metals (TM) are proposed to play a role in astrophysical environments in both gas-phase and solid-state astrochemistry by co-determining the homogeneous/heterogeneous chemistry represented by gas/gas and gas/dust-grain interactions. Their chemistry is a function of temperature, radiation field, and chemical composition/coordination sphere and, as a consequence, depends on the astrophysical object where the TM are localized. Here five main categories of TM compounds are proposed and classified as: a) pure bulk and clusters; b) TM naked ions; c) TM oxides/minerals or inorganic compounds; d) TM-L (L = ligand) with L = (σ and/or π)-donor/acceptor species like H/H2, N/N2, CO, H2O; and e) TM-organoligands such as Cp, PAH, R1=°=°=R2. Each of the classes is correlated with its possible localization within astrophysical objects. Because of this variety, coupled with their ability to modulate reactivity and regio/enantioselectivity through the composition of the ligand sphere, TM compounds can introduce fine organic synthesis into astrochemistry. For the selection of small TM parental compounds to be analyzed as first examples, the TM and the second element/molecule are constrained on the basis of their cosmic abundances and mutual reactivity; Fe atoms coupled with N and CO are studied by developing the chemistry of [FeN]+, [FeNH]+ and [(CO)2FeN]+. These molecules, owing to their ability to perform C-C and C-H bond activations, are able to open the pathway toward the nitrogenation/amination and carbonylation of organic substrates. Considering the simplest organic substrate, CH4, the parental reaction schemes (gas phase, T = 30 K): I) [FeN]+ + CH4 + H → [Fe]+ + H3C-NH2; II) [FeNH]+ + CH4 → [Fe]+ + H3C-NH2; and III) [(CO)2FeN]+ + H → [FeCO]+ + HNCO are analyzed by theoretical methods (B2PLYP double-hybrid functional/TZVPPP basis set). All reactions are thermodynamically favored, and the first-step transition states can follow a minimal energy path by spin crossing, while H extraction in reaction II shows very high activation energies. The need to overcome high activation energy barriers underlines the importance of molecular activation by radiation and particle collisions. TM chemistry is expected to contribute to the known synthesis of organic compounds in space, leading toward a new direction in astrochemistry whose qualitative (type of compounds) and quantitative contribution must be unraveled.