Found 76 talks archived in Cosmology
On March 17 the team responsible for the BICEP2 experiment, a CMB telescope located at the South Pole, announced the detection of the primordial B-mode signal in the CMB polarization. This discovery immediately had a well-deserved worldwide impact in the media. In fact, it is the first observational confirmation of a prediction of the inflationary model, which was proposed in the early 1980s as a solution to some inconsistencies of the Big Bang model. In this talk I will put this discovery in the context of CMB research, with a historical perspective. I will emphasize the importance of this discovery for Cosmology and for Fundamental Physics, and will finally comment on the prospects for the future, in particular the role of experiments like QUIJOTE that have to confirm this signal.
The accelerated expansion of the Universe, discovered in the late 1990s, has opened one of the most intriguing questions of modern physics. To help understand its origin, and to measure the expansion history of the Universe, large galaxy spectroscopic surveys are being carried out and planned for the future. In this talk, I will review the Baryon Oscillation Spectroscopic Survey (BOSS) and the requirements to achieve its precise results. I will then describe a sample of large-volume, high-resolution N-body simulations available in the MultiDark database that are useful for testing the models. Finally, I will present some work I have been doing aimed at producing a large number of mock galaxy catalogs using an improved Lagrangian perturbation theory calibrated with these simulations. Mock galaxy catalogs are essential for producing reliable cosmological constraints from these surveys.
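For background (standard notation, not the specific improved scheme used in this work, which the abstract does not spell out), Lagrangian perturbation theory displaces particles from their initial positions q by a displacement field; the first-order solution is the Zel'dovich approximation, and second order (2LPT) adds a correction:

```latex
\mathbf{x}(\mathbf{q},t) = \mathbf{q} + \boldsymbol{\Psi}(\mathbf{q},t),
\qquad
\boldsymbol{\Psi} \simeq -D_1(t)\,\nabla_q \phi^{(1)}(\mathbf{q})
                        + D_2(t)\,\nabla_q \phi^{(2)}(\mathbf{q}),
```

where $D_1$ is the linear growth factor, the potentials $\phi^{(1,2)}$ are sourced by the initial density field, and $D_2 \simeq -\tfrac{3}{7} D_1^2$ in $\Lambda$CDM-like cosmologies. Mocks built this way are cheap enough to generate by the thousands, which is what covariance estimation for BOSS-like surveys requires.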
The next generation of CMB experiments will require a large number of detectors (a few tens of thousands) in order to tackle the challenging detection of primordial polarization B modes. Furthermore, high-resolution experiments are needed for a detailed study of high-redshift objects including clusters of galaxies, proto-clusters and dusty galaxies. Within this context, Kinetic Inductance Detectors (KIDs) are a serious alternative to bolometers at millimetre wavelengths. Indeed, KIDs are naturally multiplexed and compact, allowing us to construct arrays of thousands of detectors. Furthermore, KIDs have short time constants (below 1 ms) and have been demonstrated to be background-limited in ground-based observations. The NIKA camera, made of two matrices (200 KIDs each) operating at 140 and 240 GHz, has been installed successfully at the IRAM 30 m telescope in Pico Veleta, Granada. NIKA has provided the first scientific-quality astrophysical observations ever obtained with KIDs. In particular, RXJ1347.5-1145, a massive intermediate-redshift galaxy cluster at z = 0.4516 undergoing a merging event, has been successfully mapped at 12 arcsec resolution by NIKA. NIKA is a general-purpose camera and can also be used for other astrophysical objectives, including, for example, observations of high-redshift galaxies and proto-clusters, and detailed intensity and polarisation mapping of star-forming regions in the Galaxy. NIKA is a prototype of the NIKA2 camera, which should be installed in 2015 at the IRAM 30 m telescope. NIKA2 should have two frequency bands, at 150 and 250 GHz, with about 5000 detectors in total and polarisation capabilities. NIKA2 will be well suited for in-depth studies of the intra-cluster medium in intermediate- to high-redshift clusters and for the follow-up of clusters and proto-clusters newly discovered by the Planck satellite. Finally, we discuss the possibility of including KIDs in the next generation of CMB satellites, such as PRISM.
About half the baryons in the local Universe could be in the form of a Warm-Hot Intergalactic Medium (WHIM). If a large fraction of this gas is ionized, it could produce significant temperature anisotropies in the Cosmic Microwave Background (CMB), generated by the thermal and also the kinematic Sunyaev-Zeldovich effect. We have developed a theoretical framework to describe the mildly non-linear regime of the WHIM that allows us to compute its contribution to CMB anisotropies. We discuss prospective ways of detecting the WHIM contribution using our formalism, and present our results on Planck data and the constraints we set on the WHIM parameters.
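For reference, the two Sunyaev-Zeldovich effects mentioned above take the standard (non-relativistic) forms:

```latex
\left.\frac{\Delta T}{T}\right|_{\mathrm{tSZ}} = f(x)\,y,
\qquad
y = \frac{\sigma_T}{m_e c^2}\int n_e\,k_B T_e\,\mathrm{d}l,
\qquad
f(x) = x\coth\!\left(\frac{x}{2}\right) - 4,
\quad
x = \frac{h\nu}{k_B T_{\mathrm{CMB}}},
\qquad
\left.\frac{\Delta T}{T}\right|_{\mathrm{kSZ}} = -\,\sigma_T \int n_e\,\frac{v_r}{c}\,\mathrm{d}l,
```

where $y$ is the Comptonization parameter and $v_r$ the radial peculiar velocity of the gas. For WHIM temperatures both signals are small per line of sight, which is why a statistical framework for the mildly non-linear regime is needed.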
Twenty years ago, no one convincingly knew the age or the size of the
Universe to within a factor of two. Ten years ago, everyone agreed on
those same two numbers to within 10%. Today, we arguably have brought
the errors down by another factor of two. But that has led to anxiety
rather than euphoria, renewed interest rather than complacency. The
problem is that there are now two independent, competing methods
giving answers of comparable precision and accuracy:
one is a model-based method using the cosmic microwave background
(the CMB), the other is a geometric, parallax-based method using local
measures of distances and expansion velocities. To within about
two-sigma the methods agree. To within about two-sigma the methods
disagree. And basic physics (a fourth neutrino species, perhaps) hangs
in the balance.
I will discuss how this "tension" arose and how it will soon be
relieved. A tie-breaker has been identified and developed, and it is
now being worked on from the ground and from space.
Two main families of models explain the fact that, at least in appearance, something like 90% of the mass of the Universe remains undetected. One (supported by an overwhelmingly large fraction of the community) is the dark matter model, in which the missing mass is postulated to be made of exotic non-baryonic particles. The other is to modify gravity (MOND, MOG, ...) in such a way that it compensates for the apparent lack of mass. Both approaches are purely ad hoc and so far not based on first principles of fundamental physics. Since I am not a specialist in dark matter or modified gravity, the talk I am proposing is intended to be given purely from a philosophical/sociological/historical point of view. I expect the talk to be an open debate. The philosophical thesis I will defend is that the order in which some astronomical landmarks were discovered has led the community to favour the dark matter model. In my opinion, this has caused dark matter to receive larger funding and become more successful at describing reality than alternative models. I will try to show the audience that, from a purely philosophical point of view, the dark matter model and the modified gravity models are equally speculative and equally (in)valid. I will make the point that dark matter has to be taken only as an extremely complex model which is useful for the description of reality, and not as reality itself.
DESI is a massively multiplexed fiber-fed spectrograph that will make the next
major advance in dark energy in the timeframe 2018-2022. On the Mayall
telescope, DESI will obtain spectra and redshifts for tens of millions of
galaxies and quasars with 5,000 fiber positioner robots, constructing a
3-dimensional map spanning the nearby universe to 10 billion light years. DESI
is supported by the US Department of Energy Office of Science to perform this
Stage IV dark energy measurement using baryon acoustic oscillations and other
techniques that rely on spectroscopic measurements. Spain has a major role in
DESI with the construction of the Focal Plate and the development of the fiber
positioners. I will give an overview of the DESI science, the instrument, and the
Spanish participation in the project.
I will discuss a new, open-source astronomical image-fitting program, specialized for galaxies, which is fast, flexible, and highly extensible. A key characteristic is an object-oriented design which allows new types of image components (2D surface-brightness functions) to be easily written and added to the program. Image functions provided with the program include the usual suspects for galaxy decompositions (Sersic, exponential, Gaussian), along with Core-Sersic and broken-exponential profiles, elliptical rings, and components which perform line-of-sight integration through 3D luminosity-density models of disks and rings seen at arbitrary inclinations. Minimization can be done using the standard chi^2 statistic (using either data or model values to estimate per-pixel errors) or the Cash statistic, which is appropriate for Poisson data in low-count regimes; different minimization algorithms allow trade-offs between speed and decreased sensitivity to local minima in the fit landscape. I will also show that fitting low-S/N galaxy images by minimizing chi^2 can lead to significant biases in fitted parameter values, which are avoided if the Cash statistic is used; this is true even when Gaussian read noise is present.
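As a toy illustration of the bias mentioned above (my own sketch, not code from the program being presented), here is a minimal comparison of chi^2 with data-based errors against the Cash statistic when fitting a constant level to low-count Poisson data:

```python
import numpy as np

# Simulated low-count "image": Poisson pixels with true mean 2.0
rng = np.random.default_rng(0)
data = rng.poisson(lam=2.0, size=20000).astype(float)

def chi2(data, model, errors):
    """Standard chi^2 with per-pixel Gaussian errors."""
    return np.sum(((data - model) / errors) ** 2)

def cash(data, model):
    """Cash (1979) statistic for Poisson data, up to a model-independent constant."""
    return 2.0 * np.sum(model - data * np.log(model))

# Data-based errors: sigma_i^2 = max(d_i, 1), a common but biased choice
errors = np.sqrt(np.maximum(data, 1.0))

# Brute-force fit of a constant model over a grid of candidate levels
grid = np.linspace(0.5, 4.0, 701)
mu_chi2 = grid[np.argmin([chi2(data, m, errors) for m in grid])]
mu_cash = grid[np.argmin([cash(data, m) for m in grid])]

print(f"true mean 2.0, chi^2 fit {mu_chi2:.2f}, Cash fit {mu_cash:.2f}")
```

The chi^2 fit comes out noticeably below the true mean, because pixels with the fewest counts receive the largest weights, while the Cash fit recovers the sample mean.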
Dark matter makes up most of the mass of the Universe but remains mysterious. I discuss recent progress in constraining its properties by measuring its distribution in the Universe, from tiny dwarf galaxies to giant galaxy clusters, and comparing this with numerical simulations. The latest results favour a cold, collisionless particle that must lie beyond the standard model of particle physics. I discuss the known small-scale problems with this model, the cusp-core and missing-satellites problems, and argue that these are likely due to baryonic "feedback" during galaxy formation. I conclude with a discussion of experiments underway to detect dark matter particles, and the role that astrophysics has to play in these too. There is an exciting and very real prospect of detecting a dark matter particle in the next five years.
Over the past decade there has been a growing body of evidence for a closely regulated balance of heating and cooling of the intracluster medium in the cores of clusters. I will review this evidence with a particular emphasis on the role of cold gas and dust as the fuel for AGN feedback that dominates these systems.
- Design of the control system of the Four Laser Guide Star Facility of VLT. Ivan Guidolin / Mauro Comin. Tuesday February 18, 2020, 12:00 (Aula)
- Big data, big responsibility: reproducible, archivable and branchable pipelines. Dr. Mohammad Akhlaghi, Raúl Infante-Sainz. Thursday February 20, 2020, 10:30 (Aula)