Found 19 talks archived in Computing
( * Please bring your laptop or your smartphone with you * )
Astronomers are not only skilled computer users but also experienced Internet users, and their work could benefit from the latest improvements in web technology. In particular, new client-side, server-side, and database technologies can be used to set up:
1) high-performance data-mining applications;
2) multi-threaded CPU/GPU simulation or data-reduction software;
3) efficient ways to communicate results or to exchange data;
4) modern web-based tools for astronomical research and outreach.
After a quick introduction and a brief history of the evolution of web technologies over the last 15 years, we will present interactive examples of the above-mentioned technologies, focusing on client-side innovations.
I will discuss a new, open-source astronomical image-fitting program, specialized for galaxies, which is fast, flexible, and highly extensible. A key characteristic is an object-oriented design which allows new types of image components (2D surface-brightness functions) to be easily written and added to the program. Image functions provided with the program include the usual suspects for galaxy decompositions (Sersic, exponential, Gaussian), along with Core-Sersic and broken-exponential profiles, elliptical rings, and components which perform line-of-sight integration through 3D luminosity-density models of disks and rings seen at arbitrary inclinations. Minimization can be done using the standard chi^2 statistic (using either data or model values to estimate per-pixel errors) or the Cash statistic, which is appropriate for Poisson data in low-count regimes; different minimization algorithms allow trade-offs between speed and decreased sensitivity to local minima in the fit landscape. I will also show that fitting low-S/N galaxy images by minimizing chi^2 can lead to significant biases in fitted parameter values, which are avoided if the Cash statistic is used; this is true even when Gaussian read noise is present.
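The contrast the abstract draws between the chi^2 and Cash statistics can be sketched in a few lines of Python. This is a generic illustration, not the program's actual API or code; the function names are ours, and for chi^2 the per-pixel Gaussian errors are assumed to be supplied externally (e.g. estimated from the data or the model), while the Cash statistic needs only the data and model values:

```python
import math

def chi2(data, model, errors):
    """Standard chi^2: sum of squared, error-normalized residuals.
    'errors' are per-pixel Gaussian sigmas estimated from data or model."""
    return sum((d - m) ** 2 / e ** 2 for d, m, e in zip(data, model, errors))

def cash(data, model):
    """Cash statistic C = 2 * sum(m - d*ln(m)), suitable for Poisson data
    in low-count regimes (terms depending only on the data are dropped,
    since they do not affect the minimization)."""
    return 2.0 * sum(m - d * math.log(m) for d, m in zip(data, model))
```

Minimizing chi^2 implicitly assumes Gaussian per-pixel errors, which breaks down at low counts; minimizing the Cash statistic uses the Poisson likelihood directly, which is why it avoids the low-S/N biases mentioned above.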
Many images (and signals) admit sparse representations, in the sense that they are well approximated by linear combinations of a small number of functions taken from known sets. The topic of sparse and redundant representations, often termed sparse regression or sparse coding, has attracted tremendous interest from the research community over the last ten years. This interest stems from the role that low-dimensional models play in many signal- and image-processing areas, such as compression, restoration, classification, and the design of priors and regularizers, to name a few. In this talk we use sparse approximations for the phase and magnitude of a complex-valued wavefield. While our techniques are quite general, here they are illustrated on phase-shifting interferometry measurements. The observations are assumed to be Poissonian (photon counting). In this way we target an optimal sparse reconstruction of both phase and magnitude that takes into account all details of the observation formation. In contrast to standard variational approaches, we propose a vector optimization with two objective functions, leading to a decoupling of the inversion and denoising operations. The reconstruction is framed as a maximum-likelihood constrained nonlinear optimization problem. Simulations demonstrate that the proposed recursive algorithm is efficient and achieves higher accuracy and better imaging performance than the current state of the art.
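The sparsity-inducing mechanism at the heart of sparse coding can be illustrated with the soft-thresholding (shrinkage) operator, the proximal operator of the L1 penalty that appears as the building block of many sparse-approximation solvers. This is a generic sketch of that standard operator, not the algorithm proposed in the talk:

```python
def soft_threshold(x, lam):
    """Proximal operator of the L1 penalty lam*|x|: shrink the coefficient
    toward zero by lam, and set it exactly to zero if |x| <= lam.
    Zeroing small coefficients is what produces sparse representations."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def sparsify(coeffs, lam):
    """Apply the shrinkage to each entry of a coefficient vector."""
    return [soft_threshold(c, lam) for c in coeffs]
```

Applied to a coefficient vector, coefficients smaller in magnitude than the threshold are set exactly to zero, leaving only the few large coefficients that carry most of the signal.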
This talk will cover tools useful for carrying out online activities with teams located in different geographic locations. It will present tools for working in the cloud, especially for cases where we need to stay in contact with people based in different places.
With the advent of GPU accelerators, the landscape of High Performance Computing has started to change rapidly. While this is in principle good news, the increased compute power comes with a steep price tag in that new languages (CUDA, OpenCL) must be used. Recently Intel announced its own coprocessor technology, Many Integrated Cores (MIC), which will deliver competitive performance but will be programmed through familiar languages (Fortran, C/C++, and OpenMP). In my talk I will introduce Intel's MIC architecture and discuss the ongoing efforts at the Texas Advanced Computing Center to build a 10 PetaFlop cluster with MIC coprocessors in early 2013. Coprocessors (MIC) and accelerators (GPU) are here to stay, and the changing hardware will spur considerable changes in general software design. Astrophysics codes of all varieties (for example, highly parallel simulations, data-intensive software pipelines for large surveys, and even data-reduction software on desktops) will have to adapt to the new environment. I will discuss software design, performance considerations, and optimizations in general and specifically with respect to the MIC technology. In the second part of my talk I will introduce the software package ASSET (Advanced Spectral Synthesis 3D Tool), which allows for the fast and efficient calculation of spectra from 3D hydrodynamical models, and will highlight recent projects that have employed high-resolution (> 1,000,000), wide-range (thousands of Ångströms) synthetic spectra derived from 3D radiation transfer.
This lecture will address recent progress in modeling the emergence of cosmic structure at high redshifts, as well as new insights gained from numerical simulations into the processes relevant for star formation. Rapid magnetic-field growth in galaxies and the important role of proto-stellar outflows in regulating star formation up to pc scales are particularly highlighted.