Panel Session 5.4

Author: Johanna Casado
Affiliation: Instituto de Tecnologías en Detección y Astropartículas (CNEA, CONICET, UNSAM), Mendoza, Argentina.
Instituto de Bioingeniería, Facultad de Ingeniería, Universidad de Mendoza, Argentina.
Country: Argentina

Co-Author(s): Wanda Díaz-Merced
Affiliation: Office of Astronomy for Development (OAD – IAU), South Africa.
Office of Astronomy for Outreach (OAO), Japan.

Beatriz García
Affiliation: Instituto de Tecnologías en Detección y Astropartículas (CNEA, CONICET, UNSAM), Mendoza, Argentina.
Universidad Tecnológica Nacional, Argentina.
Country: Argentina

Title: Analysis of astronomical data through sonification: reaching more inclusion for visually impaired scientists.
Big data drives all-inclusive aspects of societal and human development. Nevertheless, it seems
that international efforts have been purposefully limited to the development of packages addressing
the visualisation of big data (http://hdr.undp.org/en/data-visualization-challenge-2019). This is the
case even though sound has been shown to be a good complement to the interpretation of complex
information when used as an adjunct to a visual display [1]. Human beings can combine different attention
modalities during vigilance tasks to perceive events that are by nature invisible or ambiguous to the eye.
In 2015, around 1.3 billion people had a vision impairment, of whom nearly 36 million were blind
(https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment). It is important
to note that some of these people acquire the disability after completing their studies, and current
tools do not allow them to continue working. A previous study showed that some of the programs available
to sonify large data sets and simultaneously display the plot are not accessible according to the
ISO 9241-171:2008 standard, concluding that one of the principal problems is that this software was not
user-centred from the beginning [2].
Taking into account the problem of accessibility of astronomical data and the growing volume of
data to analyse, the authors developed a human-centred interface [3] based on the stimulation of more
than one sense (especially audition) to complement current astronomical data display techniques and
provide greater accessibility to astronomical data. The program was tested with astronomical data
from different areas (optical spectra, light curves and cosmic-ray detections, among others). Finally,
the results showed that the tool allows the analysis of different data sets through sound and
visualisation, and that the final plots are equal to those provided by the databases or produced with
conventional tools.
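
To illustrate the kind of audio mapping involved, the following minimal Python sketch (an illustration
only, not sonoUno's actual implementation; all names and parameter values are assumptions) converts a
one-dimensional data series, such as a light curve, into sound by mapping each value to the pitch of a
short sine tone and writing the result to a WAV file.

import wave

import numpy as np

SAMPLE_RATE = 44100                  # audio sample rate in Hz
NOTE_DURATION = 0.1                  # seconds of sound per data point
FREQ_MIN, FREQ_MAX = 220.0, 880.0    # pitch range in Hz (A3 to A5)

def sonify(values, filename="sonification.wav"):
    """Map each data value to a short tone whose pitch tracks the value."""
    values = np.asarray(values, dtype=float)
    # Normalise the data to [0, 1] so it can be mapped onto the pitch range.
    span = values.max() - values.min()
    norm = (values - values.min()) / span if span > 0 else np.zeros_like(values)
    freqs = FREQ_MIN + norm * (FREQ_MAX - FREQ_MIN)

    t = np.linspace(0.0, NOTE_DURATION, int(SAMPLE_RATE * NOTE_DURATION), endpoint=False)
    # One short sine tone per data point, concatenated into a single track.
    signal = np.concatenate([np.sin(2.0 * np.pi * f * t) for f in freqs])
    samples = (signal * 32767).astype(np.int16)

    with wave.open(filename, "wb") as wav:
        wav.setnchannels(1)       # mono
        wav.setsampwidth(2)       # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(samples.tobytes())

if __name__ == "__main__":
    # Hypothetical light curve: constant flux with a dip, e.g. a transit.
    flux = np.concatenate([np.ones(20), 0.8 * np.ones(5), np.ones(20)])
    sonify(flux, "light_curve.wav")

A tool such as sonoUno combines this kind of audio mapping with a synchronised visual plot, so that
sighted and visually impaired users can explore the same data set together.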

References:
[1] Diaz-Merced, W. L. (2013). Sound for the Exploration of Space Physics Data. PhD Thesis,
University of Glasgow. Extracted from: http://theses.gla.ac.uk/5804/1/2014DiazMercedPHD.pdf
[2] Casado, J., Cancio, A., García, B., Diaz-Merced, W. L., Jaren, G. (2017). Sonification
Prototypes Review Based on Human-Centred Processes. XXI Congreso Argentino de
Bioingeniería – X Jornadas de Ingeniería Clínica, Universidad Nacional de Córdoba, Argentina.
[3] Casado, J., Carricondo Robino, J., Palma, A., Díaz-Merced, W., García, B., CONICET-Argentina,
& Universidad de Mendoza-Argentina (2019). sonoUno. GitHub, Inc. Extracted from:
https://github.com/sonoUnoTeam/sonoUno