The scientific and engineering communities are investing heavily in the generation of large voxel datasets. Procedures and standards for recording acquisition conditions, sample state, processing algorithms and imaging parameters are vital to maintaining long-term data value. Yet even when such metadata are conserved, the data themselves remain largely inaccessible to the wider community, hindering both commercial and research exploitation, an issue increasingly recognised by funding bodies.

Engaging the imaging communities in the use and development of standards-compliant schemas and ontologies is therefore essential. Tools for conserving substantial volumes of 3D data are then discussed, including cloud compute and storage opportunities. Policy and practical issues are considered in relation to institutional and disciplinary repository models for managing and exposing datasets. The real value will become apparent as semantic technologies mature, at which stage the ever-growing body of experimental data will offer new opportunities for knowledge discovery within and across disciplines.