The IllustrisTNG Simulations: Public Data Release

We present the full public release of all data from the TNG100 and TNG300 simulations of the IllustrisTNG project. IllustrisTNG is a suite of large volume, cosmological, gravo-magnetohydrodynamical simulations run with the moving-mesh code Arepo. TNG includes a comprehensive model for galaxy formation physics, and each TNG simulation self-consistently solves for the coupled evolution of dark matter, cosmic gas, luminous stars, and supermassive black holes from early times to the present day, z=0. Each of the flagship runs -- TNG50, TNG100, and TNG300 -- is accompanied by lower-resolution and dark-matter only counterparts, and we discuss scientific and numerical cautions and caveats relevant when using TNG. Full volume snapshots are available at 100 redshifts; halo and subhalo catalogs at each snapshot and merger trees are also released. The data volume now directly accessible online is ~750 TB, including 1200 full volume snapshots and ~80,000 high time-resolution subbox snapshots. This will increase to ~1.1 PB with the future release of TNG50. Data access and analysis examples are available in IDL, Python, and Matlab. We describe improvements and new functionality in the web-based API, including on-demand visualization and analysis of galaxies and halos, exploratory plotting of scaling relations and other relationships between galactic and halo properties, and a new JupyterLab interface. This provides an online, browser-based, near-native data analysis platform which supports user computation with fully local access to TNG data, alleviating the need to download large simulated datasets.


Introduction
Some of our most powerful tools for understanding the origin and evolution of large-scale cosmic structure and the galaxies which form therein are cosmological simulations. From pioneering beginnings (Davis et al., 1985; Press and Schechter, 1974), gravity-only dark matter simulations have evolved into cosmological hydrodynamical simulations (Katz et al., 1992). These aim to self-consistently model the coupled evolution of dark matter, gas, stars, and black holes at a minimum, and are now being extended to also include magnetic fields, radiation, cosmic rays, and other fundamental physical components. Such simulations provide foundational support in our understanding of the ΛCDM cosmological model, including the nature of both dark matter and dark energy.
Modern large-volume simulations now capture cosmological scales of tens to hundreds of comoving megaparsecs, while simultaneously resolving the internal structure of individual galaxies at 1 kpc scales. Recent examples reaching z = 0 include Illustris (Vogelsberger et al., 2014b), EAGLE (Schaye et al., 2015), Horizon-AGN (Dubois et al., 2014), Romulus (Tremmel et al., 2017), Simba (Davé et al., 2019), and Magneticum (Dolag et al., 2016), among others. These simulations produce observationally verifiable outcomes across a diverse range of astrophysical regimes, from the stellar and gaseous properties of galaxies, galaxy populations, and the supermassive black holes they host, to the expected distribution of molecular, neutral, and ionized gas tracers across interstellar, circumgalactic, and intergalactic scales, in addition to the expected distribution of the dark matter component itself.
Complementary efforts, although not the focus of this data release, include high redshift reionization-era simulations such as BlueTides (Feng et al., 2016), Sphinx (Rosdahl et al., 2018), and CoDa II, among others. In addition, 'zoom' simulation campaigns include NIHAO (Wang et al., 2015), FIRE-2 (Hopkins et al., 2018), and Auriga (Grand et al., 2017), in addition to many others. These have provided numerous additional insights into many questions in galaxy evolution (recent progress reviewed in Faucher-Giguère, 2018). For instance, reionization simulations may be able to include explicit radiative transfer, and zoom simulations may be able to reach higher resolutions and/or more rapidly explore model variations, in comparison to large cosmological volume simulations.
In order to inform theoretical models using observational constraints, as well as to interpret observational results using realistic cosmological models, public data dissemination from both observational and simulation campaigns is required. Observational data release has a successful history dating back at least to the SDSS SkyServer, which provides tools for remote users to query and acquire large datasets. The still-in-use approach is based on user-written SQL queries, which provide search results as well as data acquisition. From the theoretical community, the public data release of the Millennium simulation was the first attempt of similar scale. Modeled on the SDSS approach, data was stored in a relational database, with an immediately recognizable SQL-query interface (Lemson and Virgo Consortium, 2006). It has since been extended to include additional simulations, including Millennium-II (Boylan-Kolchin et al., 2009; Guo et al., 2011), and a first attempt at the idea of a "virtual observatory" (VO) was realized (Overzier et al., 2013). The Theoretical Astrophysical Observatory (TAO; Bernyk et al., 2014) was similarly focused around mock observations of simulated galaxy and galaxy survey data. Explorations continue on how to best deliver theoretical results within the existing VO framework (Lemson and Zuther, 2009; Lemson et al., 2014).
Other dark-matter only simulations have released data with similar approaches, including Bolshoi and MultiDark (CosmoSim; Klypin et al., 2011; Riebe et al., 2013), DEUS (Rasera et al., 2010), and MICE (Cosmohub; Crocce et al., 2010). In contrast, some recent simulation projects have made group catalogs and/or snapshots available for direct download, including MassiveBlack-II (Khandai et al., 2014), the Dark Sky simulation (Skillman et al., 2014), ν²GC (Makiya et al., 2016), and Abacus. Skies and Universes (Klypin et al., 2017) organizes a number of such data releases. With respect to Illustris, the most comparable in simulation type, data complexity, and scientific scope is the recent public data release of the EAGLE simulation, described in McAlpine et al. (2016) (see also Camps et al., 2018). The initial group catalog release was modeled on the Millennium database architecture, and the raw snapshot data was also subsequently made available (The EAGLE team, 2017). More recently, significant infrastructure research and development has focused on providing remote computational resources, including the NOAO Data Lab (Fitzpatrick et al., 2014) and the SciServer project (Medvedev et al., 2016; Raddick et al., 2017). Web-based orchestration projects also include Ragagnin et al. (2017), Tangos (Pontzen and Tremmel, 2018), and Jovial (Araya et al., 2018).
The public release of IllustrisTNG (hereafter, TNG) follows upon and further develops tools and ideas pioneered in the original Illustris data release. We offer direct online access to all snapshot, group catalog, merger tree, and supplementary data catalog files. In addition, we develop a web-based API which allows users to perform many common tasks without the need to download any full data files. These include searching over the group catalogs, extracting particle data from the snapshots, accessing individual merger trees, and requesting visualization and further data analysis functions. Extensive documentation and programmatic examples (in the IDL, Python, and Matlab languages) are provided.
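As a concrete illustration of the programmatic access described above, the sketch below composes an authenticated API request in Python. The base URL and 'api-key' header follow the online documentation, but the endpoint path and example search parameter shown are illustrative assumptions and should be checked against the current API reference.

```python
# Sketch of an authenticated API call (the endpoint path and search
# parameter below are illustrative; consult the online API reference).
import json
import urllib.request

BASE_URL = "http://www.tng-project.org/api/"

def build_request(path, api_key, params=None):
    """Compose the full URL and headers for an API call."""
    url = BASE_URL + path.lstrip("/")
    if params:
        query = "&".join(f"{k}={v}" for k, v in params.items())
        url = f"{url}?{query}"
    return url, {"api-key": api_key}

def get_json(path, api_key, params=None):
    """Perform the GET request and decode the JSON response."""
    url, headers = build_request(path, api_key, params)
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# e.g. list subhalos of TNG100-1 at snapshot 99 (z=0):
# data = get_json("TNG100-1/snapshots/99/subhalos/", api_key="YOUR_KEY",
#                 params={"limit": 5})
```

The same pattern covers group catalog searches, merger tree retrieval, and particle cutouts, which differ only in the endpoint path.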
This paper is intended primarily as an overview guide for TNG data users, describing updates and new features, while exhaustive documentation will be maintained online. In Section 2 we give an overview of the simulations. Section 3 describes the data products, and Section 4 discusses methods for data access. In Section 5 we present some scientific remarks and cautions, while in Section 6 we discuss community considerations including citation requests. Section 7 describes technical details related to the data release architecture, while Section 8 summarizes. Appendix A provides a few additional data details, while Appendix B shows several examples of how to use the API.

Figure 1
The three IllustrisTNG simulation volumes: TNG50, TNG100, and TNG300, shown here in projected dark matter density. In each case the name denotes the box side-length in comoving Mpc. The largest, TNG300, enables the study of rare, massive objects such as galaxy clusters, and provides unparalleled statistics of the galaxy population as a whole. TNG50, with a mass resolution more than one hundred times better, provides for the detailed examination of internal, structural properties and small-scale phenomena. In the middle, TNG100 uses the same initial conditions as the original Illustris simulation, providing a useful balance of resolution and volume for studying many aspects of galaxy evolution.

Description of the Simulations
IllustrisTNG is a suite of large volume, cosmological, gravo-magnetohydrodynamical simulations run with the moving-mesh code AREPO (Springel, 2010). The TNG project is made up of three simulation volumes: TNG50, TNG100, and TNG300. The first two simulations, TNG100 and TNG300, were recently introduced in a series of five presentation papers (Marinacci et al., 2018; Naiman et al., 2018; Nelson et al., 2018a; Pillepich et al., 2018a; Springel et al., 2018), and these are here publicly released in full. The third and final simulation of the project is TNG50 (Nelson et al., 2019b; Pillepich et al., 2019), which will also be publicly released in the future. TNG includes a comprehensive model for galaxy formation physics, which is able to realistically follow the formation and evolution of galaxies across cosmic time (Pillepich et al., 2018b; Weinberger et al., 2017). Each TNG simulation solves for the coupled evolution of dark matter, cosmic gas, luminous stars, and supermassive black holes from a starting redshift of z = 127 to the present day, z = 0.
The IllustrisTNG project [1] is the successor of the original Illustris simulation [2] (Sijacki et al., 2015; Vogelsberger et al., 2014a,b) and its associated galaxy formation model (Vogelsberger et al., 2013). Illustris was publicly released in its entirety roughly three and a half years ago (Nelson et al., 2015). TNG incorporates an updated 'next generation' galaxy formation model which includes new physics and numerical improvements, as well as refinements to the original model. TNG newly includes a treatment of cosmic magnetism, following the amplification and dynamical impact of magnetic fields, as described below.
The three flagship runs of IllustrisTNG are each accompanied by lower-resolution and dark-matter only counterparts. Three physical simulation box sizes are employed: cubic volumes of roughly 50, 100, and 300 Mpc side length, which we refer to as TNG50, TNG100, and TNG300, respectively. The three boxes complement each other by enabling investigations of various aspects of galaxy formation. The large physical volume of the largest simulation box (TNG300) enables, for instance, the study of galaxy clustering and the analysis of rare and massive objects such as galaxy clusters, and provides the largest statistical galaxy sample. In contrast, the smaller physical volume of TNG50 enables a mass resolution more than a hundred times better than in TNG300, providing a more detailed look at, for example, the structural properties of galaxies and small-scale gas phenomena in and around galaxies. Situated in the middle, the TNG100 simulation uses the same initial conditions (identical phases, adjusted for the updated cosmology) as the original Illustris simulation. This facilitates robust comparisons between the original Illustris results and the updated TNG model. For many galaxy evolution analyses, TNG100 provides an ideal balance of volume and resolution, particularly for intermediate-mass halos. Despite these strengths, each volume still has intrinsic physical and numerical limitations: for instance, TNG300 is still small compared to the scale of the BAO for precision cosmology, and lacks statistics for the most massive halos at ∼ 10^15 M⊙, while TNG50 is still too low-resolution to resolve ultra-faint dwarf galaxies with M⋆ ≲ 10^5 M⊙, globular clusters, or small-scale galactic features such as nuclear star clusters. We provide an overview and comparison between the specifications of all the TNG runs in Table 1.

Figure 2
Spatial resolution of the three high-resolution TNG simulations (TNG50, TNG100, TNG300) at z ∼ 0, shown as distributions of gas cell sizes. The dark regions of the distributions highlight star-forming gas inside galaxies, the corresponding median values marked by dark vertical dotted lines.

This data release includes the TNG100 and TNG300 simulations in full. It will, in the future, also include the final TNG50 simulation. For each, snapshots at all 100 available redshifts, halo and subhalo catalogs at each snapshot, and two distinct merger trees are released. This includes three resolution levels of the 100 and 300 Mpc volumes, and four resolution levels of the 50 Mpc volume, decreasing in steps of eight in mass resolution (two in spatial resolution) across levels. The highest resolution realizations, TNG50-1, TNG100-1, and TNG300-1, include 2×2160³, 2×1820³, and 2×2500³ resolution elements, respectively (see Table 1). As the actual spatial resolution of cosmological hydrodynamical simulations is highly adaptive, it is poorly captured by a single number. Figure 2 therefore shows the distribution of Voronoi gas cell sizes in these three simulations, highlighting the high spatial resolution in star-forming gas, i.e., within galaxies themselves. In contrast, the largest gas cells occur in the low-density intergalactic medium.
All ten of the baryonic runs invoke, without modification and invariant across box and resolution, the fiducial "full" galaxy formation physics model of TNG. All ten runs are accompanied by matched, dark matter only (i.e. gravity-only) analogs. In addition, there are multiple, high time-resolution "subboxes", with up to 8000 snapshots each and time spacing down to one million years.
This paper serves as the data release for IllustrisTNG as a whole, including the future public release of TNG50.

Physical Models and Numerical Methods
All of the TNG runs start from cosmologically motivated initial conditions, assuming a cosmology consistent with the Planck Collaboration (2016) results (Ω_Λ,0 = 0.6911, Ω_m,0 = 0.3089, Ω_b,0 = 0.0486, σ_8 = 0.8159, n_s = 0.9667, and h = 0.6774), with Newtonian self-gravity solved in an expanding Universe. All of the baryonic TNG runs include the following additional physical components: (1) Primordial and metal-line radiative cooling in the presence of an ionizing background radiation field which is redshift-dependent and spatially uniform, with additional self-shielding corrections.
(2) Stochastic star formation in dense ISM gas above a threshold density criterion. (3) Pressurization of the ISM due to unresolved supernovae using an effective equation of state model for the two-phase medium.
(4) Evolution of stellar populations, with associated chemical enrichment and mass loss (gas recycling), accounting for SN Ia/II, AGB stars, and NS-NS mergers.
(5) Stellar feedback: galactic-scale outflows with an energy-driven, kinetic wind scheme. (6) Seeding and growth of supermassive black holes. (7) Supermassive black hole feedback: accreting BHs release energy in two modes, at high accretion rates ('quasar' mode) and low accretion rates ('kinetic wind' mode). Radiative proximity effects from AGN affect nearby gas cooling.

Table 1
Table of physical and numerical parameters for each of the resolution levels of the three flagship TNG simulations. The physical parameters are: the box volume; the box side-length; the initial number of gas cells, dark matter particles, and Monte Carlo tracer particles; the target baryon mass; the dark matter particle mass; the z = 0 Plummer-equivalent gravitational softening of the collisionless component; the same value in comoving units; and the minimum comoving value of the adaptive gas gravitational softenings. Additional characterizations of the gas resolution, measured at redshift zero: the minimum physical gas cell radius, the median gas cell radius, the mean radius of star-forming (SFR > 0) gas cells, the mean hydrogen number density of star-forming gas cells, and the maximum hydrogen gas density.
(8) Magnetic fields: amplification of a small, primordial seed field and dynamical impact under the assumption of ideal MHD. For complete details on the behavior, implementation, parameter selection, and validation of these physical models, see the two TNG methods papers: Weinberger et al. (2017) and Pillepich et al. (2018b). Table 2 provides an abridged list of the key differences between Illustris and TNG. We note that the TNG model has been designed (i.e. 'calibrated', or 'tuned') to broadly reproduce several basic, observed galaxy properties and statistics. These are: the galaxy stellar mass function and the stellar-to-halo mass relation; the total gas mass content within the virial radius (r_500) of massive groups; the stellar mass - stellar size and the BH-galaxy mass relations, all at z = 0; in addition to the overall shape of the cosmic star formation rate density at z ≲ 10 (see Pillepich et al., 2018b, for a discussion).
The TNG simulations use the moving-mesh Arepo code (Springel, 2010), which solves the equations of continuum magnetohydrodynamics (MHD; Pakmor and Springel, 2013; Pakmor et al., 2011) coupled with self-gravity. The latter is computed with the Tree-PM approach, while the fluid dynamics employs a Godunov (finite-volume) type method, with a spatial discretization based on an unstructured, moving, Voronoi tessellation of the domain. The Voronoi mesh is generated from a set of control points which move with the local fluid velocity modulo mesh regularization corrections. Assuming ideal MHD, an 8-wave Powell cleaning scheme maintains the zero divergence constraint. The previous MUSCL-Hancock scheme has been replaced with a time integration approach following Heun's method, and the original Green-Gauss method for gradient estimation of primitive fluid quantities has been replaced with a least-squares method, obtaining second order convergence in the hydrodynamics. The long-range FFT calculation employs a new column-based MPI-parallel decomposition, while the gravity solver has been rewritten based on a recursive splitting of the N-body Hamiltonian into short- and long-timescale systems (as in Springel in prep.). The code is second order in space and, with hierarchical adaptive time-stepping, also second order in time. Of order 10 million individual timesteps are required to evolve the high-resolution runs to redshift zero. During the simulation we employ a Monte Carlo tracer particle scheme to follow the Lagrangian evolution of baryons. An on-the-fly cosmic shock finder is coupled to the code (Schaal and Springel, 2015; Schaal et al., 2016). Group catalogs are computed during the simulations using the FoF and Subfind (Springel et al., 2001) substructure identification algorithms.

Model Validation and Early Findings
TNG has been shown to produce observationally consistent results in several regimes beyond those adopted to calibrate the model. Some examples regarding galaxy populations, galactic structural, and stellar population properties include: the shapes and widths of the red sequence and blue cloud of SDSS galaxies; the shapes and normalizations of the galaxy stellar mass functions up to z ∼ 4; the spatial clustering of red vs. blue galaxies from tens of kpc to tens of Mpc separations; the spread in europium abundance of metal-poor stars in Milky Way-like halos; the emergence of a population of quenched galaxies both at low and high redshift (Habouzit et al., 2018); stellar sizes up to z ∼ 2, including separate star-forming and quiescent populations; the gas-phase mass-metallicity relation at z = 0 and its evolution; the dark matter fractions within the extended bodies of massive galaxies at z = 0 in comparison to e.g. SLUGGS results (Lovell et al., 2018); and the optical morphologies of galaxies in comparison to Pan-STARRS observations.
The IllustrisTNG model also reproduces a broad range of unusual galaxies, tracing tails of the galaxy population, including low surface brightness galaxies (Zhu et al., 2018) and jellyfish, ram-pressure stripped galaxies (Yun et al., 2018). The large volume of TNG300 helps demonstrate reasonable agreement in several galaxy cluster, intra-cluster, and circumgalactic medium properties: for example, the scaling relations between total radio power and the X-ray luminosity, total mass, and Sunyaev-Zel'dovich parameter of massive haloes; the distribution of metals in the intra-cluster plasma; the observed fraction of cool core clusters; and the OVI content of the circumgalactic media around galaxies from surveys at low redshift, including COS-Halos and eCGM.
IllustrisTNG is also producing novel insights on the formation and evolution of galaxies. For instance: halo mass alone is a good predictor for the entire stellar mass profile of massive galaxies; the metal enrichment of cluster galaxies is higher than that of field counterparts at fixed mass, and this enhancement is present pre-infall (Gupta et al., 2018); star-forming and quenched galaxies take distinct evolutionary pathways across the galaxy size-mass plane, and exhibit systematically different column densities of OVI ions and different magnetic-field strengths at fixed galaxy stellar mass, as well as different magnetic-field topologies. Galaxies oscillate around the star formation main sequence and the mass-metallicity relations over similar timescales and often in an anti-correlated fashion; the presence of jellyfish galaxies is signaled by large-scale bow shocks in their surrounding intra-cluster medium (Yun et al., 2018); baryonic processes affect the matter power spectrum across a range of scales and steepen the inner power-law total density profiles of early-type galaxies (Wang et al., 2018); and a significant number of OVII, OVIII, and NeIX absorption systems are expected to be detectable by future X-ray telescopes like ATHENA.
IllustrisTNG has also been used to generate mock 21-cm maps (Villaescusa-Navarro et al., 2018) and estimates of the molecular hydrogen budget in central and satellite galaxies, in the local as well as in the high-redshift Universe as probed by ALMA (Popping et al., 2019). Finally, TNG provides a test bed to explore future observational applications of machine learning techniques: for example, the use of Deep Neural Networks to estimate galaxy cluster masses from Chandra X-ray mock images (Ntampaka et al., 2018) or optical morphologies versus SDSS (Huertas-Company et al., 2019).
See the up-to-date list of results [3] for additional references. Please note that on this page we provide, and will continue to release, data files accompanying published papers as appropriate: for instance, electronic versions of tables, and data points from key lines and figures, to enable comparisons with other results. These are available via small [data] links next to each paper.

Breadth of Simulated Data
All of the observational validations and early results from TNG100 and TNG300 demonstrate the broad applications of the IllustrisTNG simulations. To give a sense of the expansive scope, the richness of the resulting data products, and the potential for wide applications across many areas of galaxy formation and cosmology, Figure 3 visualizes the TNG100 simulation at redshift zero. Each slice reveals a view into the synthetic IllustrisTNG universe. Together, they range from purely theoretical quantities to directly observable signatures, spanning the baryonic and non-baryonic matter components of the simulation: dark matter, gas, stars, and black holes.
[3] www.tng-project.org/results

The wealth of available information in the simulation outputs translates directly into the wide range of astrophysical phenomena which can be explored with the TNG simulations.

Data Products
We release all 100 snapshots of the IllustrisTNG cosmological volumes. These include up to five types of resolution elements (dark matter particles, gas cells, gas tracers, stellar and stellar wind particles, and supermassive black holes). The same volumes are available at multiple resolutions: high (-1 suffix, e.g. TNG100-1), intermediate (-2 suffix), and low (-3 suffix), always separated by a factor of two (eight) in spatial (mass) resolution. At each resolution, these 'baryonic' runs include the fiducial TNG model for galaxy formation physics. Each baryonic run is matched to its dark matter only analog (-Dark suffix).
For all runs, at every snapshot, two types of group catalogs are provided: friends-of-friends (FoF) halo catalogs, and Subfind subhalo catalogs. In postprocessing, these catalogs are used to generate two distinct merger trees, which are both released: SubLink, and LHaloTree. Finally, supplementary data catalogs containing additional computations and modeling, and focusing on a variety of topics, are being continually created and released. All these data types are described below.

Snapshot Organization
There are 100 snapshots stored for every run. These include all particles/cells in the whole volume. The complete snapshot listings, spacings, and redshifts can be found online. Note that, unlike in Illustris, TNG contains two different types of snapshots: 'full' and 'mini'. While both encompass the entire volume, 'mini' snapshots only have a subset of particle fields available (detailed online). In TNG, twenty snapshots are full, while the remaining 80 are mini. The 20 full snapshots are given in Table 3. Every snapshot is stored on-disk in a series of 'chunks', which are more manageable, smaller HDF5 files; additional details are provided in Table A.1 of the appendix.
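As one example of working with chunked snapshots, the following sketch enumerates the per-snapshot chunk files, assuming the Illustris-style directory layout (snapdir_XXX/snap_XXX.Y.hdf5); verify the layout of an actual download before relying on these names.

```python
# Sketch: enumerate the HDF5 chunk files of one full snapshot, assuming
# the Illustris-style naming convention snapdir_XXX/snap_XXX.Y.hdf5
# (check the actual on-disk layout of your download).
import os

def snapshot_chunk_paths(base_path, snap_num, num_chunks):
    """Ordered list of chunk file paths for one snapshot."""
    snap_dir = os.path.join(base_path, f"snapdir_{snap_num:03d}")
    return [os.path.join(snap_dir, f"snap_{snap_num:03d}.{i}.hdf5")
            for i in range(num_chunks)]

# A full-volume read then simply loops over the chunks, e.g. with h5py:
# import h5py
# for path in snapshot_chunk_paths("TNG100-1/output", 99, n_chunks):
#     with h5py.File(path, "r") as f:
#         coords = f["PartType1/Coordinates"][:]  # process chunk by chunk
```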
Note that, just as in Illustris, the snapshot data is not organized according to spatial position. Rather, particles within a snapshot are sorted based on their group/subgroup memberships, according to the FoF or Subfind algorithms. Within each particle type, the sort order is: group number, subgroup number, and then binding energy. Particles/cells belonging to the group but not to any of its subhalos ("inner fuzz") are included after the last subhalo of each group. In Figure 4 we show a schematic of the particle organization (as in Nelson et al., 2015), for one particle type. Note that halos may happen to be stored across multiple, subsequent file chunks, and different particle types of a halo are in general stored in different sets of file chunks.

Figure 3
Overview of the variety of physical information accessible in the different matter components of the TNG simulations. From top to bottom: dark matter density, gas density, gas velocity field, stellar mass density, gas temperature, gas-phase metallicity, shock Mach number, magnetic field strength, and X-ray luminosity. Each panel shows the same ∼ 110 × 14 × 37 Mpc volume of TNG100-1 at z = 0.

Snapshot Contents
Each HDF5 snapshot contains several groups: 'Header', 'Parameters', 'Configuration', and up to five 'PartTypeX' groups, one per particle type (DM-only runs have a single PartType1 group). The 'Header' group contains a number of attributes giving metadata about the simulation and snapshot. The 'Parameters' and 'Configuration' groups provide the complete set of run-time parameter and compile-time configuration options used to run TNG. That is, they encode the fiducial "TNG Galaxy Formation Model". Many will clearly map to Table 1 of Pillepich et al. (2018b), while others deal with more numerical/technical options. In the future, together with the release of the TNG initial conditions and the TNG code base, this will enable any of the TNG simulations to be reproduced.
The complete snapshot field listings of the 'PartTypeX' groups, including dimensions, units, and descriptions, are given online. The general system of units is kpc/h for lengths, 10^10 M⊙/h for masses, and km/s for velocities. Comoving quantities can be converted to the corresponding physical ones by multiplying by the appropriate power of the scale factor a. New fields in TNG, not previously available in the original Illustris, are specially highlighted. ...
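The length and mass conventions above can be encoded in a pair of small helper functions. This is a minimal sketch: the h value is that of the TNG cosmology, while velocity conventions, which vary per field, are deliberately omitted and should be checked against the online field listings.

```python
# Sketch of the code-unit conversions described above: lengths in
# comoving kpc/h, masses in 10^10 Msun/h. Scale factor a is the
# user's responsibility (a = 1 at z = 0).
HUBBLE_PARAM = 0.6774  # h, from the TNG (Planck 2016) cosmology

def length_to_physical_kpc(length_code, scale_factor, h=HUBBLE_PARAM):
    """Comoving kpc/h -> physical kpc: multiply by a, divide by h."""
    return length_code * scale_factor / h

def mass_to_msun(mass_code, h=HUBBLE_PARAM):
    """10^10 Msun/h -> Msun: multiply by 1e10, divide by h."""
    return mass_code * 1e10 / h

# e.g. a dark matter particle mass of 0.000505 in code units
# corresponds to roughly 7.5e6 Msun.
```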

Figure 4
Illustration of the organization of particle/cell data within a snapshot for one particle type (e.g. dark matter). Therein, particle order is set by a global sort of the following fields, in this order: FoF group number, Subfind subhalo number, binding energy. As a result, FoF halos are contiguous, although they can span file chunks. Subfind subhalos are only contiguous within a single group, being separated between groups by an "inner fuzz" of all FoF particles not bound to any subhalo. "Outer fuzz" particles outside all halos are placed at the end of each snapshot.
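Because particles of each type are contiguous per FoF group in this sort order, a group's slice in the snapshot arrays follows from the cumulative sum of per-group particle counts. The sketch below illustrates the idea with hypothetical length arrays; the function names are illustrative, and in practice precomputed offsets are the more robust route.

```python
# Illustration of the group-sorted ordering: a group's particles occupy
# a contiguous index range given by the cumulative sum of group lengths.
import numpy as np

def group_offsets(group_len):
    """Starting index of each group in the group-sorted particle arrays."""
    offsets = np.zeros(len(group_len), dtype=np.int64)
    offsets[1:] = np.cumsum(group_len[:-1])
    return offsets

def group_slice(group_len, i):
    """(start, stop) indices of the particles of FoF group i."""
    start = int(group_offsets(group_len)[i])
    return start, start + int(group_len[i])

# e.g. with per-group particle counts [100, 40, 7], group 1 occupies
# indices [100, 140) of the snapshot arrays (for that particle type).
```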
With respect to Illustris, the following new fields are generally available in the snapshots: (i) EnergyDissipation and Machnumber, giving the output of the on-the-fly shock finder; (ii) GFM_Metals, giving the individual element abundances of the nine tracked species (H, He, C, N, O, Ne, Mg, Si, Fe); (iii) GFM_MetalsTagged, metal tracking as described below; (iv) MagneticField and MagneticFieldDivergence, providing the primary result of the MHD solver.

Tagged Metals
The units of all the entries of the GFM_MetalsTagged field, except for NSNS, are the same as GFM_Metals: dimensionless mass ratios. Summing all elements of GFM_Metals heavier than helium recovers the sum of the three tags SNIa+SNII+AGB. Likewise, the Fe entry of GFM_Metals roughly equals the sum FeSNIa+FeSNII, modulo the small amount of iron consumed (i.e. a negative contribution) by AGB winds. The fields are (in order):
• SNIa (0): the total metals ejected by Type Ia SN.
• SNII (1): the total metals ejected by Type II SN.
• AGB (2): the total metals ejected by stellar winds, which is dominated by AGB stars.
• NSNS (3): the total mass ejected from NS-NS merger events, which are modeled stochastically (i.e. no fractional events) with a DTD scheme similar to that used for SNIa, except with a different τ value. Note that the units of NSNS are arbitrary. To obtain physical values in units of solar masses, this field must be multiplied by α/α_0, where α is the desired mass ejected per NS-NS merger event and α_0 is the base value (arbitrary) used in the simulation; e.g. Shen et al. (2015) take α = 0.05 M⊙. The value of α_0 varies by run: it is 0.05 for all TNG100 runs, and 5000.0 for all TNG300 and TNG50 runs. See Naiman et al. (2018) for more details.
• FeSNIa (4): the total iron ejected by Type Ia SN.
• FeSNII (5): the total iron ejected by Type II SN.

Table 5
Details of the subbox snapshots: the number of snapshots and approximate time resolution ∆t at three redshifts: z = 6, z = 2, and z = 0. Every subbox for a given volume and resolution combination has the same output times.
Note a somewhat subtle but fundamental detail: these tags do not isolate where a given heavy element was created, but rather identify the last star from which it was ejected. This can be problematic since, for example, AGB winds create little iron, but eject a significant amount of iron which was previously created by SNIa and SNII at earlier epochs. The FeSNIa field is, for example, more accurately described as 'the total iron ejected by Type Ia supernovae and not yet consumed and re-ejected by another star'.
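The NSNS rescaling described above amounts to a one-line conversion; the sketch below encodes the per-run α_0 values quoted in the text, with α left as a user choice (the function name and dictionary are illustrative).

```python
# Sketch of the NSNS conversion: physical ejecta mass in Msun equals
# the raw field value times alpha/alpha_0, using the per-run alpha_0
# quoted in the text (0.05 for TNG100 runs; 5000.0 for TNG300/TNG50).
ALPHA_0 = {"TNG100": 0.05, "TNG300": 5000.0, "TNG50": 5000.0}

def nsns_mass_msun(nsns_raw, run, alpha=0.05):
    """Raw NSNS tag -> solar masses, for a chosen mass ejected per
    NS-NS merger alpha (0.05 Msun follows Shen et al. 2015)."""
    return nsns_raw * alpha / ALPHA_0[run]

# For TNG100 with alpha = 0.05, the stored field is already in Msun:
# nsns_mass_msun(1.0, "TNG100") -> 1.0
```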

Subboxes
These are spatial cutouts of fixed comoving size and fixed comoving coordinates; their primary benefit is that their time resolution is significantly better than that of the main snapshots. Details are provided in Tables 4 and 5. These snapshots are useful for some types of analysis and science questions requiring high time-resolution data, and for creating time-evolving visualizations. There are two subboxes for TNG100 (corresponding to the original Illustris subboxes #0 and #2, the latter increased in size), and three subboxes for TNG50 and TNG300. Note that subboxes, unlike full boxes, are not periodic.
The subboxes sample different areas of the large boxes, roughly described by the environment column in Table 4. The particle fields are all identical to the main snapshots, except that the particles/cells are not sorted by their group membership, since no group catalogs exist for subbox snapshots.

Group Catalogs
Group catalogs give the results of substructure identification, and broadly contain two types of objects: dark matter halos (either FoF halos or central subhalos) and galaxies themselves (the inner stellar component of subhalos, either centrals or satellites). There is one group catalog produced for each snapshot, which includes both FoF and Subfind objects. The group files are split into a small number of sub-files, just as with the raw snapshots. In TNG, these files are called fof_subhalo_tab_*, whereas in original Illustris they were called groups_* (they are otherwise essentially identical). Every HDF5 group catalog contains the following groups: Header, Group, and Subhalo. The IDs of the member particles of each group/subgroup are not stored in the group catalog files. Instead, particles/cells in the snapshot files are ordered according to group membership.
In order to reduce confusion, we adopt the following terminology when referring to different types of objects. "Group", "FoF Group", and "FoF Halo" all refer to halos. "Subgroup", "Subhalo", and "Subfind Group" all refer to subhalos. The first (most massive) subgroup of each halo is the "Primary Subgroup" or "Central Subgroup". All other following subgroups within the same halo are "Secondary Subgroups", or "Satellite Subgroups".
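The central/satellite distinction above can be determined from two linking fields of the group catalogs, GroupFirstSub (the index of the first, most massive subhalo of each FoF group) and SubhaloGrNr (the parent group index of each subhalo). A minimal sketch with a synthetic toy catalog (the arrays below are illustrative, not real TNG data):

```python
import numpy as np

# A subhalo is a central ("Primary Subgroup") if its own index equals the
# GroupFirstSub entry of its parent FoF group; otherwise it is a satellite.

def central_flags(group_first_sub, subhalo_grnr):
    """Return a boolean array: True where a subhalo is the central of its group."""
    subhalo_ids = np.arange(len(subhalo_grnr))
    return group_first_sub[subhalo_grnr] == subhalo_ids

# Tiny synthetic catalog: group 0 owns subhalos 0,1,2 and group 1 owns 3,4.
GroupFirstSub = np.array([0, 3])
SubhaloGrNr = np.array([0, 0, 0, 1, 1])
flags = central_flags(GroupFirstSub, SubhaloGrNr)  # centrals: subhalos 0 and 3
```

The same vectorized comparison works unchanged on the full catalogs once the two fields are loaded.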
FoF Groups. The Group fields are derived with a standard friends-of-friends (FoF) algorithm with linking length b = 0.2. The FoF algorithm is run on the dark matter particles, and the other types (gas, stars, BHs) are attached to the same groups as their nearest DM particle. Subfind Groups. The Subhalo fields are derived with the Subfind algorithm. In identifying gravitationally bound substructures the method considers all particle types and assigns them to subhalos as appropriate.
Complete documentation for the TNG group catalogs, comprising FoF halos as well as Subfind subhalos, is available online. Differences and additions with respect to original Illustris are highlighted.

Merger Trees
Merger trees have been created for the TNG simulations using SubLink and LHaloTree. In the population-average sense the different merger trees give similar results; in more detail, the exact merger history or mass assembly history of any given halo may differ. For a particular science goal, one type of tree may be more or less useful, and users are free to use whichever they prefer. We generally recommend the SubLink trees as a first option, as they are more efficiently stored and accessible.
Trees can be 'walked', i.e. the descendants or progenitors of a given subhalo can be determined, thus linking objects across snapshots saved at different points in time. Main branches, such as the main progenitor branch (MPB), as well as full trees can be extracted. Examples of walking the tree are provided in the example scripts. For the technical details, algorithmic descriptions, and storage structures of the trees, please refer to Nelson et al. (2015) and the online documentation -we omit these details here.
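As an illustration of the tree walk described above, the main progenitor branch can be followed by repeatedly dereferencing first-progenitor pointers until a sentinel value (-1) terminates the chain. This is a schematic, stand-alone version operating on plain dictionaries; the real trees store such pointers as catalog fields, and the released example scripts handle the file layout.

```python
# Walk the main progenitor branch (MPB) of a SubLink-style tree: each node
# stores a FirstProgenitorID pointer, with -1 marking the end of the branch.

def main_progenitor_branch(tree, start_id):
    """Follow FirstProgenitorID links from start_id; return the chain of IDs."""
    branch = []
    node = start_id
    while node != -1:
        branch.append(node)
        node = tree[node]["FirstProgenitorID"]
    return branch

# Hypothetical three-snapshot history: subhalo 10 descends from 11, then 12.
tree = {
    10: {"FirstProgenitorID": 11},
    11: {"FirstProgenitorID": 12},
    12: {"FirstProgenitorID": -1},
}
mpb = main_progenitor_branch(tree, 10)
```

Walking descendants, or enumerating all progenitors of a merger, follows the same pointer-chasing pattern with the corresponding link fields.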

SubLink
The SubLink merger tree is one large data structure split across several sequential HDF5 files named tree_extended.[fileNum].hdf5, where [fileNum] runs e.g. from 0 to 19 for the TNG100-1 run, and from 0 to 125 for the TNG300-1 run.

LHaloTree
The LHaloTree merger tree is one large data structure split across several HDF5 files named trees_sf1_99.[chunkNum].hdf5, where TNG100-1 has for instance 80 chunks enumerated by [chunkNum], while TNG300-1 has 320. Within each file there are a number of HDF5 groups named "TreeX", each of which represents one disjoint merger tree.

Offsets Files
As described above, snapshot particle data is ordered by the subhalo each particle belongs to. To facilitate rapid loading of snapshot data, particle 'offset' numbers provide the location where particles belonging to each subhalo begin. Most simply, offsets describe where in the group catalog files to find a specific halo/subhalo, and where in the snapshot files to find the start of the particles of a given halo/subhalo.
To use the helper scripts (provided online) for working with the actual data files (snapshots or group catalogs) on a local machine, it is required to download the offset file(s) for the snapshot(s) of interest. The offsets are not required when using the web-based API or analyzing the particle cutouts it provides, for instance.
Note that in Illustris, offsets were embedded inside the group catalog files for convenience. In TNG, however, we have kept offsets as separate files called offsets_*.hdf5 (one per snapshot), which must be downloaded as well.
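Because snapshot particles are ordered by group membership, an offset plus a length fully locates the particles of a subhalo. A minimal sketch of this indexing with synthetic arrays (the field names and values are illustrative; the released scripts read the real offsets and the SubhaloLenType counts from file):

```python
import numpy as np

# The particles of subhalo i (of a given type) occupy the contiguous slice
# [offset[i], offset[i] + length[i]) of the snapshot-ordered arrays.

def subhalo_slice(snapshot_field, offsets, lengths, subhalo_id):
    """Extract the values belonging to one subhalo from a snapshot-ordered array."""
    start = offsets[subhalo_id]
    return snapshot_field[start:start + lengths[subhalo_id]]

# Synthetic snapshot array of 10 particle masses split over three subhalos.
masses = np.arange(10.0)
offsets = np.array([0, 4, 7])   # first particle index of each subhalo
lengths = np.array([4, 3, 3])   # particle count of each subhalo
sub1 = subhalo_slice(masses, offsets, lengths, 1)
```

This is why a single subhalo's particles can be read without scanning the full snapshot: the slice maps directly onto a contiguous read from the relevant file chunk(s).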

The 'simulation.hdf5' File
Each run has a single file called 'simulation.hdf5' which is purely optional, provided for convenience, and not required by any of the public scripts. Its purpose is to encapsulate all data of an entire simulation into a single file.
To accomplish this, we take advantage of a feature of the HDF5 library called "virtual datasets". A virtual dataset is a collection of symbolic links to one or more datasets in other HDF5 file(s), where these symlinks can refer to subsets of a dataset, in either the source or the target of the link. The simulation.hdf5 file is thus a large collection of links which refer to the other files that actually contain the data. In order to use it, the corresponding files must also be downloaded (e.g. snapshots, group catalogs, or supplementary data catalogs).
Using this resource, the division of snapshots and group catalogs over multiple file chunks is no longer relevant. Loading particle data from snapshots, or subhalo and halo fields from group catalogs, becomes a one-line operation. It also makes loading the particles of a given halo or subhalo using the offset information trivial. Finally, supplementary data catalogs (either those we provide, or similar user-run computations) can be 'virtually' inserted as datasets in snapshots or group catalogs. This provides a clean way to organize post-processing computations which produce additional values for halos, subhalos, or individual particles/cells. Such data can then be loaded with the same scripts (and same syntax) as 'original' snapshot/group catalog fields.
We refer to the online documentation for examples of each use case as well as technical requirements, namely a relatively new version (1.10+) of the HDF5 library.
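The virtual dataset mechanism itself can be demonstrated in a few lines with h5py (which wraps HDF5 1.10+). This is a minimal stand-alone sketch, not the actual layout of simulation.hdf5: two small "chunk" files are exposed as one contiguous virtual dataset without copying any data.

```python
import os
import tempfile

import h5py
import numpy as np

# Write two toy chunk files, each holding 5 values of a 'Masses' dataset.
tmp = tempfile.mkdtemp()
paths = [os.path.join(tmp, f"chunk{i}.hdf5") for i in range(2)]
for i, p in enumerate(paths):
    with h5py.File(p, "w") as f:
        f["Masses"] = np.full(5, float(i))  # chunk 0 holds 0.0, chunk 1 holds 1.0

# Build a virtual layout whose slices link to the datasets in the chunk files.
layout = h5py.VirtualLayout(shape=(10,), dtype="f8")
for i, p in enumerate(paths):
    layout[i * 5:(i + 1) * 5] = h5py.VirtualSource(p, "Masses", shape=(5,))

vfile = os.path.join(tmp, "combined.hdf5")
with h5py.File(vfile, "w") as f:
    f.create_virtual_dataset("Masses", layout)

# Reading the virtual dataset transparently reads through to the chunk files.
with h5py.File(vfile, "r") as f:
    combined = f["Masses"][:]
```

If the chunk files are absent, reads of the virtual dataset return a fill value rather than failing, which is why simulation.hdf5 remains usable even when only a subset of the underlying files has been downloaded.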

Initial Conditions
We provide as part of this release the initial conditions for all TNG volumes as well as the original Illustris volumes. These were created with the Zeldovich approximation and the N-GenIC code (Springel, 2015). Each particular realization was chosen from among tens of random realizations of the same volume as the most average, based on inspection of the z = 0 power spectrum and/or dark matter halo mass function -- see Vogelsberger et al. (2014b) and Pillepich et al. (in prep) for details. Each IC is a single HDF5 file with self-evident structure: the coordinates, velocities, and IDs of the set of total matter particles at z = 127, the starting redshift for all runs. These ICs were used as-is for dark-matter only simulations, while for baryonic runs total matter particles were split upon initialization in the usual way, into dark matter and gas, according to the cosmic baryon fraction and offsetting in space by half the mean interparticle spacing. These ICs can be run by e.g. Gadget or Arepo as-is, or easily converted into other data formats.
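The mass bookkeeping of the split into dark matter and gas follows directly from the cosmic baryon fraction. A schematic sketch (the Ω values below are illustrative round numbers close to a Planck-like cosmology, not the exact TNG parameters):

```python
# Split a total-matter IC particle into a dark matter particle and a gas cell
# according to the cosmic baryon fraction f_b = Omega_b / Omega_m, conserving
# total mass. Units are arbitrary (whatever the IC mass unit is).

def split_total_matter_mass(m_total, omega_b, omega_m):
    """Return (m_dm, m_gas) such that m_dm + m_gas == m_total."""
    f_baryon = omega_b / omega_m
    m_gas = m_total * f_baryon
    return m_total - m_gas, m_gas

m_dm, m_gas = split_total_matter_mass(1.0, omega_b=0.0486, omega_m=0.3089)
```

The accompanying half-interparticle-spacing offset (applied to the positions, not sketched here) prevents the newly created gas and dark matter particles from coinciding spatially.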

Supplementary Data Catalogs
Many additional data products have been computed in post-processing, based on the raw simulation outputs. These are typically produced in support of the specific projects and analysis of a published paper, after which the authors make the underlying data catalog public. Many such catalogs have been made available for the original Illustris simulation, and the majority of these will also be recalculated for TNG. We provide a list of TNG supplementary data catalogs which are now available or which we anticipate releasing in the near future:
(A) Tracer Tracks - time-evolution of Monte Carlo tracer properties for TNG100 (Nelson et al. in prep).
(B) Stellar Masses, Star Formation Rates - multi-aperture and resolution-corrected masses, time-averaged SFRs.
(C) Stellar Circularities, Angular Momenta, and Axis Ratios - for the stellar components of galaxies, as for Illustris.
(D) Subhalo Matching Between Runs - cross-matching subhalos between baryonic and dark-matter only runs, between runs at different resolutions, and between TNG100 and Illustris (Lovell et al., 2018; Nelson et al., 2015; Rodriguez-Gomez et al., 2015).
(E) Stellar Projected Sizes - half-light radii of TNG100 galaxies.
(F) Blackhole Mergers and Details - records of BH-BH mergers and high time-resolution BH details, as for Illustris (Blecha et al., 2016; Kelley et al., 2017), and with an updated approach (Katz et al. in prep).
(G) Projections, up to all-sky, across the different matter components - to facilitate lensing, x-ray, Sunyaev-Zeldovich, and related explorations (Giocoli et al. in prep).
Several of these were previously available for the original Illustris simulation and will be re-computed for TNG. We also plan to provide a number of 'pre-defined' galaxy samples, particularly with respect to common observational selection techniques, current and/or upcoming surveys, and other distinct classes of interest.
This can include, for example, red versus blue galaxies, luminous red galaxies (LRGs) and emission-line galaxies (ELGs) of SDSS, damped Lyman-alpha (DLA) host halos, and ultra-diffuse or low surface brightness (LSB) galaxies. Such samples would facilitate rapid comparisons to certain types of observational samples, and can be included as supplementary data catalogs as they become available.

Data Access
There are three complementary ways to access and analyze TNG data products.
1. (Local data, local analysis) Raw files can be directly downloaded, and example scripts are provided as a starting point for local analysis.
2. (Remote data, local analysis) The web-based API can be used, either through a web browser or programmatically in a script, to perform search, data extraction, and visualization tasks, followed by local analysis of these derivative products.
3. (Remote data, remote analysis) A web-based JupyterLab (or Jupyter notebook) session can be instantiated to explore the data, develop analysis scripts with persistent storage, run data-intensive and compute-intensive tasks, and make final plots for publication.
These different approaches can be combined. For example, one can download the full redshift zero group catalog to perform a complex search which cannot easily be done with the API. After determining a sample of interesting galaxies (i.e. a set of subhalo IDs), one can then extract their individual merger trees (and/or raw particle data) without needing to download the full simulation merger tree (or a full snapshot).
These approaches are described below, while "getting started" tutorials for several languages (currently: Python, IDL, and Matlab) can be found online.

Direct File Download and Example Scripts
Local data, local analysis. All of the primary outputs of the TNG simulations are released in HDF5 format, which we use universally for all data products. This is a portable, self-describing, binary specification (similar to FITS), suitable for large numerical datasets. File access libraries, routines, and examples are available in all common computing languages. We typically use only basic features of the format: attributes for metadata, groups for organization, and large datasets containing one- and two-dimensional numeric arrays. To maintain reasonable file sizes for transfer, most outputs are split across multiple files called "chunks". For example, each snapshot of TNG100-1 is split into 448 sequentially numbered chunks. Links to the individual file chunks for a given simulation snapshot or group catalog are available under their respective pages on the main data release page.
The provided example scripts (in IDL, Python, and Matlab) give basic I/O functionality. We expect they will serve as a useful starting point for writing any analysis task, and intend them as 'minimal working examples' which are short and simple enough to be quickly understood and extended. For a getting-started guide and reference, see the online documentation.

Web-based API
Remote data, local analysis. For TNG we enhance the web-based interface (API) introduced with the original Illustris simulation, augmenting it with a number of new features and more sophisticated functionality. At its core, the API can respond to a variety of user requests and queries. It provides a well-defined interface between the user and the simulation data, and the tools it provides are independent, as much as possible, of any underlying details of data structure, heterogeneity, storage format, and so on. The API can be used as an alternative to downloading large data files for local analysis. Fundamentally, the API allows a user to search, extract, visualize, or analyze a simulation, a snapshot, a group catalog, or a particular galaxy/halo. By way of example, the following requests can be handled by the current API:
• Search across subhalos with numeric range(s) over any field(s) present in the Subfind catalogs.
• Retrieve a snapshot cutout for all the particles/cells within a given subhalo/halo, optionally restricted to a subset of specified particle/cell type(s) and field(s).
• Retrieve the complete merger history or main branches for a given subhalo.
• Download subsets of snapshot files, containing only specified particle/cell type(s), and/or specific field(s) for each type.
• Traverse links between halos and subhalos, for instance from a satellite galaxy to its parent halo, or from a halo to its central subhalo.
The IllustrisTNG data access API is available at the following permanent URL: http://www.tng-project.org/api/
For a getting-started guide for the API, as well as a cookbook of common examples and the complete reference, see the online documentation.
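A programmatic query is an ordinary HTTP request carrying the user's key in an 'api-key' header, with search filters as query parameters. The sketch below only constructs the request rather than sending it; the endpoint path and the mass filter are illustrative examples, and the key value is a placeholder.

```python
import urllib.parse

# Build (but do not send) a TNG API request: a subhalo search on one
# snapshot, filtered by a numeric range on a catalog field.

BASE_URL = "http://www.tng-project.org/api/"

def build_request(path, api_key, **filters):
    """Return (url, headers) for an API query such as a subhalo search."""
    url = urllib.parse.urljoin(BASE_URL, path)
    if filters:
        url += "?" + urllib.parse.urlencode(sorted(filters.items()))
    return url, {"api-key": api_key}

# e.g. TNG100-1 z=0 subhalos above a chosen total mass (catalog units):
url, headers = build_request("TNG100-1/snapshots/99/subhalos/",
                             api_key="your-key-here", mass__gt=100.0)
# An HTTP library such as requests would then issue:
#   requests.get(url, headers=headers)
```

Responses are JSON (or file downloads for cutouts), so the same pattern composes naturally into scripts that search, then fetch trees or particle cutouts for the matching objects.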

Remote Data Analysis
Remote data, remote analysis. Coincident with the TNG public data release we introduce a new, third option for working with and analyzing large simulation datasets. Namely, an online, browser-based scientific computing environment which enables researchers' computations to "be brought to" the data. It is similar in spirit to the NOAO Data Lab (Fitzpatrick et al., 2014) and SciServer services (Raddick et al., 2017), i.e. simultaneously hosting petabyte-scale datasets as well as a full-featured analysis platform and toolset. This alleviates the need to download any data, or run any calculations locally, thereby facilitating broad, universal, open access to large cosmological simulation datasets such as TNG.
To enable this functionality we make use of the extensive development of Jupyter and JupyterLab over the last few years. JupyterLab is the evolution of the Jupyter Notebook (Kluyver et al., 2016), previously called IPython (Pérez and Granger, 2007). It is a next-generation, web-based user interface suitable for scientific data analysis. In addition to the previous 'notebook' format, JupyterLab also enables a traditional workflow based around a collection of scripts on a filesystem, text editors, a console, and command-line execution. It provides an experience nearly indistinguishable from working directly on a remote computing cluster via SSH.
Computation is language agnostic, as 'kernels' are supported in all common languages, including Python 2.x, Python 3.x, IDL, Matlab, R, and Julia. Development, visualization, and analysis are possible in practically any language or toolset available within a Linux environment, although we focus at present on Python 3.x support.
Practically, this service enables direct access to one of the complete mirrors of the Illustris[TNG] data, which is hosted at the Max Planck Computing and Data Facility (MPCDF) in Germany. Users can request a new, on-demand JupyterLab instance, which is launched on a system at MPCDF and connected to the user's web browser. All Illustris[TNG] data is then directly available for analysis. A small amount of persistent user storage is provided, so that under-development scripts, intermediate analysis outputs, and in-progress figures for publication all persist across sessions. Users can log out and pick up later where they left off. A base computing environment is provided, which can be customized as needed (e.g. by installing new Python packages with either pip or conda). Users can synchronize their pre-existing tools, such as analysis scripts, with standard approaches (git, hg, rsync) or via the JupyterLab interface. Results, such as figures or data files, can be viewed in the browser or downloaded back to the client machine with the same tools.
For security and resource allocation, users must specifically request access to the JupyterLab TNG service. At present we anticipate providing this service on an experimental (beta) basis, and only to active academic users.

Subhalo Search Form
We provide the same, simple search form to query the subhalo database as was available in the Illustris data release. It exposes the search capabilities of the API in a user-friendly interface, enabling quick exploration without the need to write a script or URL by hand. As examples, objects can be selected based on total mass, stellar mass, star formation rate, or gas metallicity. The tabular output lists the subhalos which match the search, along with their properties. In addition, each result contains links to a common set of API endpoints and web-based tools for inspection and visualization.

Explore: 2D and 3D
The 2D Explorer and 3D Explorer interfaces are experiments in the interactive visualization and exploration of large data sets such as those generated by the IllustrisTNG simulations. They both leverage the approach of thin-client interaction with derived data products. The 2D Explorer exposes a Google Maps-like tile viewer of pre-computed imagery from a slice of the TNG300-1 simulation at redshift zero, similar to the original Illustris explorer. Multiple views of different particle types (gas, stars, dark matter, and blackholes) can be toggled and overlaid, which is particularly useful in exploring the spatial relationships between different phenomena of these four matter components.
The 3D Explorer introduces a new interface, showing a highly derivative (although spatially complete) view of an entire snapshot. That is, instead of particle-level information, we facilitate interactive exploration of the group catalog output in three-dimensional space. This allows users to rotate, zoom, and move around the cubic box representing the simulation domain, where the largest dark matter halos are represented by wireframe spheres of size equal to their virial radii, while the remaining smaller halos are represented by points. User selection of a particular halo, via on-click ray cast and sphere intersection testing, launches an API query and returns the relevant halo information and further introspection links. At present, both Explorers remain largely proof of concept interfaces for how tighter integration of numeric, tabular, and visual data analysis components may be combined in the future for the effective exploration and analysis of large cosmological datasets (see also Dykes et al., 2018, and the Dark Sky simulation).

Merger Tree Visualization
In the Illustris data release we demonstrated a rich-client application built on top of the API, in the form of an interactive visualization of merger trees. The tree is vector-based and rendered client side, so each node can be interacted with individually. The informational popup provides a link, back into the API, where the details of the selected progenitor subhalo can be interrogated. This functionality is likewise available for all new simulations. Furthermore, we have added a new, static visualization of the complete merger tree of a subhalo. This allows a quick overview of the assembly history of a given object, particularly its past merger events and its path towards quiescence. In the fiducial configuration, node size in the tree is scaled with the logarithm of total halo mass, while color is mapped to instantaneous sSFR.

Plot Group Catalog
The first significant new feature of the API for the TNG public data release is a plotting routine to examine the group catalogs. Since the objects in the catalogs are either galaxies or dark matter halos, plotting the relationships among their various quantities is one of the most fundamental explorations of cosmological simulations. Classically observed scaling relations, such as Tully-Fisher (rotation velocity vs. stellar mass), Faber-Jackson (stellar velocity dispersion vs. luminosity), the stellar size-mass relation, the star-formation main sequence, or the Magorrian relation (blackhole mass vs. bulge mass) are all available herein. Such relations can be used to assess the outcome of the simulations by comparison to observational data. More complex relations, those involving currently unobserved properties of galaxies/halos, and/or those only observed with very limited statistics or over limited dynamic range, represent a powerful discovery space and predictive regime for simulations such as TNG. At the level of the galaxy (or halo) population, i.e. with tens to hundreds of thousands of simulated objects, many such relationships reveal details of the process of galaxy formation and evolution, as well as the working mechanisms of the physical/numerical models.
The 'group catalog plotter' is an API endpoint which returns publication-quality figures (e.g. PNG or PDF outputs). In Figure 5 we show several examples of its output, taken from TNG300-1 and TNG100-1 at z = 0. Many options exist to control the behavior and structure of the plots, all of which are detailed in the online documentation. As for the subhalo search form, we also provide a new web-based interface to assist in interactively constructing plots from this service. Fundamentally, the quantities to be plotted against each other on the x- and y-axes can be selected. A two-dimensional histogram showing the density of subhalos in this space is then overlaid with the median relation and bounding percentiles. Optionally, a third quantity can be added, which is then used to color every pixel in the histogram according to a user-defined statistic (e.g. the median) of all the objects in that bin. For example, plotting the stellar-mass halo-mass relation, colored by stellar metallicity, reveals one reason for the scatter in this relation. This third quantity can optionally be normalized relative to the median value at each x-axis value (e.g. as a function of stellar mass), highlighting the 'relative' deviation of that property compared to its evolving median value. The types of subhalos included can be chosen, for example selecting only centrals or only satellite galaxies, and the subhalos to be included can be filtered based on numeric range selections on a fourth quantity. We expect that this tool will enable rapid, initial exploration of interesting relationships among galaxy (or halo) integral properties, and serve as a starting point for more in-depth analysis (see also de Souza and Ciardi, 2015). Complete usage documentation is available online.
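The median-and-percentiles overlay described above is straightforward to reproduce locally once catalog fields are in hand. A minimal sketch with synthetic data (all names are generic; this is an analogue of the plotter's overlay, not its actual implementation):

```python
import numpy as np

# Running median and bounding (16th/84th) percentiles of y in bins of x.

def binned_median(x, y, bins):
    """Return (bin_centers, median, p16, p84) of y in each bin of x."""
    idx = np.digitize(x, bins)
    centers = 0.5 * (bins[:-1] + bins[1:])
    med, lo, hi = [], [], []
    for i in range(1, len(bins)):
        sel = y[idx == i]
        med.append(np.median(sel))
        lo.append(np.percentile(sel, 16))
        hi.append(np.percentile(sel, 84))
    return centers, np.array(med), np.array(lo), np.array(hi)

# Synthetic 'scaling relation': y = 2x plus small scatter.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 1000)
y = 2.0 * x + rng.normal(0.0, 0.01, 1000)
centers, med, p16, p84 = binned_median(x, y, np.linspace(0.0, 1.0, 6))
```

For real catalogs the same computation is typically done in logarithmic bins of mass, with the percentile band drawn as a shaded region around the median curve.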
Figure 5 Four examples of exploratory plots for common scaling relations, galaxy trends, and other relationships between properties of the objects in the group catalogs, galaxies and halos, for TNG300-1 and TNG100-1 at z = 0. Made using the web-based API functionality.

Visualize Galaxies and Halos
The second significant new feature of the API for the TNG public data release is an on-demand visualization service. Primarily, this API endpoint renders projections of particle-level quantities (of gas cells, dark matter particles, or stellar particles) for a given subhalo or halo. For example, it can produce gas column density projections, gas temperature projections, stellar line-of-sight velocity maps, or dark matter velocity dispersion maps. Its main rendering algorithm is based on the standard SPH kernel projection technique, with adaptive kernel sizes for all particle types, although alternatives are available. In Figure 6 we show several examples of output, at both the halo scale (circle indicating the virial radius), and the galaxy scale (outer circle showing twice the stellar half mass radius). The visualization service returns publication-quality figures (e.g. PNG or PDF outputs). It can also return the raw data used to construct any image, in scientifically accurate units (HDF5 output). For instance, a user can request not only an image of the gas density projection of an ongoing galaxy merger, but also the actual grid of density values in units of e.g. M⊙ kpc⁻². Numerous options exist to control the behavior of the rendered projections, as well as the output style, all of which are detailed in the online documentation. All parameters of the rendering can be specified; as an example, the view direction can be rotated into face-on or edge-on orientations. Most properties available in the snapshots can be visualized, for any galaxy/halo, at any snapshot, for any run.
Beyond snapshot-level information, the visualization service currently has two more advanced features. First, it is coupled to the CLOUDY photoionization code (Ferland et al., 2017), following Nelson et al. (2018b). This enables ionic abundance calculations for gas cells on the fly. For example, a user can request a column density map of the O VI or C IV ions. All relevant atoms are supported, assuming solar abundances for non-tracked elements, typically up to the tenth ionization state (Al, Ar, Be, B, Ca, C, Cl, Cr, Co, Cu, F, He, H, Fe, Li, Mg, Mn, Ne, Ni, N, O, P, K, Sc, Si, Na, S, Ti, V, Zn). Emission line luminosities are also available: a surface brightness map of metal-line emission from O VIII at 22.1012 Å, for example. Second, this service is also coupled to the FSPS stellar population synthesis code (Conroy and Gunn, 2010; Conroy et al., 2009) through python-fsps (Foreman-Mackey et al., 2014), following Nelson et al. (2018a). This enables emergent stellar light calculations for stellar population particles on the fly, with optional treatments of dust attenuation effects. For example, a user can request a map of stellar surface brightness, or luminosity, either rest-frame or observed-frame, for any of the ∼140 available bands, including common filters of surveys/instruments such as SDSS, DES, HST, and JWST.
Figure 6 Example of halo-scale and galaxy-scale visualizations from TNG300-1 and TNG100-1, made using the web-based API functionality, viewing the dark matter, gas, and stars. The top eight panels show the 20th most massive halo of TNG300-1 at z = 0 (circle indicating the virial radius). The bottom eight panels show face-on and edge-on views of subhalo 468590 of TNG100-1 at z = 0 (circles indicating the stellar half mass radius and twice that value).
We expect that this tool will enable rapid, initial exploration of many interesting facets of galaxies and halos across the simulations, and serve as a starting point for more in-depth analysis. We caution that, used improperly, this tool can easily return nonsensical results (e.g. requesting OI emission properties from ISM gas), and users should understand the relevant scientific limitations. In this particular case, we refer to the effective two-phase ISM model used in TNG (Springel and Hernquist, 2003) which intentionally avoids resolving the cold, dense phases of the ISM. Complete usage documentation is available online.

Scientific Remarks and Cautions
In the original Illustris simulation we identified a number of non-trivial issues in the simulated galaxy and halo populations in comparison to observational constraints (see Nelson et al., 2015, for a summary). These disagreements motivated a series of important caveats against drawing certain strong scientific conclusions in a number of regimes.
In contrast, our initial explorations of TNG (specifically, of the TNG100-1 and TNG300-1 simulations) have revealed no comparably significant tensions with respect to observable comparisons. With this data release we invite further detailed observational comparisons and scrutiny. The TNG simulations have been shown to realistically resolve numerous aspects of galactic structure and evolution, including many internal properties of galaxies (though, clearly, not all) as well as their co-evolution within the cosmic web of large-scale structure (see Section 2.2). TNG reproduces various observational details and scaling relations of the demographics and properties of the galaxy population, not only at the present epoch (z = 0), but also at earlier times (see likewise Section 2.2). This has been achieved with a physically plausible although necessarily simplified galaxy formation model. The TNG model is intended to account for most, if not all, of the primary processes that are believed to be important for the formation and evolution of galaxies.

IllustrisTNG: Possible Observational Tensions
We therefore do not specifically caution against the use of TNG in any of the regimes where the original Illustris simulation was found to be less robust. However, the enormous spatial and temporal dynamic range of these simulations, as well as the multi-scale, multiphysics nature of the complex physical phenomena involved, implies modeling approximations and uncertainties. Early comparisons of TNG against observations have identified a number of interesting regimes in which possible tensions exist.
Our ability to make any stronger statement is frequently limited by the complexity of the observational comparison, i.e. the need to reproduce (or 'mock') the observational measurement closely and with care. In the qualitative sense, however, these regimes may plausibly indicate areas where the TNG model has shortcomings or is less physically realistic. It will be helpful for any user of the public data to be aware of these points, which should be carefully considered when advancing strong scientific conclusions or making claims based on observational comparisons. Possible tensions of interest include the following:
(I) The simulated stars in Milky Way-like galaxies are too alpha-enhanced in comparison to observations of the Milky Way.
(II) The Eddington ratio distributions of massive blackholes (> 10⁹ M⊙) at z = 0 are dominated in TNG by low accretion rates, generically far below the Eddington limit; recent observations favor at least some fraction of higher accretion rate massive blackholes. This is reflected in a steeper hard X-ray AGN luminosity function at 1 ≲ z ≲ 4 (Habouzit et al., 2018).
(III) TNG galaxies may have a weaker connection between galaxy morphology and color than observed at z ∼ 0, reflected in a possible excess of red disk-like galaxies in the simulations (Rodriguez-Gomez et al., 2019), although see Tacchella et al. (in prep).
(IV) TNG galaxies exhibit a somewhat sharper trend than observed in quenched fraction vs. galaxy stellar mass for M ∈ 10^(10-11) M⊙, and similarly in the relation between sSFR and M_BH at low redshift (Terrazas et al. in prep).
(V) The locus of the galaxy star-forming main sequence is below the face-value observed SFMS at 1 ≲ z ≲ 2, modulo known inconsistencies with e.g. the observed stellar mass function.
(VI) Similarly, the H2 mass content of massive TNG galaxies at z = 1−3 may be lower than implied by ALMA observations (Popping et al., 2019) and sub-mm galaxy demographics (Hayward et al. in prep).
(VII) The DM fractions within massive elliptical galaxies at z = 0 are consistent with observations at large galactocentric distances, but may be too high within their effective radii (Lovell et al., 2018), and likewise are tentatively higher than values inferred from observations of massive z = 2 star forming galaxies (Lovell et al., 2018, and Übler et al. in prep).
With respect to points (III)−(IV) there is, in general, an interesting transitional mass regime (galaxy stellar mass ∼ 10^10.5 M_⊙) where central blue vs. red galaxies, or star-forming vs. quiescent galaxies, co-exist: this reflects the effective quenching mechanism of the TNG model based on SMBH feedback, but precisely how such transitional galaxies also differ in other structural and kinematical properties still requires careful examination and consideration.
Note that for the items in this list we have not included more specific quantification of the observed tension (i.e. χ² or fractional deviation values); the referenced papers contain more discussion. On the one hand, not all observational results agree with each other, making quantitative statements necessarily partial; nor are observational statements about different galaxy properties necessarily consistent with one another, especially across cosmic time. On the other hand, great care is often necessary to properly map simulated variables onto observationally-derived quantities.

Numerical Considerations and Issues
To better inform which features of the simulations are robust when making science conclusions, we note the following points related to numerical considerations: 1. SubhaloFlag. Not all objects in the Subfind group catalogs should be considered 'galaxies'. In particular, not all satellite subhalos have a cosmological origin, in the sense that they may not have formed due to the process of structure formation and collapse. Rather, some satellite subhalos will represent fragments or clumps produced through baryonic processes (e.g. disk instabilities) in already formed galaxies, and the Subfind algorithm cannot a priori differentiate between these two cases. Such non-cosmological objects are typically low mass, small in size, and baryon dominated (i.e. with little or no dark matter), residing at small galactocentric distances from their host halos, preferentially at late times (z < 1). These objects may appear as outliers in scatter plots of typical galaxy scaling relations, and should be considered with care.
We have added a SubhaloFlag field to the group catalogs to assist in their identification, constructed as follows. First, we used a variant of the SubLink merger tree which tracks baryonic, rather than dark matter, particles, namely star-forming gas cells and stars. The algorithm is otherwise the same, with the same weighting scheme for determining descendants/progenitors, except that this "SubLinkGal" tree allows us to track subhalos which contain little or no dark matter.
Then, we flag a subhalo as non-cosmological if all the following criteria are true: (i) the subhalo is a satellite at its time of formation, (ii) it forms within 1.0 virial radii of its parent halo, and (iii) its dark matter fraction, defined as the ratio of dark matter mass to total subhalo mass, at the time of formation of the subhalo, is less than 0.8.
These are relatively conservative choices, implying a low false-positive rate. On the other hand, some spurious subhalos may not be flagged under this definition. A much more aggressive criterion would be to flag a subhalo if its instantaneous dark matter fraction is low, e.g. less than 10% (as used in e.g. Genel et al., 2018; Pillepich et al., 2018b). Such a selection results in a purer sample, with fewer contaminating subhalos, but also excludes more genuine galaxies, such as those which have undergone extensive (i.e. physical) stripping of their dark matter component during infall. We encourage users to apply the provided SubhaloFlag values by default, but to carefully consider the implications and details, particularly for analyses focused on satellite galaxy populations or dark-matter-deficient systems.
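As a concrete illustration, the three flagging criteria can be expressed in a few lines of Python. This is a minimal sketch of the logic described above, not the collaboration's actual implementation; the function and argument names are ours:

```python
def is_spurious(satellite_at_formation, formation_dist_over_rvir,
                dm_fraction_at_formation):
    """Return True if a subhalo would be flagged as non-cosmological
    (SubhaloFlag == 0), i.e. only if ALL three conservative criteria
    hold at the subhalo's formation time."""
    return (satellite_at_formation
            and formation_dist_over_rvir < 1.0
            and dm_fraction_at_formation < 0.8)

# A baryon-dominated clump that formed as a satellite deep inside its host:
print(is_spurious(True, 0.3, 0.1))   # flagged as spurious
# A genuine galaxy stripped of dark matter only AFTER formation:
print(is_spurious(False, 1.5, 0.9))  # not flagged
```

Note that all three quantities are evaluated at the formation time of the subhalo, which is why a physically stripped satellite (dark-matter poor today, but not at formation) is not flagged.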
2. Gas InternalEnergy Corrections. In all TNG simulations, the time-variable UV-background radiation field (Faucher-Giguère et al., 2009, FG11 version) is enabled only for z < 6. Therefore, the ionization state of the IGM above redshift six should be studied with caution, as the usual density-temperature relation will not be present. Two further technical issues exist for the original InternalEnergy field (i.e. gas temperature) of all TNG simulations. These have been corrected in post-processing, as described below, and the fiducial InternalEnergy field of all snapshots in all TNG simulations has been rewritten with updated values. The original dataset has been renamed to InternalEnergyOld for reference, although we do not recommend its use for any purpose.
The first issue is seen in the low-density, low-temperature regime of the intergalactic medium (IGM).
Here, due to a numerical issue in the TNG codebase related to the Hubble flow across gas cells, spurious energy injection could occur in underdense gas. In practice, this only affects extremely low density IGM gas in equilibrium between adiabatic cooling and photoheating from the background. The result is a slight upwards curvature in the usual (ρ, T) phase diagram. To correct this issue, we used one of the TNG model variant boxes (with side length 25 Mpc/h and 512³ resolution elements) which includes the fix for this issue. The adiabat was then identified in all TNG runs, as well as in the corrected simulation, by binning the density-temperature phase diagram and locating the temperature of peak gas mass occupancy as a function of density. A multiplicative correction f_corr, taken as the ratio between the corrected and uncorrected linear gas temperatures, is then defined and applied as a function of density, for gas with physical hydrogen number density n_H < 10^-6 a^-3 cm^-3. We further restrict the correction to the low-temperature IGM by smoothly damping f_corr to unity by 10^5 K, as log T_corr = log T_orig + log(f_corr) w(T_orig), with the window function w(T_orig) = 1 − [tanh(5(log T_orig − 5.0)) + 1]/2. This issue manifests only towards low redshift, and for simplicity and clarity we apply this correction only for z ≤ 5 (snapshots 17 and later).
The second issue arises for a very small fraction of low-temperature gas cells with T < 10^4 K, the putative cooling floor of the model. Here, due to a numerical issue in the TNG codebase related to the cosmological scaling of the energy source term in the Powell divergence cleaning of the MHD scheme (right-most term in Eqn. 21 of Pakmor and Springel, 2013), spurious cooling could occur in gas with high bulk velocity and large, local divergence error (|∇·B| > 0). In practice, this affects a negligible number of cells, which appear in the usual (ρ, T) phase diagram with temperatures less than 10^4 K and densities between the star-formation threshold and four orders of magnitude lower. To correct this issue we simply update the gas temperature values, for all cells in this density range with log(T [K]) < 3.9, to the cooling floor value of 10^4 K, near the background equilibrium value. As this issue also manifests only towards low redshift (being most problematic at intermediate redshifts 1 ≲ z ≲ 4), we likewise apply this correction only for z ≤ 5 (snapshots 17 and later).
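Schematically, the two post-processing corrections can be written as follows. This is a sketch based on the formulas above; the density cuts are omitted, and f_corr, which in practice is tabulated from the corrected variant box as a function of density, is taken as a given input:

```python
import math

def damping_window(log_T):
    # w(T) = 1 - [tanh(5(log10 T - 5.0)) + 1]/2: close to 1 well below
    # 10^5 K and smoothly dropping to 0 above it, which confines the
    # correction to the low-temperature IGM
    return 1.0 - (math.tanh(5.0 * (log_T - 5.0)) + 1.0) / 2.0

def first_fix(log_T_orig, f_corr):
    # log T_corr = log T_orig + log(f_corr) * w(T_orig); applied only to
    # gas with n_H < 1e-6 a^-3 cm^-3, for z <= 5 (snapshots 17 and later)
    return log_T_orig + math.log10(f_corr) * damping_window(log_T_orig)

def second_fix(log_T):
    # clamp spuriously cooled cells (log T < 3.9) back to the 10^4 K floor
    return 4.0 if log_T < 3.9 else log_T
```

For hot gas the window function vanishes, so the first fix leaves temperatures above ~10^5 K untouched regardless of f_corr.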
Note that for both issues, we have verified in reruns of smaller volume simulations, by applying the fix in correspondingly corrected TNG model variant simulations, that no properties of galaxies or of the galaxy population are noticeably affected by these fixes.
3. Unresolved ISM. The multi-phase model of the interstellar medium in TNG (the same as in Illustris) is a necessarily coarse approximation of a complex physical regime. In particular, the cold neutral and molecular phases of the ISM are not resolved in the current generation of cosmological simulations like TNG; giant molecular clouds (GMCs), the individual birth sites of massive star formation, and, for example, the resultant nebular excitation are likewise not explicitly resolved. Modeling observables which arise in dense ISM phases (e.g. CO masses) should therefore be undertaken with care.
The modeling of the star formation process is explicitly designed to reproduce the empirical Kennicutt-Schmidt relation, so the correlation between star formation rate and gas density, at the scale where this scaling is invoked, is not a predictive result. Star formation, as in all computational models of galaxy formation, proceeds at a numerical threshold density which is many orders of magnitude lower than the true density at which stars form. This threshold is n_H ≈ 0.1 cm^-3 in TNG, which may have consequences for the spatial clustering of young stars, as one example (Buck et al., 2018).
4. Resolution Convergence. Numerical convergence is a complex issue, and working with simulations at multiple resolutions can be challenging. Analysis which includes more than one TNG box at once (e.g. TNG100 and TNG300 together), or which explicitly uses multiple realizations at different resolutions, should carefully consider the issue of convergence. The degree to which various properties of galaxies, or of the simulation as a whole, are converged depends on the specific property, as well as the mass regime, redshift, and so on. For example, see Pillepich et al. (2018b) for convergence of the stellar mass functions of TNG100 and TNG300, and details on a simple 'resolution correction' procedure which may be desirable to apply, particularly when combining the results of multiple flagship boxes together into a single analysis.
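One simple way to implement this kind of resolution correction is to measure, in bins of mass, the median offset of a quantity between runs of the same volume at two resolutions, and rescale the lower-resolution values accordingly. The sketch below illustrates the general idea only, with our own function and argument names; see Pillepich et al. (2018b) for the actual procedure adopted for TNG:

```python
from statistics import median

def resolution_correction_factors(mass_low, q_low, mass_high, q_high, bin_edges):
    """Per-mass-bin multiplicative factors mapping a low-resolution
    relation q(mass) onto its high-resolution counterpart.

    mass_*, q_* : parallel lists of (mass, quantity) values from the
                  low- and high-resolution runs of the same volume.
    bin_edges   : monotonically increasing mass-bin edges.
    """
    factors = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        ql = [q for m, q in zip(mass_low, q_low) if lo <= m < hi]
        qh = [q for m, q in zip(mass_high, q_high) if lo <= m < hi]
        # fall back to no correction (factor 1) in empty bins
        factors.append(median(qh) / median(ql) if ql and qh else 1.0)
    return factors
```

Low-resolution values in each bin would then be multiplied by the corresponding factor before being combined with the high-resolution sample.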

Citation Request
To support proper attribution, recognize the effort of the individuals involved, and monitor ongoing usage and impact, the following is requested. Any publication making use of data from the TNG100/TNG300 simulations should cite this release paper (Nelson et al., 2019a) as well as the five works from the "introductory paper series" of TNG100/300, in any order. Any publication making use of data from the TNG50 simulation should cite this release paper, as well as the two introductory papers of TNG50, again in any order. Finally, use of any of the supplementary data products should include the relevant citation. A full and up-to-date list is maintained on the TNG website.

Collaboration and Contributions
The full snapshots of TNG100-1, and especially those of TNG300-1, are sufficiently large that it may be prohibitive for most users to acquire or store a large number. We note that transferring ∼1.5 TB (the size of one full TNG100-1 snapshot) at a reasonably achievable 10 MB/s will take roughly two days, increasing to roughly five days for a ∼4.1 TB full snapshot of TNG300-1. As a result, projects requiring access to full simulation datasets, or extensive post-processing computations beyond what is being made publicly available, may benefit from closer interaction with members of the TNG collaboration.
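The quoted transfer times follow from simple arithmetic, which a prospective user can adapt to their own connection speed:

```python
def transfer_hours(size_tb, rate_mb_s=10.0):
    # size in TB (10^12 bytes) divided by a sustained rate in MB/s
    # (10^6 bytes/s), converted from seconds to hours
    return size_tb * 1e12 / (rate_mb_s * 1e6) / 3600.0

print(round(transfer_hours(1.5), 1))   # one full TNG100-1 snapshot: ~42 hours
print(round(transfer_hours(4.1), 1))   # one full TNG300-1 snapshot: ~114 hours
```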
We also welcome suggestions, comments, or contributions to the data release effort itself. These could take the form of analysis code, derived data catalogs, etc. For instance, interesting data products can be released as a "supplementary data catalog". Fast analysis routines which operate on individual halos/subhalos can be integrated into the API, such that the result can be requested on demand for any object.

Future Data Releases
We anticipate releasing additional data in the future, for which further documentation will be provided online.

Rockstar and Velociraptor
We plan to derive and release additional group catalogs, based on the Rockstar (Behroozi et al., 2013) and Velociraptor (Elahi et al., 2011) algorithms, and will provide further documentation at that time. Such group catalogs will identify different subhalo populations than those found by the Subfind algorithm, particularly during mergers. The 'Consistent Trees' algorithm used to construct assembly histories also has fundamental differences to both LHaloTree and SubLink. This can provide a powerful comparison and consistency check for any scientific analysis. We also anticipate that some users will simply be more familiar with these outputs, or need them as inputs to other tools.

Additional Simulations
The flagship volumes of IllustrisTNG (TNG50, TNG100, and TNG300) are accompanied by an additional resource: a large number of 'TNG Model Variation' simulations. Each modifies exactly one choice or parameter value of the base, fiducial TNG model. The variations cover every physical aspect of the model, including the stellar and blackhole feedback mechanisms and aspects of star formation, as well as numerical parameters. They are invaluable in assessing the robustness of a physical conclusion to model changes, as well as in diagnosing the underlying cause or mechanism responsible for a given feature in the primary simulations. They were first presented in the Pillepich et al. (2018b) TNG methods paper, and used, for example, in Nelson et al. (2018b) to understand the improvement in OVI column densities, in Lovell et al. (2018) to study the impact of baryons on dark matter fractions, and in Terrazas et al. (in prep) to probe the origin of quenched galaxies in the TNG model.
Each of the ∼100 TNG model variants simulates the exact same 25 Mpc/h volume at a resolution approximately equivalent to that of the flagship TNG100-1. Individual halos can therefore also be cross-matched between the simulations, although the statistics are necessarily limited by the relatively small volume. We plan to publicly release these variations in the near future, and encourage those interested to get in touch in the meantime.
Finally, we anticipate that ongoing and future simulation projects will also be released through this platform in the future. Most notably, this includes the high-resolution TNG50 simulation (Nelson et al., 2019b;Pillepich et al., 2019), the third and final volume of the IllustrisTNG project, and potentially other simulations as well.

API Functionality Expansion
There is significant room for the development of additional features in the web-based API. In particular, for (i) on-demand visualization tasks, (ii) on-demand analysis tasks, and (iii) client-side, browser based tools for data exploration and visualization. We have two specific services which are anticipated to be developed in the near-term future and made available.
First, the on-demand generation of 'zoom' initial conditions (ICs), for individual galaxies/halos, based on any object of interest selected from any simulation box. This will allow a user to select a sample of galaxies, perhaps in analogy to an observed sample, or with a peculiar type of assembly history, and obtain ICs for resimulation. Such resimulations could use other codes or galaxy formation models, or explore model parameter variations, to assess how such changes affected a particular galaxy/halo, or class of galaxies/halos. As IC generation will take several minutes at least, it does not fit within our current framework of 'responses within a few seconds', and therefore requires a task-based work queue with delayed execution and subsequent notification (e.g. via email) of completion and the availability of new data products for download.
Second, the on-demand execution of longer running analysis tasks, with similar notification upon completion. Specifically, the ability to request SKIRT radiative-transfer calculations for specific galaxies/halos of interest, leveraging the development efforts of Rodriguez-Gomez et al. (2019). Other expensive mocks, such as spectral HI datacubes (with MARTINI; Oman et al., 2019), X-ray datacubes, or intergalactic quasar absorption sightlines, can similarly be generated.
We welcome community input and/or contributions in any of these directions, or comments related to any aspect of the public data release of TNG.

Architectural and Design Details
In the development of the original Illustris public data release, many design decisions were made, including technical details related to the release effort, based on expected use cases and methods of data analysis. Nelson et al. (2015) discusses the architectural goals and considerations that we followed and continue to follow with the IllustrisTNG data release, and contrasts against other methodologies, as implemented in other astrophysics simulation data releases. We refer the reader to that paper and present only a few updates here.

Usage of the Illustris Public Data Release
Since its release, the original Illustris public data release has seen widespread adoption and use. To date, in the three and a half years since launch, 2122 new users have registered and made a total of 269 million API requests, including 2.7 million 'mock FITS' file downloads. For the flagship Illustris-1 run, a total of 1390 full snapshots, 6650 group catalogs, and 180 merger trees have been downloaded. 26 million subhalo 'cutouts' of particle-level data, and 3.1 million SubLink merger tree extractions, have been requested. The total data transfer for this simulation to date is 2.15 PB. Roughly 3100 subbox snapshots of Illustris-1 have been downloaded. The next most accessed simulation is Illustris-3, likely because it is included in the getting started tutorials as an easy, lightweight alternative to Illustris-1. Since launch, there has been a nearly constant number of ∼100−120 active users, based on activity within the last 30 days.
To date, 163 publications have directly resulted from, or included analysis results from, the Illustris simulation. While early papers were written largely by the collaboration itself, recent papers typically do not involve members of the Illustris team, representing widespread public use of the data release. Of the 10 most recent papers published on Illustris, only one was from the team. Given the significantly expanded scope of TNG with respect to Illustris, as well as the relatively more robust and reliable physical model and outcomes, we expect that uptake and usage will be similarly broad.

New JupyterLab Interface
In the original Illustris data release, we promoted two ways to work with the data: either downloading large simulation data files directly (referred to above as 'local data, local compute'), or by searching and downloading data subsets using functionality in the web-based API ('remote data, local compute'). Previously, the backend was focused solely on storage and data delivery, and did not have any system in place to allow guest access to compute resources which were local to the datasets themselves. For the TNG data release we have developed this functionality.
We label this newly introduced, third method of working with the data 'remote data, remote computation'. Technically, we make use of JupyterHub to manage the instantiation of per-user JupyterLab instances. These are spawned inside containerized Docker instances (Merkel, 2014) to isolate the user from the host systems; Singularity (Kurtzer et al., 2017) could be used in the future. Read-only mounts to the parallel filesystems hosting simulation data are provided, while the user home directory within the container is made persistent by volume mapping it onto the host. Resource limits on CPU, memory, and storage are controllable and will be adjusted during the initial phase of this service as needed.
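A deployment of this kind can be configured, in broad strokes, as in the following jupyterhub_config.py fragment. It uses the real dockerspawner package, but the image name and filesystem paths are invented for illustration and are not the actual TNG deployment values:

```python
# jupyterhub_config.py (illustrative fragment; names and paths are hypothetical)
c.JupyterHub.spawner_class = 'dockerspawner.DockerSpawner'
c.DockerSpawner.image = 'tng-lab:latest'   # hypothetical analysis image

c.DockerSpawner.volumes = {
    # persistent per-user home directory, volume-mapped onto the host
    'jupyterhub-user-{username}': '/home/jovyan',
    # read-only mount of the filesystem hosting the simulation data
    '/virgo/simulations/IllustrisTNG': {'bind': '/home/tng-data', 'mode': 'ro'},
}

# adjustable per-user resource caps
c.Spawner.mem_limit = '8G'
c.Spawner.cpu_limit = 2.0
```

The read-only mount is what gives user code "fully local" access to the snapshots and catalogs without any possibility of modifying them.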
The JupyterLab instances themselves provide a familiar environment for the development and execution of user analysis programs. Over the past few years there has been significant development of remote, multi-user, rich interfaces to computational kernels, and JupyterLab (the successor to the Jupyter Notebook, which itself grew out of IPython; Pérez and Granger, 2007) is the mature, full-featured solution we deploy. These instances are launched, on demand, inside the sandboxed containers, through a web-based portal with authentication integrated into the existing user registration system of the data release. We anticipate that this will be a particularly interesting development for researchers who would otherwise not have the computational resources to use the simulation data for their science.

Retiring the Relational Database
In the original Illustris data release we noted that the read-only, highly structured nature of simulation output motivates different and more efficient approaches for data search, aggregation, processing, and retrieval tasks. The web-based API uses a representational state transfer architecture (REST, Fielding, 2000), and in TNG we continue to employ a relational database on the backend, although we made a design decision never to expose such a database to direct user query.
Looking forward, instead of bringing the object or group catalog data into a traditional database, one could employ a scheme such as bitmap indexing over HDF5, e.g. FastQuery (Byna et al., 2012; Chou et al., 2011), possibly combined with a SQL-compatible query layer (Wang et al., 2013). In this case, the database would be used only to handle light meta-data; fast index-accelerated search queries would be made directly against structured binary data on disk. This improvement would be largely transparent from the user perspective. Most obviously, it would remove a layer of complexity and the need to ingest billions of rows of group catalog data into a database. It would also enable a tighter coupling of search capabilities and on-disk data contents.
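To illustrate the idea of bitmap indexing, the following toy sketch (our own minimal pure-Python illustration, not FastQuery) precomputes one bitmap per value bin of a catalog column, so that a range query reduces to OR-ing a few bitmaps rather than rescanning the full column:

```python
# Toy catalog column: log10 stellar masses (invented values for illustration)
masses = [8.5, 9.2, 10.1, 11.7, 9.9, 10.8]
bins = [(8, 9), (9, 10), (10, 11), (11, 12)]

# Build one integer-backed bitmap per bin: bit i is set if object i
# falls in that bin. This is the (write-once) indexing pass.
bitmaps = []
for lo, hi in bins:
    bm = 0
    for i, m in enumerate(masses):
        if lo <= m < hi:
            bm |= 1 << i
    bitmaps.append(bm)

def query(lo_bin, hi_bin):
    """Indices of objects whose bin index falls in [lo_bin, hi_bin):
    a range query answered purely by bitwise OR over precomputed bitmaps."""
    combined = 0
    for b in range(lo_bin, hi_bin):
        combined |= bitmaps[b]
    return [i for i in range(len(masses)) if combined >> i & 1]

print(query(2, 4))   # all objects with log10 mass >= 10
```

Because the simulation output is read-only, the index never needs updating after it is built, which is precisely what makes such schemes attractive here.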
More efficient API standards such as GraphQL represent modern alternatives to REST, whereby users make specific, detailed requests to a single endpoint based on a well-defined query language and typed schema, rather than a number of generic requests to a diversity of endpoints. Resolving these declarative queries efficiently and directly on the simulation output data would unify many of these goals -a clear target for future development.

Summary and Conclusions
We have made data from the IllustrisTNG simulation project publicly available at the permanent URL: http://www.tng-project.org/data/. IllustrisTNG is a series of large-scale, cosmological simulations ideal for studying the formation and evolution of galaxies. The simulation suite consists of three volumes: TNG50, TNG100, and TNG300. Each flagship run is accompanied by lower-resolution realizations, and a dark-matter-only analog of every simulation is also available. The current data release includes TNG100 and TNG300 in their entirety; TNG50 will be publicly released in the future. Full snapshots, group catalogs (both friends-of-friends halos and Subfind subhalos), merger trees, high time-resolution subboxes, and many supplementary data catalogs are made available. The highest resolution TNG300-1 run includes more than ten million gravitationally bound structures, and the TNG100-1 volume contains ∼20,000 well-resolved galaxies at z = 0 with stellar mass exceeding 10^9 M_⊙. The galaxies sampled in these volumes encompass a broad range of mass, type, environment, and assembly history, and realize fully representative synthetic universes within the context of ΛCDM.
The total data volume produced by the Illustris[TNG] project is sizeable, ∼1.1 PB in total, all directly accessible online. We have developed several tools to make these data accessible to the broader community, without requiring extensive local computational resources. In addition to the direct data downloads, example scripts, web-based API access methods, and extensive documentation previously developed for the original Illustris simulation, we extend the data access functionality in several ways: namely, with new on-demand visualization and analysis functionality, and with the remote JupyterLab-based analysis interface. By making the TNG data publicly available, we aim to maximize the scientific return from the considerable computational resources invested in the TNG simulation suite.