
Real-time mixed reality display of dual particle radiation detector data

Radiation image calculation

An overview of the data pipeline from detector interaction to MR display is shown in Figure 2. The H2DPI system is set up on an optical plate to ensure repeatable positioning, and a more detailed description of the detector geometry can be found in the Methods section.

Figure 2

Overview of data processing from particle interaction to MR display. (a) Radiation interactions in the H2DPI detector system are observed with a SiPM array connected to a digitizer. (b) Acquired waveforms are sent periodically via fiber optics to the acquisition PC, where they are filtered and sent via WiFi to the HoloLens2. (c) The HoloLens2 projects the data into the user’s field of view. (d) Single events can be used to estimate the total count rate (neutrons and gamma rays) for each detector, which exhibits a characteristic shape depending on the azimuthal position of the source (e). The average count rates for each detector are given as input to a pretrained neural network (f) that predicts the azimuth of the source, displayed to the user as an arrow. (g) Double-scattering events in two different detector volumes allow reconstruction of the incidence cone of the detected particle with aperture angle \(\alpha\). (h) Particle classification of the individual events in the organic glass scintillators (OGS) by pulse shape discrimination distinguishes neutron-neutron from gamma-gamma incidence cones. Discrimination lines between the pulse shape clusters (white) were found by fitting a double Gaussian to each energy-equivalent slice and using the valley as the decision threshold. (i) Using a backprojection algorithm, the cones are superimposed to create the radiation image in angular space (azimuth \(\theta\), altitude \(\phi\)) for gamma rays or fast neutrons. A 3D projection of the image is then displayed to the user in MR.

The H2DPI consists of an array of detectors optimized for imaging, in particular 12 organic glass scintillator (OGS) rods for fast neutron and gamma-ray detection and a lattice of 8 CeBr3 inorganic scintillators for gamma-ray detection3. Radiation images are computed from double-scattering events in the H2DPI (Fig. 2g), i.e., all waveforms are filtered into event pairs that occurred within 30 ns of each other in two different detector volumes. These double events are then classified as gamma-ray or fast-neutron events via the single-event waveforms4 (Fig. 2h) and filtered to consider only neutron-neutron and gamma-gamma doubles. Each double-scatter event has an associated estimated incidence cone; the cones are superimposed to construct what is called a simple backprojection image3. By using either neutron-neutron or gamma-gamma events, neutron and/or gamma-ray images accumulated over time can be computed in angular space (Fig. 2i).
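As a concrete illustration of the pairing step, the following sketch filters a time-sorted list of single events into coincident pairs using the 30 ns window and the different-detector-volume condition described above. The event fields and function name are hypothetical, not the authors' implementation:

```python
COINCIDENCE_WINDOW_NS = 30.0  # pairing window from the text

def find_double_scatters(events):
    """Filter a time-sorted list of single events into double-scatter pairs.

    Each event is a dict with 'time_ns', 'detector_id', and 'particle'
    ('n' or 'g', assigned beforehand by pulse shape discrimination).
    Only same-species pairs (n-n or g-g) in different volumes are kept.
    """
    doubles = []
    for i, first in enumerate(events):
        for second in events[i + 1:]:
            if second['time_ns'] - first['time_ns'] > COINCIDENCE_WINDOW_NS:
                break  # list is time-sorted: no later event can pair with `first`
            if (second['detector_id'] != first['detector_id']
                    and second['particle'] == first['particle']):
                doubles.append((first, second))
    return doubles

events = [
    {'time_ns': 0.0,   'detector_id': 3, 'particle': 'n'},
    {'time_ns': 12.0,  'detector_id': 7, 'particle': 'n'},  # pairs with the first
    {'time_ns': 500.0, 'detector_id': 2, 'particle': 'g'},  # isolated single
]
print(len(find_double_scatters(events)))  # 1
```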

The digitized waveforms from the detector system (Fig. 2a) are sent to the acquisition PC (Fig. 2b) in arbitrary batch sizes (typically set to write a batch approximately every 2 seconds). Event filtering for the double-scattering incidence cones is performed here. Individual cones are then made available on a locally hosted server that the HoloLens2 accesses wirelessly (Fig. 2c). Sending individual cones instead of images was found to reduce the overall network load, allowing a constant flow of data from the detectors to the HoloLens.
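The text does not specify how the cone server is implemented; the sketch below is a minimal stand-in using only the Python standard library. It serves the cones accumulated after an index the client already holds, so each request transfers only new cones rather than whole images (the endpoint layout and `cone_store` name are assumptions):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory store of reconstructed cones; the acquisition loop would append
# to this list as new double-scatter events pass the coincidence filter.
cone_store = []

class ConeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /cones/<n> returns cones n..end as JSON, so a client that
        # already holds n cones only receives the new ones.
        start = int(self.path.rstrip('/').split('/')[-1])
        body = json.dumps(cone_store[start:]).encode()
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.end_headers()
        self.wfile.write(body)

# To run (blocking) on the acquisition PC:
#   HTTPServer(('0.0.0.0', 8000), ConeHandler).serve_forever()
```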

Real-time radiation data display in MR

The MR display on the HoloLens2 is achieved in a few steps using an application written in the Unity game engine25 that runs on the HoloLens. The first step is to provide the HoloLens with the position of the detector. 3D spatial mapping is a core feature of the HoloLens, and its application programming interface (API) provides developers with a spatial map of the surroundings. We further use the HoloLens API capability to scan a QR code in space and thereby locate a specific 3D coordinate in a given spatial map. The QR code is placed at a known position on the H2DPI surface relative to the center of the detector system (see Fig. 3a,d), enabling calculation of the reference position.

In parallel, the HoloLens uses the local WiFi network to query the server hosted on the acquisition PC and retrieve the detector data. After receiving the cones, the visualizer performs a simple backprojection to generate a radiation image texture in H2DPI space. The image is then rendered onto the spatial mesh using a pixel shader: for each pixel on the spatial mesh, the pixel's screen position is converted to reference coordinates and then to H2DPI spherical coordinates for sampling the radiation image. A colormap is applied to convey intensity, coloring a spatial mesh that is otherwise transparent. A more intense coloring of the mesh therefore indicates a more intense radiation image in that spatial direction.
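The sampling step can be sketched as follows: a point on the spatial mesh is converted to a direction from the detector, expressed in spherical coordinates, and used to index the backprojected angular-space image. This is an illustrative CPU-side version of what the pixel shader does, with assumed array shapes and names:

```python
import numpy as np

def sample_radiation_image(image, world_point, detector_pos):
    """Look up image intensity in the direction of a spatial-mesh point.

    `image` is the backprojected intensity over angular space with shape
    (n_altitude, n_azimuth); `world_point` and `detector_pos` are 3-vectors
    in the shared reference frame (names and shapes are assumptions).
    """
    d = world_point - detector_pos
    azimuth = np.arctan2(d[1], d[0])                # [-pi, pi]
    altitude = np.arcsin(d[2] / np.linalg.norm(d))  # [-pi/2, pi/2]
    n_alt, n_az = image.shape
    # Map angles linearly onto pixel indices (nearest-neighbour sampling).
    col = int((azimuth + np.pi) / (2 * np.pi) * (n_az - 1))
    row = int((altitude + np.pi / 2) / np.pi * (n_alt - 1))
    return image[row, col]
```

A colormap applied to the returned intensity then determines the pixel's color on the otherwise transparent mesh.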

On top of the spatial mesh shading, we iterate over each pixel in the backprojected texture and cast a ray from the H2DPI to account for the angular nature of the information in the image. If the ray's pixel intensity exceeds a certain threshold and the ray intersects the spatial mesh, a line is drawn between the H2DPI and the spatial mesh. All of the processing steps mentioned typically take less than a second, providing a visually smooth experience. Users can toggle the ray display or spatial mesh shading via a virtual menu on the wrist (i.e., when the user’s wrist is in the field of view of the HoloLens, a menu with virtual buttons for settings appears).

As an illustration of the resulting visual experience, Figure 3 shows photos taken with the HoloLens depicting two experiments. In the first example, we hide a 137Cs source under one of three covers (Fig. 3a). After approximately 30 seconds of acquisition time, the HoloLens displays either a colored spatial mesh (Fig. 3b) or rays (Fig. 3c) based on the backprojected image. At any time, the user can switch between the display of the gamma-ray or neutron image. Examples of neutron and gamma-ray images are shown in Fig. 3e,f, where a 252Cf source was set up in front of the detector system.

An important aspect of the hotspot color shading is the choice of colormap. We follow the rule of using a perceptually uniform sequential colormap26. We implement the standard perceptually uniform sequential colormaps from Matplotlib (viridis, inferno, magma, plasma, cividis)27 as options that the user can select in a virtual menu. The user can thus choose an appropriate colormap to maximize contrast against a given background, or to compensate for their ability to distinguish colors. For example, a colormap containing red (inferno) on a green background (e.g., grass) may not be informative for a user with deuteranomaly, who may choose to switch to a yellow/blue-based map (cividis).
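The colormap lookup itself is a simple normalization followed by an RGBA mapping. A minimal sketch using Matplotlib's colormap registry (requires Matplotlib >= 3.5; the intensity values are invented for illustration):

```python
import numpy as np
from matplotlib import colormaps

# Hypothetical 2x2 patch of backprojected intensities.
intensity = np.array([[0.0, 0.5],
                      [0.8, 1.0]])
normed = intensity / intensity.max()  # scale to [0, 1] for the colormap

# The article's options: viridis, inferno, magma, plasma, cividis.
# cividis avoids red/green contrast, a safer default for deuteranomaly.
rgba = colormaps['cividis'](normed)   # shape (2, 2, 4), channels in [0, 1]
```

Swapping the colormap for a given background or viewer then amounts to changing the registry key, which is what the virtual menu selection corresponds to.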

Figure 3

Example of mixed reality visuals seen by a user wearing the HoloLens2. (a) A 100 \(\mu\)Ci 137Cs source is set under one of the three covers on the front of the detector system. (b) In MR, the user sees the coloring of the spatial mesh after approximately 30 seconds of acquisition. (c) The user can choose to also display rays pointing at the hotspot. (d) A 1 mCi 252Cf source is placed in front of the system. (e) The gamma-ray image forms in about 20 seconds. (f) The user can switch to the (simultaneously acquired) neutron image via the wrist menu. After about a minute, a converged neutron image intersecting the expected source location can be seen.

Fast Source Search via Neural Networks

Double-scattering events are rare compared to single events, and in many cases a sufficiently converged image may not be formed within a given measurement time. To guide detector operators in moving the detector system to a better location, we propose using the single-interaction count rates, which are available throughout the measurement, to quickly predict the direction of the source (Fig. 2d). Due to the arrangement of the detector array, we expect a gradient in the count rates (Fig. 2e): detectors closer to the source exhibit higher count rates than detectors farther away, allowing statistical learning methods to predict the angle of incidence.

The count rate for each detector is estimated from a rolling average of the waveforms over a 1-2 second measurement window and is used as input to a trained neural network that outputs the estimated azimuthal position of the source. The neural network (training data and hyperparameters) is described in the Methods section. In MR, the user sees an arrow pointing in the direction of the predicted azimuth.
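The inference step amounts to a single forward pass over the per-detector count rates. The sketch below shows a plausible shape for it; the architecture, layer sizes, and output encoding are assumptions for illustration (the actual network is described in the Methods section), and the weights here are untrained placeholders:

```python
import numpy as np

def predict_azimuth(count_rates, w1, b1, w2, b2):
    """Forward pass of a small MLP mapping count rates to source azimuth.

    `count_rates`: rolling-average count rate per detector (20 values here:
    12 OGS + 8 CeBr3 -- an assumed input layout). The network outputs
    (cos, sin) of the azimuth so the prediction is continuous across the
    -pi/pi wraparound; all shapes and the encoding are hypothetical.
    """
    x = count_rates / count_rates.sum()   # normalize away total activity
    h = np.maximum(0.0, x @ w1 + b1)      # ReLU hidden layer
    c, s = h @ w2 + b2                    # predicted (cos, sin)
    return np.arctan2(s, c)               # azimuth in radians
```

Because the input is normalized to relative rates, the prediction depends only on the count-rate gradient across the array, not on the absolute source strength.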

Figure 4 shows examples of the MR visuals for source direction estimation for both 137Cs and 252Cf sources. Due to the small size of the neural network, the prediction takes less than 1 second and provides the expected results. Uncertainty quantification of the neural network predictions is beyond the scope of this study, so we present this result qualitatively as a proof of concept. Note that the neural network was trained on 252Cf simulations, yet the predictions are correct for both the 252Cf and 137Cs experiments. This shows promise for future work to elucidate the performance of neural networks, especially with regard to training on multiple sources, multiple distances, and elevation angles.

Figure 4

Example of mixed reality visuals seen by a user wearing the HoloLens2, using a neural network for fast source direction estimation. (a) A 100 \(\mu\)Ci 137Cs source is placed at three different locations around the detector, and the MR display shows yellow arrows representing the neural network’s prediction. (b) A 1 mCi 252Cf source is placed at different locations around the detector. In each case, the estimated direction of the source is predicted and displayed in less than 1 second.

Detection Limits of the Current H2DPI System

Double-scattering events are relatively rare, and their scarcity is a major limitation in achieving radiation images that converge according to some empirically determined threshold. To quantify this, Fig. 5 shows the double-scattering event rates for the 252Cf and 137Cs sources as a function of distance to the system. The single-event rate clearly follows an approximate inverse-square law with increasing distance, and we observe a corresponding drop in the double-scatter rate. We further indicate the approximate number of events required to produce a converged image (arbitrarily defined as 1000 cones based on the authors’ experience), showing that with the sources used here a converged image can be obtained in less than 1 hour only within an approximately 1-meter radius around the system. The H2DPI system will be upgraded in the future to include up to 64 OGS rods, increasing neutron detection efficiency ~10x and gamma-ray detection efficiency ~20x, thereby alleviating the convergence time issue4.

However, the sources used here are relatively weak. The double-event rate is expected to scale linearly with activity, allowing extrapolation to scenario-specific predictions. For example, most radiation accidents caused by lost or stolen sources in the late 20th century involved sources with activity >100 GBq28, more than three orders of magnitude stronger than the sources used in this work. Assuming the model fitted to our 252Cf data, such a source would form a converged image within a minute at distances of more than 10 meters, allowing a safe assessment of the source location even with the current system.
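The extrapolation combines the inverse-square distance fit with linear scaling in activity. The sketch below illustrates the arithmetic; the rate and distance values are invented for illustration and are not the measured data from Fig. 5:

```python
import numpy as np

# Hypothetical double-scatter rates (cones/minute) vs distance (m),
# constructed to follow the inverse-square trend described in the text.
distances = np.array([0.5, 1.0, 1.5, 2.0])
rates = np.array([40.0, 10.0, 4.4, 2.5])

# Least-squares fit of rate = A / d**2 (linear in the single parameter A).
A = np.sum(rates * distances**-2) / np.sum(distances**-4)

def minutes_to_converge(distance_m, activity_scale=1.0, n_cones=1000):
    """Time to accumulate the ~1000 cones used as a convergence threshold.

    `activity_scale` is the source activity relative to the measured one;
    the double-event rate scales linearly with activity.
    """
    rate = activity_scale * A / distance_m**2
    return n_cones / rate

# A source ~3 orders of magnitude stronger than the measured one, at 10 m:
t = minutes_to_converge(10.0, activity_scale=1000.0)
```

With these illustrative numbers the fitted rate constant is ~10 cones/minute at 1 m, so a 1000x stronger source at 10 m converges in roughly ten minutes; the actual figures depend on the measured rates in Fig. 5.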

Figure 5

Double-scattering interaction rate (per minute) vs. source-to-detector-system distance for the 100 \(\mu\)Ci 137Cs and 1 mCi 252Cf sources. An inverse-square-law model fit is shown for illustration, with the two adjacent lines representing the time required to achieve a converged image (1000 double-scattering events). Error bars are generally smaller than the markers.
