
eLetters is an online forum for ongoing peer review. Submission of eLetters is open to all. Please read our guidelines before submitting your own eLetter.
- RE: Response to Hetzer et al
- Samuel Patz, Department of Radiology, Brigham and Women’s Hospital, Boston, MA, USA
(10 April 2020)
Response to Letter to the Editor
Submitted by Hetzer et al. [1] regarding our Science Advances paper [2] titled:
Imaging localized neuronal activity at fast time scales through biomechanics.
Samuel Patz1,2, Daniel Fovargue3, David Nordsletten3,4 and Ralph Sinkus3,5
1Department of Radiology, Brigham and Women’s Hospital, Boston, MA, USA
2Harvard Medical School, Boston, MA, USA.
3School of Biomedical Engineering and Imaging Sciences, King's College London, London, UK
4Department of Biomedical Engineering and Cardiac Surgery, University of Michigan, Ann Arbor, MI, USA
5Inserm U1148, LVTS, University Paris Diderot, University Paris 13, Paris, France
Corresponding authors:
Samuel Patz: patz@bwh.harvard.edu
Ralph Sinkus: ralph.sinkus@inserm.fr
We appreciate Hetzer et al.'s thorough analysis of our work. This is a good example of how results are vetted in the scientific community. Hetzer et al. make critiques regarding the validity of our observations, leading them to claim that our observation of a functionally mediated change in brain tissue stiffness is unfounded. Below we provide a rebuttal of these serious critiques, as well as their more minor concerns, demonstrating that our conclusions are valid.
Statistics
The principal concern of Hetzer et al. is that our z-statistic criterion, |z| ≥ 1, does not provide sufficient significance. This conclusion, however, seems to stem from a misunderstanding of our analysis.
The average noise in our control experiments ranged from 0.5 to 0.8 kPa. The noise was determined by measuring the standard deviation over all 707 voxels of the control DG’ and DG” maps from individual animal studies, and we used these values as thresholds for the z-score. What we used to identify activated voxels, however, are single DG maps obtained by averaging over all animals, producing one experiment and one control average map. Specifically, datasets from individual experiments are first morphed to a Waxholm atlas and averaged before any differences are calculated. The purpose of the averaging operation is, of course, to reduce the noise, which then ranges from only 0.2 to 0.3 kPa, as documented in the Supplementary Material of our paper. Consequently, the threshold of 0.5 to 0.8 kPa corresponds to an effective z-score that ranges from 2.25 to 2.75 (see below for details), providing highly statistically significant results.
Figure 1 shows voxels satisfying the effective z-score criterion for DG’ and is equivalent to Figure 4a of our paper. This threshold yields a very low false positive rate in the control experiments and demonstrates neuromechanically activated regions consistently over all three experiments. In particular, the cingulate shows activation for DG’ at all timescales, and the motor cortex at the two slowest timescales. This anatomical correspondence across timescales does not hold for the viscosity (Figure 2) and is one reason why we do not consider the DG” regions significant. A second reason is that the number of false positives in the control DG” maps is of similar magnitude to that seen in the experiment DG” maps (Figure 2). Thus, while there may be viscous changes, our data are not of sufficient quality to show them unambiguously.
We conclude that the statistical significance of our results is valid and there is a change in G’ associated with neuronal activity.
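To illustrate why the averaging has this effect, here is a minimal numerical sketch (this is not our analysis code; the animal count and noise level are assumed purely for illustration). Averaging n independent noise maps reduces the voxelwise standard deviation by roughly 1/sqrt(n), which is what turns a 0.5 – 0.8 kPa single-animal threshold into an effective z-score of 2.25 – 2.75 on the averaged map:

```python
# Illustrative sketch only (not the reconstruction pipeline);
# the animal count and noise level are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_animals = 707, 7       # 707 brain voxels; ~7 animals per experiment (assumed)
sd_single = 0.6                    # kPa, typical single-animal control noise (assumed)

maps = rng.normal(0.0, sd_single, size=(n_animals, n_voxels))
avg_map = maps.mean(axis=0)        # the single averaged map used for detection

print(avg_map.std())               # ~ sd_single / sqrt(n_animals) ~= 0.23 kPa
print(sd_single / avg_map.std())   # effective z of a 0.6 kPa threshold ~= 2.6
```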
Outlier from SLOW dataset
This particular dataset posed problems in the thalamus region for the phase-unwrapping algorithm we used at the time of publication, which is why part of that brain was cropped by the mask. The neuromechanically activated regions for the SLOW experiment are not affected by this cropping. We have since resolved the unwrapping issue. Figure 3 shows the corresponding activation maps with the uncropped dataset added; as expected, they are identical to those presented in the paper. All other activation maps are, of course, unaffected by that dataset.
Communication
The editor from Science Advances contacted us in September 2019 about two different issues. Each time we were notified of an issue, we responded immediately, which was acknowledged by the editor. When we uploaded the code, we were unaware that we had left out one subroutine. We received no further communication, either from Science Advances or from Hetzer et al. Since receiving Hetzer et al.’s Letter to the Editor, we have made sure that all code is in the data repository.
Details
As mentioned, we based our z-score thresholds on the average noise (<SD>) over individual control experiments. After morphing to the Waxholm atlas and averaging over all animals to create a single DG’ and DG’’ map, the noise was significantly reduced in the final maps (SDavg-map), with the effective z-score now ranging from 2.25 to 2.75. Tables 1a and 1b provide details.
TABLE 1a. CONTROL DG’ SDs
                          SLOW     FAST     Ultra-FAST
<SD> (kPa)                0.552    0.718    0.517
SDavg-map (kPa)           0.202    0.321    0.202
<SD>/SDavg-map            2.75     2.25     2.60
TABLE 1b. CONTROL DG” SDs
                          SLOW     FAST     Ultra-FAST
<SD> (kPa)                0.603    0.810    0.560
SDavg-map (kPa)           0.234    0.340    0.216
<SD>/SDavg-map            2.60     2.40     2.65
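For completeness, the conversion from the tabulated ratios to two-tailed p-values can be sketched as follows (an illustration using the Table 1a values, not our analysis code):

```python
# Sketch: effective z-scores and two-tailed p-values from the Table 1a values.
from scipy.stats import norm

sd_single  = {"SLOW": 0.552, "FAST": 0.718, "Ultra-FAST": 0.517}  # <SD> (kPa)
sd_avg_map = {"SLOW": 0.202, "FAST": 0.321, "Ultra-FAST": 0.202}  # SDavg-map (kPa)

for name in sd_single:
    z = sd_single[name] / sd_avg_map[name]   # effective z-score of the threshold
    p = 2 * norm.sf(z)                       # two-tailed p-value
    print(f"{name}: z = {z:.2f}, p = {p:.4f}")
# SLOW: z ~= 2.73, p ~= 0.0063; FAST: z ~= 2.24, p ~= 0.0253; Ultra-FAST: z ~= 2.56, p ~= 0.0105
```

Up to rounding of the z-scores, these values agree with the p-values listed in Table 2 below.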
To demonstrate the very low p-value (i.e., low false positive rate) of our determination of activated voxels for DG’, see Figure 1. Notice that for the Controls there are no false positives for the SLOW and Ultra-FAST cases, whereas only 3 voxels appear for the FAST case. This shows that, using the actual data, the false positive rate for the FAST case is at most 3 out of 707 voxels, i.e., p = 0.0042. This experimentally determined false positive rate is even smaller for the SLOW and Ultra-FAST cases.
Figure 1. Using averaged DG’ maps, the voxels shown are those that satisfy the criterion |DG’| > 0.55, 0.72, and 0.52 kPa for SLOW, FAST, and Ultra-FAST, respectively. For the Control differences, no voxels satisfy the criterion in the SLOW and Ultra-FAST data, whereas only 3 voxels (arrow) appear in the FAST data. Color bars represent percentage differences as used in our paper.
Regarding the viscosity analysis, Figure 2 shows the corresponding data for DG”. Unlike for DG’, we do not see activated regions that are co-localized for adjacent timescales or “for all experiments”. Furthermore, the number of activated voxels in the experiment maps is of similar magnitude to the number of false positives in the control maps.
Figure 2. Using averaged DG” maps, the voxels shown are those that satisfy the criterion |DG”| > 0.60, 0.81, and 0.56 kPa for SLOW, FAST, and Ultra-FAST, respectively. Note that it is well known, and also borne out by our data, that error bars for G” are inherently larger than for G’.
Outlier Issue
Another concern of Hetzer et al. is that there is an outlier in the dataset. The original control dataset had image-construction issues, and one possible remedy was to try a different mask. Figure 3 shows that the activated ROIs for the SLOW experiment are unaffected by this cropped dataset. Raw data are processed to construct the wave images, and the wave images are then subjected to an inverse problem to reconstruct the elasticity maps; hence, small effects near mask boundaries are present. The corresponding data point in Figure 4E of the original manuscript for that particular animal and experiment therefore changes from 0.979 kPa to 0.983 kPa, a change of less than 0.5%.
Remaining concerns of Hetzer et al.
Student's t-test
As described above, the population variance is well determined because it is calculated from the difference distribution over all 707 brain MRE voxels. The use of the z statistic is therefore justified.
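As a simple numerical illustration of this point (a sketch only, assuming as stated above that the noise SD is estimated from all 707 voxels), the Student t critical value with 706 degrees of freedom is essentially indistinguishable from the normal (z) critical value:

```python
# Sketch: with ~707 voxels contributing to the SD estimate, the t and z
# critical values at alpha = 0.05 (two-tailed) are essentially identical.
from scipy.stats import norm, t

alpha = 0.05
print(norm.ppf(1 - alpha / 2))        # 1.960
print(t.ppf(1 - alpha / 2, df=706))   # 1.963
```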
Negative viscosity
Hetzer et al. do not quote the entire comment from the code, which is: “Data from Dec 30 2015 has negative viscosity due to image reconstruction parameters”.
For this case, the real and imaginary parts of the dataset were flipped, which acts like a reversal of time (hence the “-1” in the file). Consequently, as expected, the viscosity turns negative. This sign flip is, of course, undone downstream in the processing pipeline.
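A minimal toy example (not our reconstruction code; the modulus value is purely illustrative) shows that interchanging the real and imaginary parts of the harmonic fields leaves G’ intact but reverses the sign of G”, and that a simple multiplication of G” by -1 restores the original modulus:

```python
# Toy example only: interchanging real and imaginary parts of harmonic fields
# acts like a time reversal and flips the sign of the loss modulus G''.
import numpy as np

def swap_real_imag(z):
    return z.imag + 1j * z.real                 # "flip" real and imaginary parts

G_true = 3.0 + 1.0j                             # illustrative G* = G' + i G'' (kPa)
strain = np.exp(1j * np.linspace(0, 2 * np.pi, 8, endpoint=False))
stress = G_true * strain                        # harmonic stress for this G*

G_flipped = np.mean(swap_real_imag(stress) / swap_real_imag(strain))
print(G_flipped)                                # (3-1j): G' unchanged, G'' negative
print(G_flipped.real - 1j * G_flipped.imag)     # G'' multiplied by -1 restores (3+1j)
```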
Multiple comparisons
In our case, the three experiments, SLOW, FAST, and Ultra-FAST, were completely separate, each obtained from a different set of MRI scans. A multiple-comparison correction would apply only if the same data were used in multiple tests. The two parameters for each stimulus switching speed, G’ and G”, are derived from the same datasets; hence, the multiple-comparison factor is 2. To achieve a false positive rate of less than 5%, we need to set p = 0.025. Table 2 shows that all results remain significant.
TABLE 2.
                                SLOW     FAST     Ultra-FAST
z-score (from Table 1a)         2.75     2.25     2.60
p-value (2-tailed)              0.006    0.024    0.0093
Multiple comparisons            2        2        2
Corrected threshold (p=0.05/2)  0.025    0.025    0.025
We further note that even if one regards all six measures (DG’ and DG’’ at three different stimulus switching rates) as multiple comparisons, applying the Benjamini-Hochberg procedure to all six p-values still leaves every result significant at the p = 0.05 level.
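For reference, this Benjamini-Hochberg check can be sketched as follows, using the two-tailed p-values implied by the z-scores in Tables 1a and 1b (illustrative code, not our analysis pipeline):

```python
# Sketch: Benjamini-Hochberg step-up procedure over the six p-values implied
# by the z-scores in Tables 1a/1b (3 x DG' and 3 x DG'').
import numpy as np
from scipy.stats import norm

z_scores = np.array([2.75, 2.25, 2.60,   # DG' : SLOW, FAST, Ultra-FAST
                     2.60, 2.40, 2.65])  # DG'': SLOW, FAST, Ultra-FAST
p = 2 * norm.sf(z_scores)                # two-tailed p-values

alpha, m = 0.05, p.size
p_sorted = np.sort(p)
passed = p_sorted <= alpha * np.arange(1, m + 1) / m   # BH step-up criterion
k = passed.nonzero()[0].max() + 1 if passed.any() else 0

print(p_sorted)                          # all six p-values, sorted
print(k)                                 # hypotheses declared significant -> 6
```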
Excitation frequency
This critique is correct. We should have stated: “…to ensure a minimum of approximately 10 pulses per ON state.”
Registration procedure
The manual registration was also performed with the MRE magnitude images. As at the beginning of the registration description, we used “elastogram” synonymously with “MRE magnitude image” in the context of registration.
Pulse frequencies
We are aware of this issue. Note that we use “pulse frequency” to denote the frequency of the electrical stimulation pulses during the stimulus ON time. What Hetzer et al. are referring to here is the stimulus switching rate, where we alternate between two stimulus states every 9 seconds (SLOW), 0.9 seconds (FAST), or 0.1 seconds (Ultra-FAST). While it certainly makes sense to define one period as the time taken for one full cycle of stimulus ON and OFF, we chose the term “stimulus switching rate” to mean the inverse of the time spent in one stimulus state, since this is the rate at which one switches between stimuli.
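The two conventions can be made explicit with a few lines of arithmetic (an illustrative sketch only):

```python
# Sketch of the two frequency conventions discussed above.
t_state = {"SLOW": 9.0, "FAST": 0.9, "Ultra-FAST": 0.1}   # seconds per stimulus state

for name, t in t_state.items():
    switching_rate = 1.0 / t        # inverse of the time spent in one state (our usage)
    cycle_freq = 1.0 / (2.0 * t)    # inverse of a full ON+OFF period (Hetzer et al.)
    print(f"{name}: switching rate = {switching_rate:.3g} Hz, "
          f"cycle frequency = {cycle_freq:.3g} Hz")
# SLOW: 0.111 vs 0.0556 Hz; FAST: 1.11 vs 0.556 Hz; Ultra-FAST: 10 vs 5 Hz
```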
References
[1] S. Hetzer, C. Allefeld, S. Hirsch, P. Boehm-Sturm, and U. Dirnagl, “Letter to the Editor,” Sci. Adv., online eLetter, Apr. 2020.
[2] S. Patz et al., “Imaging localized neuronal activity at fast time scales through biomechanics,” Sci. Adv., vol. 5, no. 4, p. eaav3816, Apr. 2019, doi: 10.1126/sciadv.aav3816.
Competing Interests: None declared.
- Response to Imaging localized neuronal activity at fast time scales through biomechanics
- Stefan Hetzer, Berlin Center for Advanced Neuroimaging Charité – Universitätsmedizin Berlin
(10 April 2020)
Letter to the Editor
Stefan Hetzer1,2, Carsten Allefeld3, Sebastian Hirsch1,2, Philipp Boehm-Sturm4,5, Ulrich Dirnagl4,6
1Berlin Center for Advanced Neuroimaging, Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany
2Bernstein Center for Computational Neuroscience, Berlin, Germany
3Department of Psychology, City, University of London, London, UK
4Department of Experimental Neurology and Center for Stroke Research, Charité – Universitätsmedizin Berlin, Berlin, Germany
5NeuroCure Cluster of Excellence and Charité Core Facility 7T Experimental MRIs, Charité – Universitätsmedizin Berlin, Berlin, Germany
6QUEST Center, Berlin Institute of Health, Berlin, Germany
Correspondence address:
Dr. Stefan Hetzer
Berlin Center for Advanced Neuroimaging
Charité – Universitätsmedizin Berlin
Charitéplatz 1
10117 Berlin, Germany
stefan.hetzer@charite.de
Tel +49 30 450 539353
Competing Interests: None.
Dear Professor Thorp,
We appreciate your journal’s policy embracing the open data principle, and thank you for your help in accessing data and code for the paper recently published in Science Advances by Patz et al. (1).
Given the authors’ claim to have developed a new methodology which uses magnetic resonance elastography (MRE) to track neuronal activity on time scales inaccessible to conventional BOLD functional imaging, we feel that a more thorough and critical data analysis methodology would have been required to substantiate the extraordinary claims made in this article. In this letter we would like to expose severe methodological weaknesses of the publication, along with outlining some minor flaws, which together invalidate the authors’ claim that they have presented a ‘new methodology for noninvasively tracking brain functional activity at high speed’.
Despite multiple requests and the authors’ attempts to comply with Science Advances’ open data policy, the processing pipeline provided by the authors is not executable due to a missing sub-function (expandOrShrinkMask). However, we found that this sub-function has only a negligible influence on the results, at least with respect to our purpose: to check how the results change if significance levels are used that are more commonly agreed upon in the fMRI community.
The result of our reanalysis is that neither the data nor the statistical analysis by Patz et al. support the interpretation given by the authors, namely that neuronal activity was detected in their experiments delivering painful stimuli to mice.
Figure 1
First, our main concern is the very unusual choice of thresholding the computed statistic at |z| > 1 (combined with a cluster threshold of k > 2). This choice corresponds to a significance level of p = 0.32, which is much higher than the commonly used significance levels between 0.05 and 0.001 (2), and which means that by chance alone in the absence of any stimulus one would expect “significant” effects in 32 % of the voxels. Moreover, there is general agreement in the neuroimaging community that fMRI statistical tests need to be corrected for multiple comparisons (3).
We were able to reproduce the pattern of activations reported by Patz et al. using their threshold of |z| > 1 (see Fig. 1 left, 1st and 2nd row). However, when we increase the threshold to |z| > 1.96, corresponding to the minimally acceptable significance level of p < 0.05, no activation is detected for any of the three contrasts (Fig. 1 left, 3rd and 4th row). In summary, using state-of-the-art statistical thresholds as used by the fMRI community, not a single significant signal change in response to the stimulus could be found.
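The two-tailed tail probabilities behind these thresholds are easily verified (an illustrative snippet, not part of the reanalysis code):

```python
# Two-tailed p-values corresponding to the |z| thresholds discussed above.
from scipy.stats import norm

for z_thr in (1.0, 1.96):
    print(f"|z| > {z_thr}: p = {2 * norm.sf(z_thr):.3f}")
# |z| > 1.0: p = 0.317   |z| > 1.96: p = 0.050
```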
Figure 2
Second, the group analysis is based on data that includes an obvious outlier with respect to the elastogram of one mouse brain scanned in the first experimental condition (slow electric stimulation). As shown in Fig. 2A, roughly 70% of this brain was removed from further analysis by the brain mask, but this selective use of data was not reported in the article. It is surprising that this outlier was not detected during the manual inspection as part of the registration process on page 9: “[…] a manual ±1 voxel adjustment was sometimes made to best align the ventricles.”
When this outlier is removed, the statistical analysis with |z| > 1 shows “activation” patterns similar to those reported (Fig. 2B), but with two new clusters appearing. Not surprisingly, after applying an appropriate threshold (p < 0.05), none of the activations survives.
Third, there are two false statements made in the paper:
a) The authors state: “Although both G′ and G″ were reconstructed, significant differences were only observed for G′ for all experiments.” However, full inspection of the results shows several “activations” (at |z| > 1) across the brain in the viscosity (G″) contrast, in the fast and ultra-fast condition (see Fig. 3).
Figure 3
b) In order to demonstrate that “a common mechanical brain response to neuronal activity” (page 3) is detected by significant signal changes during the experimental condition (with stimulation), the absence of such significant signal changes during the control condition (without stimulation) is required. In line with that, the authors state on page 6: “As expected, control scans show no stiffness changes”. However, full inspection of the results clearly shows an “activation” (|z| > 1) in the right ventricle for the fast control condition.
After careful consideration of all information gathered during our attempt to reanalyze the data we must express serious concerns regarding the scientific validity of the paper by Patz and colleagues. In addition to the points made above, we attached a list of several minor problems with the paper that we would be happy to address in a technical comment.
Best Regards,
Stefan Hetzer and colleagues
Minor concerns
- Interestingly, when the authors were asked to make their code available, they not only uploaded the code to the journal website, but also replaced the previously uploaded original dataset. Comparing this new data with the original data showed that all files were binary identical except for the files containing the outlier referenced in our Fig. 2, which were replaced with new data under the same filename (recon_1st-m1_24May2016_slow-fmre.mat; recon_2nd--m1_24May2016_slow-fmre.mat).
- In the code, there are two comments pointing at an MRE quantification problem that was not reported in the manuscript: “Data from Dec 30 2015 has negative viscosity”. Negative viscosity is physically impossible. Later in the code, individual viscosity maps with this comment were simply multiplied by -1.
- If the population variance is unknown and the sample size is not large (N<30), the Student's t-test is more appropriate than the z-test used by Patz et al. with N=6±1 in the three experimental conditions.
- When correcting for multiple comparisons the number of experimental conditions (slow, fast, ultra-fast) and the number of viscoelasticity measures explored (stiffness and viscosity) should be considered.
- On page 3 the authors describe the fast experimental condition: “Here, the electrical stimulation was pulsed at 10 Hz to ensure a minimum of 10 pulses per ON state.” Arithmetically, a stimulus-ON period of 0.9 s allows for a maximum of 9 pulses at a frequency of 10 Hz.
- In the description of the registration procedure on page 9, an explanation is missing as to why anatomy/magnitude images (sharper features, contrast similar to the target atlas image) were used only for the automated registration, while elastograms were used for the manual registration step: “This manual adjustment was performed on one of the elastograms from each experiment and applied to the remainder from the same experiment.” Furthermore, on what basis was it decided whether to use the first or the second elastogram for the manual registration?
- The pulse frequencies given in the paper should be corrected to 0.056 Hz, 0.56 Hz, and 5 Hz. By definition, the frequency is the reciprocal of the period, i.e., the duration of one cycle of a repeating event (the time between two consecutive pulses).
References
1. S. Patz, D. Fovargue, K. Schregel, N. Nazari, M. Palotai, P. E. Barbone, B. Fabry, A. Hammers, S. Holm, S. Kozerke, D. Nordsletten, R. Sinkus, Imaging localized neuronal activity at fast time scales through biomechanics. Sci. Adv. 5, eaav3816 (2019).
2. J. Carp, The secret lives of experiments: Methods reporting in the fMRI literature. NeuroImage 63, 289–300 (2012).
3. L. Lyon, Dead salmon and voodoo correlations: should we be sceptical about functional MRI? Brain 140, e53 (2017).
Competing Interests: None declared.