
September 14, 2018

Comments


I stopped reading after:
"Sadly however the code and data behind the analysis have not yet been released..."

What a sham. "Lies, Damn Lies and Statistics" indeed.

No data => BS results. Let us see the raw data and we will make up our own minds.

Where were all the bodies at the morgues? Nobody went there to check.

Is it good practice to give the central estimate to 4 significant figures when the CI range is about 40% of the central figure? The middle column is surely more fairly summarised as "2100 + or - 200 deaths".
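The rounding this commenter suggests can be sketched numerically: report the point estimate (and margin) only to the precision supported by the interval's half-width. A minimal Python sketch, using hypothetical numbers rather than figures from the report:

```python
import math

def round_to_ci(estimate, lo, hi):
    """Round an estimate and its margin to one significant figure
    of the confidence interval's half-width."""
    half_width = (hi - lo) / 2
    # One significant figure of the half-width sets the step, e.g. 225 -> 100
    step = 10 ** math.floor(math.log10(half_width))
    return round(estimate / step) * step, round(half_width / step) * step

center, margin = round_to_ci(2098, 1872, 2322)
print(f"{center} +/- {margin}")  # 2100 +/- 200
```

Reporting "2100 ± 200" rather than a four-digit point estimate avoids implying precision the interval does not support.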

It's sad that the code and data for one analysis were not released, as it gives people who would deny the results no matter what a seemingly legitimate reason to do so (as seen in the statement by the first poster above, who failed to read far enough to see this):

These results are consistent in scale with another earlier study by Nishant Kishore et al. (The data and R code behind this study is available on GitHub.)

So let's compare the two reports. The GWU report makes the headline (https://edition.cnn.com/2018/08/28/health/puerto-rico-gw-report-excess-deaths/index.html) :

"Puerto Rico's new Hurricane Maria death toll is 46 times higher than the government's previous count"

The GWU report claims to be consistent with the previous “Mortality in Puerto Rico after Hurricane Maria” (https://www.nejm.org/doi/full/10.1056/NEJMsa1803972). But that one claims:

“...the number of excess deaths related to Hurricane Maria in Puerto Rico is more than 70 times the official estimate.”

OK… I understand the GWU report claims to focus on the direct causality between Maria and mortality. But the headlines are frustrating and make the “scientific community” appear incapable of being consistent.

We have advocated for “reproducible research” within the R community for years. Until I see the data and methodology, I will continue to “deny the results”.

@ John Brooks: Certainly releasing the data (and code) would be more transparent, but the study (linked above) is pretty detailed in its methodology explanation. IMO, the "Methods" section (in the actual report, pages 3-7) includes more details and is better written than most peer-reviewed publications.
To immediately conclude that "no data => BS results" seems unfair. I would rather have a clear and thorough methods section than the raw data.
Plus, what's more likely: a "6 to 18" death toll (as reported by the US president) or a ~3,000 death toll?

Vastly boring and not worthy of a technology blog - keep politics out of this blog.

@Jordan Erickson: Fair enough, my initial comment was harsh.

Unfortunately, this issue has been politicized, mainly due to the President's tweets. The official number of 64, which was (per the above report) attributed to direct causes from the hurricane, is probably still valid. But residual, collateral, or lingering effects are much more amorphous. For example, veterans of the Vietnam War are still dying from the effects of exposure to Agent Orange... shall these be considered "excess deaths"? Is there a time limit for attributing a death to an event...?

The term's focus on "excess", and the implication thereof, seems to be lost in common discourse.

I do hope to be able to review the actual data and methodology at some point. In any case, the report makes recommendations that are applicable to any jurisdiction facing a natural disaster.

@Jordan Erickson: Looks like I was responding to your post before your edit... I agree, keep politics out of a tech blog. But the OP opened the discussion with a political statement:

"President Trump is once again causing distress..."

I'm finished. :) Have a good day.

One thought I have about the methodology is that they are comparing year on year deaths to determine what the excess deaths were due to the hurricane. This implies that the control values would have stayed the same year on year. That is where I have a problem with the study. (Concern heightened by not having data or code released...)

If the island nation went into default in May and the government was unable to pay creditors, what effect did that have on hospitals, medical coverage and access to medicine? Certainly this kind of variable would have an impact on the quality of healthcare year on year. How was this impact determined? What percentage of the deaths would be attributable to this change that began months before the hurricane? What impact did this have on the government infrastructure before and after the hurricane? What is due to the weather event and what is due to the crash?

From the article and the study notes it did not seem that this was taken into account. Those facts, more than the politics, make me question the conclusions.

Not sure why you have to make this political. Please keep politics out of technology blogs. I won't be clicking on links to Revolutions blogs as a result.

"Excess deaths" is calcuated as actual deaths minus expected deaths from Sep 2017 to Feb 2018.

However, what I don't see in the study is the accuracy of their expected-deaths forecasts prior to Sep 2017. If their expected-deaths forecasts have a high mean absolute percentage error, then their residuals (or "excess deaths") post-Sep 2017 are obviously going to be high.
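The calculation at issue is simple enough to sketch: excess deaths are the summed residuals of an expected-deaths forecast, and the forecast's quality can be checked on pre-hurricane months via its mean absolute percentage error (MAPE). All numbers below are made up for illustration, not drawn from either study:

```python
def excess_deaths(actual, expected):
    """Sum of actual-minus-expected deaths over the period."""
    return sum(a - e for a, e in zip(actual, expected))

def mape(actual, forecast):
    """Mean absolute percentage error of a forecast, in percent."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical monthly deaths for a pre-hurricane validation window...
pre_actual   = [2350, 2280, 2410, 2300]
pre_forecast = [2300, 2330, 2360, 2340]
print(f"pre-hurricane MAPE: {mape(pre_actual, pre_forecast):.1f}%")  # 2.0%

# ...and for the Sep 2017 - Feb 2018 study period
post_actual   = [2900, 3050, 2700, 2600, 2500, 2450]
post_expected = [2320, 2350, 2400, 2450, 2500, 2420]
print(f"excess deaths: {excess_deaths(post_actual, post_expected)}")  # 1760
```

A low MAPE on the held-out pre-hurricane months would address the commenter's concern; a high one would mean much of the "excess" could be forecast error rather than hurricane-related mortality.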

The comments to this entry are closed.
