Fundraiser: Stardew Valley

My thought process for choosing this game was pretty straightforward: fall is a time of harvest, some sort of game where I’m harvesting makes sense, so why not play Stardew Valley? I’ve fallen head over heels for sandbox games, yet never played Harvest Moon. And Stardew Valley is an indie game originally made by one person. It ticks every box.

Alas, it also ticks one more: colonialism. The game gives you a farm once owned by a grandparent, for free, and allows you to develop the place into whatever farm or mining business you want. The context and location sound very North American, which implies there were once First Nations people on that land (though apparently the game actually takes place in Russia?). In reality, if you’re stuck in a soul-sucking job there’s no shortage of options to wiggle out, like starting up an art collective or striking for improved work conditions; in-game, though, your “choice” is either to remain in stasis or move into someone else’s place. And the game never actually presents that choice: the moment you gain control of your character is the moment after you’ve taken over this new space. To add insult to injury, you’re still not free of capitalism; oh no, you’ll spend a fair bit of time finding ways to earn cash to trade for goods and services, planting the seeds of your old society in this new space.

This doesn’t make the game any less fun, but the colonialism lingers over it like a faint smell. On Saturday, December 5th, at 10:00 AM MST, I’ll be both enjoying and dissecting this game on my Twitch channel. If you like the concept, consider donating to our fundraiser. It’ll help pay off the legal fees we’re still paying thanks to Richard Carrier. Alternatively, toss some money at Skepticon to pay off their Carrier-related bills. No funds? Not a problem, though you might want to read this to get into the spirit of things.

Fundraiser: What’s So Scary about Climate Change?

There’s a sort of inevitability that’s common to all horror. When watching Jason or the xenomorph, you’re already certain people will die. The real question is who, when, and how. Sometimes it’s fairly obvious (that lone person in the vent is a goner), in which case the focus is more on how everyone else copes with the loss. Other times, the death comes out of nowhere; Annie Phillips steps into a car on a sunny day and chats up the driver, only to have her throat graphically slit a few minutes later.

This is a lot like climate change. The basic physics has been known since the 1890s, so there’s no debate over whether it’s happening (unless you’re the Mayor, of course). The focus is on who, when, and how. Desert regions will get hotter, but maybe plants will also get more poisonous? Tinkering with an entire planet will inevitably lead to unforeseen or odd consequences, most of which will be unwelcome.

So Abe Drayton has joined with PZ Myers, Joshua Johnson, and a sacrificial lamb (me!) to discuss the spooky side of climate change. It’ll be premiering in 45 minutes over on my YouTube channel. It’s just one of many things we’re doing to help pay off our legal bills, so if you have some spare candy please consider donating it to us or Skepticon.

The View from The Street

Whenever mass protests arise, I’m always indebted to the people and protesters who stand right in the thick of it. Hunter Walker, for instance, gave me quite a bit of insight into the Washington, D.C. protests. For the Portland, Oregon protests, I got lucky: someone on this very network has been covering them.

1. yes, we’ve always had a few asshats in the crowd doing asshat-y things like throwing fireworks.

2. We actually didn’t have any of that last night, to the point where there was not even a single instance of coordinated banging on the fence to make noise (and not to damage the fence). Like, this shit was peaceful. 100% peaceful. No excuses peaceful. I was actually surprised we could get more than 1500 people down there for a protest like this, with real, legitimate grievances that would anger any caring heart, and have no one engaging in any of the behaviors that they’ve used to justify past attacks. No one at all. I was so fucking proud of us before the tear gas flew and chaos came down. This shit wasn’t even 1% on the protesters. This shit was all on the feds. All of it.

and,
3. Holy fuck, those assaults last night were BAD. Really bad. Mega bad. Even, if you’ll pardon the pun, MAGA BAD.

Crip Dyke has been on the case, which is amazing when you realize her ‘nym is quite literal.

And now we’re back where we started, with me telling you about the decision I had to make to stay and possibly be pushed away from the car, and because of my slower ability to flee inevitably coming into contact with cops that I **know** assault crutch users as if they were armed. If I fell, would I even be able to get up? Especially if the club was aimed at an arm or wrist?

I talk with BFF and she’s scared. We haven’t been together, but she has her own scary stories about how aggressive the cops have been tonight. She convinces me to get in the car. We’re sitting. We’re talking. We make the decision. We leave.

I felt bad retreating with others still facing the Feds’ rage, but it was the right decision.

Tonight was so bad.

If you’re listening to me, if you’ve been listening to me the past 11 days, I’m telling you, however bad the other nights have been, however much you thought those nights sounded scary, they weren’t tonight. Tonight was its own thing, a category to itself.

She has an extensive series on the protests, in fact. You can learn that expired tear gas was fired, watch as she ponders discomfort, cringe as she reveals the Feds were poisoning the air, enjoy a few flowers, witness a police-induced stampede, dream about glitter, observe people getting tear gassed without warning, sigh as people fall short, see the change that happens when Portland gets national press coverage or when the Mayor is nearby, listen to a detailed account of police violence, rewind back to when she was first tear-gassed as well as a first set of photos from the protests. It’s well worth your time.

I know it may not seem that way. Click on the first link to her blog, and you’ll see I’m only getting around to sharing these links a month after they were written. Why on Earth would I link to stale news? Surely the protests stopped when the Feds pulled out?

The worst nights follow the same script: A large group takes to the streets calling for an end to police violence and systemic racism. A small fraction commits low-level crimes — often lighting small fires, graffiti-ing buildings and throwing fireworks or water bottles at officers. The police respond with force against the entire crowd.

Over the last month, demonstrators have been battered with batons as they left protests. Police have charged at crowds until they’re pushed deep into residential neighborhoods. Journalists have been shoved and arrested. Tear gas, while used more sparingly than in the early days of the protests, is threatened near nightly. And police regularly shut down protests by declaring them riots. That happened twice over the weekend, though police declined to intervene as far-right activists, some brandishing firearms, brawled with counter-protesters for hours on Saturday afternoon. […]

The mayor recognizes the problem with these scenes that play out on the streets of his city every night: non-violent protesters facing force as police respond to the misbehavior of a few. He just hasn’t found the answer.

“the weekend” referred to above is the weekend of August 22nd. The protests didn’t stop, we just stopped paying attention to them when the level of violence dropped to an “acceptable” level. As I type this, lawsuits are being launched against the US federal government over their behaviour in Portland. The events Crip Dyke documented continue to have resonance, and are due to be replicated elsewhere.

In fact it’ll probably happen this week. Jacob Blake was shot in the back seven times by the police of Kenosha, Wisconsin, as his three children watched on in horror. On day three of the protests against the incident, a gunman opened fire on peaceful protestors, killing two and wounding a third. By now, you shouldn’t be shocked at what happened next.

The apparent shooter, meanwhile, was seen on video walking away from the scene — his AR-style rifle clearly visible, his hands above his head. But Kenosha police who were responding to the reports of gunfire showed no interest in arresting or even questioning the man. Instead, they asked him for directions. “Is someone injured, straight ahead?” an officer asks him via loudspeaker. “Get out of the road,” said another.

He even approached an idling police car, going up close to the window, but then appeared to change his mind and walked away.

Brent Ford, 24, a photographer, witnessed the entire scene. “He had his hands up and they told him to get out of there, even though everyone was yelling that he was the shooter,” Ford told VICE News. “The police didn’t seem to hear or care what the crowd was saying.”

Yep, the police protected a murderer. After all, he was one of their own.

His connections to law enforcement, however, go beyond his vocal support of police on social media. In a statement to BuzzFeed News on Wednesday, the Grayslake Police Department confirmed that [the shooter] was a former member of the Lindenhurst, Grayslake, Hainesville Police Department’s Public Safety Cadet Program. According to a description that was recently removed from the department’s official website, the program “offers boys and girls the opportunity to explore a career in law enforcement” through “hands-on career activities,” such as riding along with officers on patrol and firearms training.

Along with the page describing the Public Safety Cadet Program, the organization’s official Facebook account was deleted after images from 2018 of a boy in a police uniform [resembling the shooter] began to circulate online.

Before he killed two people, he was apparently being thanked by the police for being there. Even as first-degree murder charges were announced against him, his actions were being obfuscated in order to make them easier to defend. And while I’m not aware of any Republican mounting an explicit defense, this is a party that celebrated two white people who brandished weapons against peaceful protesters, headed by a person who views all protesters as terrorists and fantasizes about torturing people he hates. They have innocent blood on their hands, and they’re likely to get a fresh coat of it.

We will NOT stand for looting, arson, violence, and lawlessness on American streets. My team just got off the phone with Governor Evers who agreed to accept federal assistance (Portland should do the same!) TODAY, I will be sending federal law enforcement and the National Guard to Kenosha, WI to restore LAW and ORDER!

Portland could easily become the new normal in the US. This makes Crip Dyke’s series all the more vital to read.

Fundraising Update 1

TL;DR: We’re pretty much on track, though we also haven’t hit the goal of pushing the fund past $78,890.69. Donate and help put the fund over the line!

With the short version out of the way, let’s dive into the details. What’s changed in the past week and change?

import datetime as dt

import matplotlib.pyplot as pl
import numpy as np
import pandas as pd
import pandas.tseries.offsets as pdto


cutoff_day = dt.datetime( 2020, 5, 27, tzinfo=dt.timezone(dt.timedelta(hours=-6)) )

donations = pd.read_csv('donations.cleaned.tsv',sep='\t')

donations['epoch'] = pd.to_datetime(donations['created_at'])
donations['delta_epoch'] = donations['epoch'] - cutoff_day
donations['delta_epoch_days'] = donations['delta_epoch'].apply(lambda x: x.days)

# some adjustment is necessary to line up with the current total
donations['culm'] = donations['amount'].cumsum() + 14723

new_donations_mask = donations['delta_epoch_days'] > 0
print( f"There have been {sum(new_donations_mask)} donations since {cutoff_day}." )
There have been 8 donations since 2020-05-27 00:00:00-06:00.

There’s been a reasonable number of donations after I published that original post. What does that look like, relative to the previous graph?

pl.figure(num=None, figsize=(8, 4), dpi=150, facecolor='w', edgecolor='k')

pl.plot( donations['delta_epoch_days'], donations['culm'], '-',c='#aaaaaa')
pl.plot( donations['delta_epoch_days'][new_donations_mask], \
        donations['culm'][new_donations_mask], '-',c='#0099ff')

pl.title("Defense against Carrier SLAPP Suit")

pl.xlabel("days since cutoff")
pl.ylabel("dollars")
pl.xlim( [-365.26,donations['delta_epoch_days'].max()] )
pl.ylim( [55000,82500] )
pl.show()

An updated chart from the past year. New donations are in blue.

That’s certainly an improvement in the short term, though the graph is much too zoomed out to say more. Let’s zoom in, and overlay the posterior.

# load the previously-fitted posterior
flat_chain = np.loadtxt('starting_posterior.csv')


pl.figure(num=None, figsize=(8, 4), dpi=150, facecolor='w', edgecolor='k')

x = np.array([0, donations['delta_epoch_days'].max()])
for m,_,_ in flat_chain:
    pl.plot( x, m*x + 78039, '-r', alpha=0.05 )
    
pl.plot( donations['delta_epoch_days'], donations['culm'], '-', c='#aaaaaa')
pl.plot( donations['delta_epoch_days'][new_donations_mask], \
        donations['culm'][new_donations_mask], '-', c='#0099ff')

pl.title("Defense against Carrier SLAPP Suit")

pl.xlabel("days since cutoff")
pl.ylabel("dollars")
pl.xlim( [-3,x[1]+1] )
pl.ylim( [77800,79000] )

pl.show()

A zoomed-in view of the new donations, with posteriors overlaid.

Hmm, looks like we’re right where the posterior predicted we’d be. My targets were pretty modest, though, consisting of increases of 3% and 10%, so this doesn’t mean they’ve been missed. Let’s extend the chart to day 16, and explicitly overlay the two targets I set out.

low_target = 78890.69
high_target = 78948.57
target_day = dt.datetime( 2020, 6, 12, 23, 59, tzinfo=dt.timezone(dt.timedelta(hours=-6)) )
target_since_cutoff = (target_day - cutoff_day).days

pl.figure(num=None, figsize=(8, 4), dpi=150, facecolor='w', edgecolor='k')

x = np.array([0, target_since_cutoff])
pl.fill_between( x, [78039, low_target], [78039, high_target], color='#ccbbbb', label='blog post')
pl.fill_between( x, [78039, high_target], [high_target, high_target], color='#ffeeee', label='video')

pl.plot( donations['delta_epoch_days'], donations['culm'], '-',c='#aaaaaa')
pl.plot( donations['delta_epoch_days'][new_donations_mask], \
        donations['culm'][new_donations_mask], '-',c='#0099ff')

pl.title("Defense against Carrier SLAPP Suit")

pl.xlabel("days since cutoff")
pl.ylabel("dollars")
pl.xlim( [-3, target_since_cutoff] )
pl.ylim( [77800,high_target] )

pl.legend(loc='lower right')
pl.show()

The previous graph, this time with targets overlaid.

To earn a blog post and video on Bayes from me, we need the line to be in the pink zone by the time it reaches the end of the graph. For just the blog post, it need only be in the grayish area. As you can see, it’s painfully close to being in line with the lower of the two goals, though if nobody donates between now and Friday it’ll obviously fall quite short.

So if you want to see that blog post, get donating!

Fundraising Target Number 1

If our goal is to raise funds for a good cause, we should at least have an idea of where the funds are at.

(Click here to show the code)
created_at amount epoch delta_epoch culm
0 2017-01-24T07:27:51-06:00 10.0 2017-01-24 07:27:51-06:00 -1218 days +19:51:12 14733.0
1 2017-01-24T07:31:09-06:00 50.0 2017-01-24 07:31:09-06:00 -1218 days +19:54:30 14783.0
2 2017-01-24T07:41:20-06:00 100.0 2017-01-24 07:41:20-06:00 -1218 days +20:04:41 14883.0
3 2017-01-24T07:50:20-06:00 10.0 2017-01-24 07:50:20-06:00 -1218 days +20:13:41 14893.0
4 2017-01-24T08:03:26-06:00 25.0 2017-01-24 08:03:26-06:00 -1218 days +20:26:47 14918.0

Changing the dataset so the last donation happens at time zero makes it both easier to fit the data and easier to understand what’s happening. The first day after the last donation is now day one.
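As a rough sketch of that prep step, here’s the idea with a two-row stand-in for the real TSV (the column names come from the table above; the actual code reads `donations.cleaned.tsv`):

```python
import pandas as pd

# A hypothetical two-row version of the donations table; the real data
# comes from donations.cleaned.tsv.
donations = pd.DataFrame({
    'created_at': ['2017-01-24T07:27:51-06:00', '2020-05-25T12:36:39-05:00'],
    'amount': [10.0, 25.0],
})

# utc=True sidesteps pandas' complaints about mixed UTC offsets
donations['epoch'] = pd.to_datetime(donations['created_at'], utc=True)

# shift the time axis so the most recent donation sits at day zero
last_donation = donations['epoch'].max()
donations['delta_epoch'] = donations['epoch'] - last_donation
donations['delta_epoch_days'] = donations['delta_epoch'].apply(lambda x: x.days)

print(donations['delta_epoch_days'].tolist())  # [-1218, 0]
```

Every donation before the last one gets a negative day count, which is why the table above shows offsets like “-1218 days”.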

Donations from 2017 don’t tell us much about the current state of the fund, though, so let’s focus on just the last year.

(Click here to show the code)

The last year of donations, for the lawsuit fundraiser.

The donations seem to arrive in bursts, but there have been two quiet portions. One is thanks to the current pandemic, and the other was during last year’s late spring/early summer. It’s hard to tell what the donation rate is just by eyeballing it, though. We need to smooth this out via a model.
The simplest such model is linear regression, a.k.a. fitting a line. We want to incorporate uncertainty into the mix, which means a Bayesian fit. Now, what MCMC engine to use, hmmm… emcee is my overall favourite, but I’m much too reliant on it. I’ve used PyMC3 a few times with success, but recently it’s been acting flaky. Time to pull out the big guns: Stan. I’ve been avoiding it because pystan’s compilation times drove me nuts, but all the cool kids switched to cmdstanpy while I wasn’t looking. Let’s give that a whirl.

(Click here to show the code)
CPU times: user 5.33 ms, sys: 7.33 ms, total: 12.7 ms
Wall time: 421 ms
CmdStan installed.

We can’t fit to the entire three-year time sequence; that just wouldn’t be fair, given the recent slump in donations. How about the last six months? That covers both a few donation bursts and a flat period, so it’s more in line with what we’d expect in the future.
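The hidden filter presumably boils down to something like this sketch, with made-up data and the `delta_epoch_days` column from the prep step standing in for the real table:

```python
import pandas as pd

# hypothetical donations, with day offsets relative to the last donation
donations = pd.DataFrame({
    'delta_epoch_days': [-800, -400, -150, -90, -30, 0],
    'amount': [10.0, 20.0, 5.0, 50.0, 25.0, 100.0],
})

# keep roughly the last six months (half a Julian year)
six_months = -365.25 / 2
recent = donations[donations['delta_epoch_days'] >= six_months]
print(f"There were {len(recent)} donations over the last six months.")
```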

(Click here to show the code)
There were 117 donations over the last six months.

With the data prepped, we can shift to building the linear model.

(Click here to show the code)

I could have just gone with Stan’s basic model, but flat priors aren’t my style. My preferred prior for the slope is uniform over the line’s angle, the inverse tangent of the slope, as it compensates for the tendency of large slope values to “bunch up” on one another. Stan doesn’t offer that by default, but a Cauchy distribution on the slope amounts to the same thing.
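In fact, a uniform prior on the angle θ = arctan(m) induces exactly a standard Cauchy density on the slope m: the angle’s density is 1/π on (-π/2, π/2), and the change of variables contributes a factor of dθ/dm = 1/(1+m²). A quick numerical check (standard scale assumed):

```python
import numpy as np
from scipy.stats import cauchy

# Uniform prior on theta = arctan(m) has density 1/pi; changing variables
# to the slope m = tan(theta) gives
#   p(m) = (1/pi) * d(arctan m)/dm = 1 / (pi * (1 + m**2)),
# which is the standard Cauchy density.
m = np.linspace(-10, 10, 201)
induced = 1.0 / (np.pi * (1.0 + m**2))
print(np.allclose(induced, cauchy.pdf(m)))  # True
```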

We’d like the standard deviation to skew towards smaller values. It naturally tends to minimize itself when maximizing the likelihood, but an explicit skew will encourage this process along. Gelman and the Stan crew are drifting towards normal priors, but I still like a Cauchy prior for its weird properties.

Normally I’d plunk the Gaussian distribution in to handle divergence from the deterministic model, but I hear using Student’s T instead will cut down the influence of outliers. Thomas Wiecki recommends one degree of freedom, but Gelman and co. find that it leads to poor convergence in some cases. They recommend somewhere between three and seven degrees of freedom, but skew towards three, so I’ll go with the flow here.

The y-intercept could land pretty much anywhere, making its prior difficult to figure out. Yes, I’ve adjusted the time axis so that the last donation is at time zero, but the recent flat portion pretty much guarantees the y-intercept will be higher than the current amount of funds. The traditional approach is to use a flat prior for the intercept, and I can’t think of a good reason to ditch that.
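Putting those choices together, the hidden model is presumably something like the following sketch. The Cauchy scales here are my own assumptions, not necessarily the ones actually used, and the flat prior on the intercept comes from simply not giving it a prior statement:

```stan
data {
  int<lower=1> N;
  vector[N] x;              // days since the last donation
  vector[N] y;              // cumulative dollars raised
}
parameters {
  real m;                   // slope, in dollars per day
  real b;                   // y-intercept; no prior statement = flat prior
  real<lower=0> sigma;      // scatter around the line
}
model {
  m ~ cauchy(0, 1);         // roughly uniform over the line's angle
  sigma ~ cauchy(0, 1);     // half-Cauchy, thanks to the <lower=0> constraint
  y ~ student_t(3, m * x + b, sigma);   // 3 dof to blunt outliers
}
```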

Not convinced I picked good priors? That’s cool, there should be enough data here that the priors have minimal influence anyway. Moving on, let’s see how long compilation takes.

(Click here to show the code)
CPU times: user 4.91 ms, sys: 5.3 ms, total: 10.2 ms
Wall time: 20.2 s

This is one area where emcee really shines: as a pure Python library, it has zero compilation time. Both PyMC3 and Stan need to fire up an external compiler, which adds overhead. Twenty seconds isn’t too bad, though, especially if it leads to quick sampling times.

(Click here to show the code)
CPU times: user 14.7 ms, sys: 24.7 ms, total: 39.4 ms
Wall time: 829 ms

And it does! emcee can be pretty zippy for a simple linear regression, but Stan is in another class altogether. PyMC3 floats somewhere between the two, in my experience.

Another great feature of Stan is its built-in diagnostics. They’re really handy for confirming the posterior converged, and if it didn’t, they can give you tips on what’s wrong with the model.

(Click here to show the code)
Processing csv files: /tmp/tmpyfx91ua9/linear_regression-202005262238-1-e393mc6t.csv, /tmp/tmpyfx91ua9/linear_regression-202005262238-2-8u_r8umk.csv, /tmp/tmpyfx91ua9/linear_regression-202005262238-3-m36dbylo.csv, /tmp/tmpyfx91ua9/linear_regression-202005262238-4-hxjnszfe.csv

Checking sampler transitions treedepth.
Treedepth satisfactory for all transitions.

Checking sampler transitions for divergences.
No divergent transitions found.

Checking E-BFMI - sampler transitions HMC potential energy.
E-BFMI satisfactory for all transitions.

Effective sample size satisfactory.

Split R-hat values satisfactory all parameters.

Processing complete, no problems detected.

The odds of a simple model with plenty of datapoints going sideways are pretty small, so this is another non-surprise. Enough waiting, though, let’s see the fit in action. First, we need to extract the posterior from the stored variables …

(Click here to show the code)
There are 256 samples in the posterior.

… and now free of its prison, we can plot the posterior against the original data. I’ll narrow the time window slightly, to make it easier to focus on the fit.

(Click here to show the code)

The same graph as before, but now slightly zoomed in on and with trendlines visible.

Looks like a decent fit to me, so we can start using it to answer a few questions. How much money is flowing into the fund each day, on average? How many years will it be until all those legal bills are paid off? Since humans aren’t good at counting in years, let’s also translate that number into a specific date.
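The hidden code presumably amounts to something like this sketch, with a handful of made-up posterior samples and a made-up outstanding balance standing in for the real values:

```python
import datetime as dt
import numpy as np

# Hypothetical posterior samples of the slope, in dollars per day; the real
# numbers come from the fitted Stan posterior. The outstanding balance is
# also a made-up figure.
slope_samples = np.array([49.0, 51.0, 52.0, 54.0])
remaining = 37000.0   # hypothetical dollars still owed

now = dt.datetime(2020, 5, 25, 12, 36, 39,
                  tzinfo=dt.timezone(dt.timedelta(hours=-5)))

# each posterior sample implies its own payoff horizon
days_to_payoff = remaining / slope_samples
years = days_to_payoff / 365.25
print(f"mean/std/median years to pay off = "
      f"{years.mean():.3f}/{years.std():.3f}/{np.median(years):.3f}")

median_date = now + dt.timedelta(days=float(np.median(days_to_payoff)))
print("median estimate for paying off debt =", median_date)
```

Pushing the whole posterior through the calculation, rather than just a point estimate, is what lets the uncertainty in the slope carry over into uncertainty in the payoff date.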

(Click here to show the code)
mean/std/median slope = $51.62/1.65/51.76 per day

mean/std/median years to pay off the legal fees, relative to 2020-05-25 12:36:39-05:00 =
	1.962/0.063/1.955

mean/median estimate for paying off debt =
	2022-05-12 07:49:55.274942-05:00 / 2022-05-09 13:57:13.461426-05:00

Mid-May 2022, eh? That’s… not ideal. How much time can we shave off, if we increase the donation rate? Let’s play out a few scenarios.

(Click here to show the code)
median estimate for paying off debt, increasing rate by   1% = 2022-05-02 17:16:37.476652800
median estimate for paying off debt, increasing rate by   3% = 2022-04-18 23:48:28.185868800
median estimate for paying off debt, increasing rate by  10% = 2022-03-05 21:00:48.510403200
median estimate for paying off debt, increasing rate by  30% = 2021-11-26 00:10:56.277984
median estimate for paying off debt, increasing rate by 100% = 2021-05-17 18:16:56.230752

Bumping up the donation rate by one percent is pitiful. A three percent increase will shave off almost a month, which is just barely worthwhile, and a ten percent increase will roll the date forward by two months. Those sound like good starting points, so let’s make them official: increase the current donation rate by three percent, and I’ll start pumping out the aforementioned blog posts on Bayesian statistics. Manage to increase it by 10%, and I’ll also record them as videos.

As implied, I don’t intend to keep the same rate throughout this entire process. If you surprise me with your generosity, I’ll bump up the rate. By the same token, though, if we go through a dry spell I’ll decrease the rate so the targets are easier to hit. My goal is to have at least a 50% success rate on that lower bar. Wouldn’t that make it impossible to hit the video target? Remember, though, it’ll take some time to determine the success rate. That lag should make it possible to blow past the target, and by the time this becomes an issue I’ll have thought of a better fix.

Ah, but over what timeframe should this rate increase? We could easily blow past the three percent target if someone donates a hundred bucks tomorrow, after all, and it’s no fair to announce this and hope your wallets are ready to go in an instant. How about… sixteen days. You’ve got sixteen days to hit one of those rate targets. That’s a nice round number, for a computer scientist, and it should (hopefully!) give me just enough time to whip up the first post. What does that goal translate to, in absolute numbers?
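The arithmetic behind those dollar targets is simple: bump every posterior slope sample up by the percentage, project sixteen days forward, and take the median. A sketch with hypothetical samples (the post’s actual figures come from the full fitted posterior):

```python
import numpy as np

# hypothetical posterior slope samples, in dollars per day
slope_samples = np.array([50.5, 51.6, 51.8, 52.9])
current_total = 78039.00
days = 16

for pct in (0.03, 0.10):
    # raise each sample's rate by pct, project 16 days out, take the median
    bump = np.median(slope_samples * (1 + pct) * days)
    print(f"a {pct:4.0%} increase over {days} days translates to "
          f"${bump:.2f} + ${current_total:.2f} = ${bump + current_total:.2f}")
```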

(Click here to show the code)
a   3% increase over 16 days translates to $851.69 + $78039.00 = $78890.69

Right, if you want those blog posts to start flowing you’ve got to get that fundraiser total to $78,890.69 before June 12th. As for the video…

(Click here to show the code)
a  10% increase over 16 days translates to $909.57 + $78039.00 = $78948.57

… you’ve got to hit $78,948.57 by the same date.

Ready? Set? Get donating!

It’s Payback Time

I’m back! Yay! Sorry about all that, but my workload was just ridiculous. Things should be a lot more slack for the next few months, so it’s time I got back blogging. This also means I can finally put into action something I’ve been sitting on for months.

Richard Carrier has been a sore spot for me. He was one of the reasons I got interested in Bayesian statistics, and for a while there I thought he was a cool progressive. Alas, when it was revealed he was instead a vindictive creepy asshole, it shook me a bit. I promised myself I’d help out somehow, but I’d already done the obsessive analysis thing and in hindsight I’m not convinced it did more good than harm. I was at a loss for what I could do, beyond sharing links to the fundraiser.

Now, I think I know. The lawsuits may be long over, thanks to Carrier coincidentally dropping them at roughly the same time he came under threat of a counter-suit, but the legal bills are still there and not going away anytime soon. Worse, with the removal of the threat, people are starting to forget about those debts. There have been only five donations this month, and four in April. It’s time to bring a little attention back that way.

One nasty side-effect of Carrier’s lawsuits is that Bayesian statistics has become a punchline in the atheist/skeptic community. The reasoning is understandable, if flawed: Carrier is a crank, he promotes Bayesian statistics, ergo Bayesian statistics must be the tool of crackpots. This has been surreal for me to witness, as Bayes has become a critical tool in my kit over the last three years. I suppose I could survive without it, if I had to, but every alternative I’m aware of is worse. I’m not the only one in this camp, either.

Following the emergence of a novel coronavirus (SARS-CoV-2) and its spread outside of China, Europe is now experiencing large epidemics. In response, many European countries have implemented unprecedented non-pharmaceutical interventions including case isolation, the closure of schools and universities, banning of mass gatherings and/or public events, and most recently, widescale social distancing including local and national lockdowns. In this report, we use a semi-mechanistic Bayesian hierarchical model to attempt to infer the impact of these interventions across 11 European countries.

Flaxman, Seth, Swapnil Mishra, Axel Gandy, H Juliette T Unwin, Helen Coupland, Thomas A Mellan, Tresnia Berah, et al. “Estimating the Number of Infections and the Impact of Non-Pharmaceutical Interventions on COVID-19 in 11 European Countries,” 2020, 35.

In estimating time intervals between symptom onset and outcome, it was necessary to account for the fact that, during a growing epidemic, a higher proportion of the cases will have been infected recently (…). Therefore, we re-parameterised a gamma model to account for exponential growth using a growth rate of 0·14 per day, obtained from the early case onset data (…). Using Bayesian methods, we fitted gamma distributions to the data on time from onset to death and onset to recovery, conditional on having observed the final outcome.

Verity, Robert, Lucy C. Okell, Ilaria Dorigatti, Peter Winskill, Charles Whittaker, Natsuko Imai, Gina Cuomo-Dannenburg, et al. “Estimates of the Severity of Coronavirus Disease 2019: A Model-Based Analysis.” The Lancet Infectious Diseases 0, no. 0 (March 30, 2020). https://doi.org/10.1016/S1473-3099(20)30243-7.

we used Bayesian methods to infer parameter estimates and obtain credible intervals.

Linton, Natalie M., Tetsuro Kobayashi, Yichi Yang, Katsuma Hayashi, Andrei R. Akhmetzhanov, Sung-mok Jung, Baoyin Yuan, Ryo Kinoshita, and Hiroshi Nishiura. “Incubation Period and Other Epidemiological Characteristics of 2019 Novel Coronavirus Infections with Right Truncation: A Statistical Analysis of Publicly Available Case Data.” Journal of Clinical Medicine 9, no. 2 (February 2020): 538. https://doi.org/10.3390/jcm9020538.

A significant chunk of our understanding of COVID-19 depends on Bayesian statistics. I’ll go further and argue that you cannot fully understand this pandemic without it. And yet thanks to Richard Carrier, the atheist/skeptic community is primed to dismiss Bayesian statistics.

So let’s catch two stones with one bird. If enough people donate to this fundraiser, I’ll start blogging a course on Bayesian statistics. I think I’ve got a novel angle on the subject, one that’s easier to slip into than my 201-level stuff and yet more rigorous. If y’all really start tossing in the funds, I’ll make it a video series. Yes yes, there’s a pandemic and potential global depression going on, but that just means I’ll work for cheap! I’ll release the milestones and course outline over the next few days, but there’s no harm in an early start.

Help me help the people Richard Carrier hurt. I’ll try to make it worth your while.