## Rat Tumors and PyMC3

**Published:**

I’m very proud to say that I contributed this example to PyMC3’s documentation. It details how to compute posterior means for Gelman’s rat tumor example in BDA3.

**Published:**

Today was a fairly easy challenge. Part one provides us with a 2-D array of integers and asks us to find the sum of the differences between the largest and smallest numbers in each row. Super easy to do without loops if you know how to use numpy.
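A minimal numpy sketch of that idea; the grid values here are made up for illustration:

```python
import numpy as np

# Hypothetical input grid (values invented for illustration)
grid = np.array([[5, 1, 9, 5],
                 [7, 5, 3, 3],
                 [2, 4, 6, 8]])

# Row-wise max minus row-wise min, summed -- no explicit loops needed
checksum = (grid.max(axis=1) - grid.min(axis=1)).sum()
print(checksum)  # (9-1) + (7-3) + (8-2) = 18
```

numpy even has a dedicated "peak to peak" reduction, so `np.ptp(grid, axis=1).sum()` gives the same answer.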

**Published:**

Back in September 2017, I was getting tired of learning about Maximum Likelihood and yearned to do some more machine learning. For the longest time I had wanted to scrape my gym’s Twitter account to find out when the gym was busiest.

**Published:**

I was recently asked to make 4 plots for a collaborator. The plots are all the same, just a scatter plot and a non-linear trend line. Every time I have to do something repetitive, I wince, *especially with respect to plots*. I thought I would take this opportunity to write a short blog post on how to use functional programming in R to make the same plot for similar yet different data.

**Published:**

I love FiveThirtyEight’s Riddler column. Usually I can solve the problem with computation, but on some rare occasions I can do some interesting math to get the solution without having to code. Here is the first puzzle I ever solved. It is a simple puzzle, yet it has both an elegant computational solution and an elegant analytic one. Let’s take a look.

**Published:**

When I was doing my Masters, I had to generate a lot of plots, which means I had to generate a lot of data. Usually, the data I would be generating would depend on a parameter (maybe something like the rolling window length, or maybe the bandwidth for some smoothing function) and I would have to try a whirlwind of combinations. What I would end up doing is writing the code to generate the data once, then just looping over that code for different values of the parameters.
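In Python, that pattern might look something like the sketch below; `rolling_mean`, the input series, and the window lengths are hypothetical stand-ins for whatever the actual data generation was:

```python
# Hypothetical data-generating function: a rolling mean whose
# output depends on a single parameter, the window length.
def rolling_mean(series, window):
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

series = list(range(10))

# Write the generation code once, then loop over it for
# different parameter values.
results = {w: rolling_mean(series, w) for w in (2, 3, 5)}
```

Each entry of `results` is then one data set, ready to be handed off to whatever plotting code comes next.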

**Published:**

Let me ask you a question: considering that logistic regression can be performed without a penalty parameter, why does sklearn include a penalty in its implementation of logistic regression? I think most people would reply with something about overfitting, which I suppose is a reasonable answer, but it isn’t very satisfying, especially since the documentation for `sklearn.linear_model.LogisticRegression()` is awash with optimization terminology and never mentions overfitting.
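For context, sklearn’s user guide frames L2-penalized logistic regression as an optimization problem along these lines (writing $w$ for the coefficients, $c$ for the intercept, and $C$ for the inverse regularization strength):

```latex
\min_{w,\, c} \;\; \frac{1}{2} w^{T} w
  \;+\; C \sum_{i=1}^{n} \log\!\left(1 + \exp\!\left(-y_i \left(x_i^{T} w + c\right)\right)\right)
```

Note that `C` multiplies the loss term rather than the penalty, so a *smaller* `C` means *stronger* regularization.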
