Cross Validated question, asked by Learning stats by example on October 15, 2020
I’m trying to understand how I might model count data where there’s diminishing marginal utility within a stochastic process.

So, let’s say we’re modeling the number of “useful intelligence tips” given to us by Russian agents as a function of how many shots of vodka we give them. I want to show that past a certain point, the number of useful tips starts to decay.

The way I’d approach this is by making the rate parameter lambda vary dynamically with time t.
So at time t = 1 we might draw rpois(1, 10): n pieces of intel, Poisson distributed with mean 10. But let’s say we start to see decay, so that by time t = 100 only 2 pieces of intel are returned on average. We can then model time t = 100 with rpois(1, 2).
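As a quick sanity check in R (the seed and the number of replicate draws are just my own choices for illustration):

set.seed(123)
mean(rpois(1e4, lambda = 10))   # approximately 10: the rate at t = 1
mean(rpois(1e4, lambda = 2))    # approximately 2: the rate at t = 100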
My thinking on this is that we build a vector so that lambda is dynamic and varies at each point t:

lambda_t[t] <- t * rexp(1, rate = z)

We then just need to solve for z and draw the counts:

for (t in 1:T) Y[t] <- rpois(1, lambda_t[t])
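For concreteness, here’s a minimal runnable R sketch of one version of what I have in mind. I’m assuming an exponential-decay form lambda(t) = lambda_1 * exp(-z * (t - 1)), so that plugging in lambda(1) = 10 and lambda(100) = 2 pins down z; the functional form and variable names are just my guesses, not the only choice:

set.seed(42)

T_max    <- 100
lambda_1 <- 10    # mean tips at t = 1
lambda_T <- 2     # mean tips at t = 100

# solve lambda_1 * exp(-z * (T_max - 1)) = lambda_T for the decay rate z
z <- log(lambda_1 / lambda_T) / (T_max - 1)

# deterministic decaying rate, then one Poisson draw per time point
lambda_t <- lambda_1 * exp(-z * ((1:T_max) - 1))
Y <- rpois(T_max, lambda = lambda_t)

plot(1:T_max, Y, type = "h", xlab = "t (shots of vodka)", ylab = "useful tips")
lines(1:T_max, lambda_t, col = "red", lwd = 2)   # the underlying rate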
Does this approach make sense? I’ve written it in pseudo-code mixing a bit of Stan syntax and R syntax, as I imagine I’ll end up using both. But I’m really curious how count data is generally modeled when the counts vary over t.
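For instance, would something like Poisson regression with a log link, log(lambda_t) = a + b * t, be the standard framing here? That is exactly an exponential trend in the rate, and fitting it to data simulated as above should recover the decay rate (this sketch and its assumed functional form are my own, for illustration):

set.seed(42)
t <- 1:100
z <- log(10 / 2) / 99                        # decay rate implied by the two anchor points
Y <- rpois(100, lambda = 10 * exp(-z * (t - 1)))

# Poisson GLM with a log link: log(lambda_t) = a + b * t
fit <- glm(Y ~ t, family = poisson)
coef(fit)                                    # slope should be close to -z (about -0.0163)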