/*
  Test of observe_abc in Picat.

  Testing observe_abc/2 for some larger parameter-recovery problems. The
  main thing I want to check is how slow it is when recovering parameters
  from larger datasets, e.g. 1_000 or even 10_000 sized datasets. Well,
  it's not _very_ slow, and a minute or two is not that bad.

  It should be noted that observe_abc/2-4 does not use any fancy algorithm,
  and is definitely not in the realm of PPL systems such as Stan, PyMC, etc.
  Having said that, it might be useful anyway, at least for experimenting.

  This program was created by Hakan Kjellerstrand, hakank@gmail.com
  See also my Picat page: http://www.hakank.org/picat/

*/

import ppl_distributions, ppl_utils.
import util.
% import ordset.

main => go.

/*
  Normal(100,15) with uninformed priors: 1..1000 for Mu and 1..100 for Sigma.

  * Data len: 100, with min_accepted_samples=1000
    11.3s
    ...
    Num accepted samples: 998 Total samples: 111082 (0.009%)
    Num accepted samples: 999 Total samples: 111094 (0.009%)
    Num accepted samples: 1000 Total samples: 111171 (0.009%)

    var : mu
    Probabilities (truncated):
    118.295208123650042: 0.0010000000000000
    117.377968881920907: 0.0010000000000000
    116.321215145905128: 0.0010000000000000
    116.060514865005629: 0.0010000000000000
    .........
    79.638354528061285: 0.0010000000000000
    78.26166389568786: 0.0010000000000000
    75.395747156998027: 0.0010000000000000
    75.18316764206773: 0.0010000000000000
    mean = 98.5819

    var : sigma
    Probabilities (truncated):
    36.308452172348488: 0.0010000000000000
    35.738887719688421: 0.0010000000000000
    35.685100414177917: 0.0010000000000000
    34.756636676730885: 0.0010000000000000
    .........
    1.126654441993057: 0.0010000000000000
    1.098836912819574: 0.0010000000000000
    1.077783935739558: 0.0010000000000000
    1.063272614992816: 0.0010000000000000
    mean = 15.8068

  * Data len: 1000, with min_accepted_samples=100
    12.2s
    Num accepted samples: 97 Total samples: 11556 (0.008%)
    Num accepted samples: 98 Total samples: 11788 (0.008%)
    Num accepted samples: 99 Total samples: 11885 (0.008%)
    Num accepted samples: 100 Total samples: 12075 (0.008%)

    var : mu
    Probabilities (truncated):
    115.479509252812491: 0.0100000000000000
    115.19437829320988: 0.0100000000000000
    114.789023106353838: 0.0100000000000000
    113.673611458238966: 0.0100000000000000
    .........
    88.100889587821854: 0.0100000000000000
    87.99886706983618: 0.0100000000000000
    86.826095870615035: 0.0100000000000000
    86.152063693456384: 0.0100000000000000
    mean = 100.837

    var : sigma
    Probabilities (truncated):
    29.875724513025826: 0.0100000000000000
    29.684012169336903: 0.0100000000000000
    29.209358667586631: 0.0100000000000000
    28.668061638562037: 0.0100000000000000
    .........
    3.034089174137492: 0.0100000000000000
    2.911357328720091: 0.0100000000000000
    2.650945745711655: 0.0100000000000000
    1.59423705404356: 0.0100000000000000
    mean = 16.723

  * Data len: 10000, with min_accepted_samples=100
    1min33s
    ...
    Num accepted samples: 98 Total samples: 9505 (0.010%)
    Num accepted samples: 99 Total samples: 9686 (0.010%)
    Num accepted samples: 100 Total samples: 9886 (0.010%)

    var : mu
    Probabilities (truncated):
    115.003367108294441: 0.0100000000000000
    114.637507030105922: 0.0100000000000000
    114.553096353333956: 0.0100000000000000
    114.30072633051347: 0.0100000000000000
    .........
    87.130679450524354: 0.0100000000000000
    86.868737097302798: 0.0100000000000000
    86.843063415886405: 0.0100000000000000
    85.438650380558641: 0.0100000000000000
    mean = 102.238

    var : sigma
    Probabilities (truncated):
    29.780497820014368: 0.0100000000000000
    29.117487258798207: 0.0100000000000000
    28.819990477440875: 0.0100000000000000
    28.603285641224723: 0.0100000000000000
    .........
    2.441798251327965: 0.0100000000000000
    2.114546660852873: 0.0100000000000000
    1.740066478373514: 0.0100000000000000
    1.628718728492371: 0.0100000000000000
    mean = 15.2549

  * Data len: 10_000, with min_accepted_samples=10 (!)
    Num accepted samples: 1 Total samples: 16 (0.062%)
    Num accepted samples: 2 Total samples: 44 (0.045%)
    Num accepted samples: 3 Total samples: 58 (0.052%)
    Num accepted samples: 4 Total samples: 351 (0.011%)
    Num accepted samples: 5 Total samples: 825 (0.006%)
    Num accepted samples: 6 Total samples: 917 (0.007%)
    Num accepted samples: 7 Total samples: 1052 (0.007%)
    Num accepted samples: 8 Total samples: 1105 (0.007%)
    Num accepted samples: 9 Total samples: 1121 (0.008%)
    Num accepted samples: 10 Total samples: 1217 (0.008%)

    var : mu
    Probabilities:
    109.906: 0.1000000000000000
    109.755: 0.1000000000000000
    109.463: 0.1000000000000000
    105.638: 0.1000000000000000
    99.2253: 0.1000000000000000
    98.9952: 0.1000000000000000
    97.9942: 0.1000000000000000
    96.5837: 0.1000000000000000
    94.7001: 0.1000000000000000
    88.6765: 0.1000000000000000
    mean = 101.094

    var : sigma
    Probabilities:
    28.7841: 0.1000000000000000
    24.1674: 0.1000000000000000
    21.7351: 0.1000000000000000
    20.9033: 0.1000000000000000
    11.7665: 0.1000000000000000
    11.2827: 0.1000000000000000
    10.9395: 0.1000000000000000
    8.56579: 0.1000000000000000
    6.33055: 0.1000000000000000
    5.94868: 0.1000000000000000
    mean = 15.0424
*/
go ?=>
  N = 100,
  % N = 1000,
  % N = 10000,
  Data = normal_dist_n(100,15,N),
  println(data=Data),
  reset_store,
  run_model(100,$model(Data),[show_probs_trunc,mean,
                              % show_percentiles,show_histogram,
                              % show_hpd_intervals,hpd_intervals=[0.84],
                              min_accepted_samples=100
                              % ,show_accepted_samples=true
                             ]),
  nl,
  % show_store_lengths,nl,
  % fail,
  nl.
go => true.
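/*
  As an aside, here is a conceptual sketch (in Python, since it is easier
  to read standalone) of the plain rejection-ABC scheme that observe_abc/2
  corresponds to: sample the priors, simulate a dataset of the same size,
  and accept the parameters when the summary statistics (by default mean
  and stdev) of the simulated data are close enough to those of the
  observed data. That the tolerance scales the data's stdev is an
  assumption here, based on the "manual" model3b version further down.

```python
# Hedged sketch of rejection ABC for recovering Normal(mu, sigma).
# This mirrors model/1 below conceptually; it is not the Picat API.
import random
import statistics

def abc_normal(data, n_accept, tol):
    d_mean = statistics.mean(data)
    d_sd = statistics.stdev(data)
    accepted = []
    total = 0
    while len(accepted) < n_accept:
        total += 1
        mu = random.uniform(1, 1000)      # uninformed priors, as in model/1
        sigma = random.uniform(1, 100)
        y = [random.gauss(mu, sigma) for _ in data]
        # Accept when the simulated summary statistics land within
        # tol * stdev of the observed ones
        if (abs(statistics.mean(y) - d_mean) <= tol * d_sd and
                abs(statistics.stdev(y) - d_sd) <= tol * d_sd):
            accepted.append((mu, sigma))
    return accepted, total

random.seed(1)
data = [random.gauss(100, 15) for _ in range(100)]
post, total = abc_normal(data, n_accept=50, tol=1.0)
mu_hat = statistics.mean(m for m, _ in post)
sigma_hat = statistics.mean(s for _, s in post)
```

  The acceptance rate drops quickly as the tolerance shrinks, which is why
  the runs above need on the order of a hundred prior samples per accepted
  sample.
*/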
model(Data) =>
  Len = Data.len,
  % Uninformed priors
  Mu = uniform(1,1000),
  Sigma = uniform_dist(1,100),
  Y = normal_dist_n(Mu,Sigma,Len),
  observe_abc(Data,Y,1/2), % default statistics: mean, stdev
  % Experiment using observe_abc/4:
  % observe_abc(Data,Y,1,[$q(1),$q(99),$central_moment(1),range,iqr,mad]),
  Post = normal_dist(Mu,Sigma),
  if observed_ok then
    % println(y=Y),
    add("mu",Mu),
    add("sigma",Sigma),
    add("post",Post)
  end.

/*
  Testing binomial_dist(10,1/7).

  Note: when using Mathematica's
    FindDistributionParameters[data, BinomialDistribution[n, p]]
  it complains that
    "The likelihood of BinomialDistribution[n,p] with given data does not
     attain its maximum for finite values of n."
  so perhaps it's not that strange that Picat PPL is not very good at this
  either.

  * Data len: 100, min_accepted_samples=100, with uninformed priors
    N in 1..1000 and P in 0..1; it does not go so well.
    45.9s
    ...
    Num accepted samples: 98 Total samples: 5167 (0.019%)
    Num accepted samples: 99 Total samples: 5226 (0.019%)
    Num accepted samples: 100 Total samples: 5246 (0.019%)

    var : N
    Probabilities (truncated):
    991.258208005809365: 0.0100000000000000
    910.709350864267662: 0.0100000000000000
    895.086925724096091: 0.0100000000000000
    808.482504073289419: 0.0100000000000000
    .........
    1.680946379751408: 0.0100000000000000
    1.636659290938014: 0.0100000000000000
    1.195663606373436: 0.0100000000000000
    1.051956303907538: 0.0100000000000000
    mean = 155.315

    var : P
    Probabilities (truncated):
    0.90635427605155: 0.0100000000000000
    0.844446366079802: 0.0100000000000000
    0.821077699564496: 0.0100000000000000
    0.780135940536507: 0.0100000000000000
    .........
    0.000734132667014: 0.0100000000000000
    0.000650887968311: 0.0100000000000000
    0.000483421187471: 0.0100000000000000
    0.000013939410273: 0.0100000000000000
    mean = 0.133919

  * Data len: 20, min_accepted_samples=1000, and with the prior for N
    reduced to 1..100:
    Num accepted samples: 1000 Total samples: 8581 (0.117%)

    var : N
    Probabilities (truncated):
    99.584194805279466: 0.0010000000000000
    99.423506693180428: 0.0010000000000000
    99.289799939975978: 0.0010000000000000
    98.692423075294315: 0.0010000000000000
    .........
    1.049094139621171: 0.0010000000000000
    1.021495126663472: 0.0010000000000000
    1.017393429771715: 0.0010000000000000
    1.011620176030146: 0.0010000000000000
    mean = 23.0879

    var : P
    Probabilities (truncated):
    0.996943216749321: 0.0010000000000000
    0.990526555678761: 0.0010000000000000
    0.984901493020361: 0.0010000000000000
    0.94026476651011: 0.0010000000000000
    .........
    0.000341834189003: 0.0010000000000000
    0.000243349177012: 0.0010000000000000
    0.000210174242589: 0.0010000000000000
    0.000185452535281: 0.0010000000000000
    mean = 0.190155
*/
go2 ?=>
  N = 40,
  P = 1/7,
  Dist = $binomial_dist_n(10,P,N),
  println(dist=Dist),
  Data = Dist.apply,
  println(data=Data),
  println([n=N,mean=Data.mean,stdev=Data.stdev]),
  reset_store,
  run_model(10_000,$model2(Data),[show_probs_trunc,mean,
                                  % show_percentiles,show_histogram,
                                  % show_hpd_intervals,hpd_intervals=[0.84],
                                  min_accepted_samples=1000,show_accepted_samples=true
                                 ]),
  nl,
  % show_store_lengths,nl,
  % fail,
  nl.
go2 => true.

model2(Data) =>
  Len = Data.len,
  % Uninformed priors
  N = uniform(1,100),
  P = beta_dist(1,1),
  Y = binomial_dist_smart_n(N,P,Len),
  observe_abc(Data,Y,1/4),
  if observed_ok then
    add("N",N),
    add("P",P)
  end.
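/*
  A note on why N and P are so hard to recover here: the first two moments
  of Binomial(N,P) are mean = N*P and variance = N*P*(1-P), and as P -> 0
  with N*P held fixed the distribution approaches Poisson(N*P), whose mean
  and variance do not identify N at all. So summary-statistics-based ABC
  (like maximum likelihood, per the Mathematica warning above) sits on a
  near-flat ridge in N. A quick check of the moments, in Python:

```python
# Method-of-moments view of the binomial identifiability problem.
def binom_moments(n, p):
    """Mean and variance of Binomial(n, p)."""
    return n * p, n * p * (1 - p)

m1, v1 = binom_moments(10, 1/7)    # the true parameters used in go2
m2, v2 = binom_moments(100, 1/70)  # 10x larger n, 10x smaller p
# Identical means (10/7); the variances (~1.22 vs ~1.41) barely differ,
# so mean/stdev summaries hardly separate these two hypotheses.
```
*/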
/*
  Testing poisson_dist(13).

  * Data len: 100, min_accepted_samples=1000
    12.3s
    data = [20,10,19,7,18,22,15,14,9,12,11,7,18,21,14,11,14,11,10,22,15,14,10,16,11,12,11,15,12,14,9,16,11,2,16,29,16,11,13,25,14,10,20,11,9,12,14,10,11,13,13,18,4,10,14,12,13,12,7,12,21,12,10,17,13,15,12,8,15,11,11,15,14,9,16,11,14,10,12,14,17,8,9,15,20,13,12,15,11,8,12,9,16,10,18,5,11,7,16,9]
    [n = 100,mean = 13.03,stdev = 4.30919]
    ...
    Num accepted samples: 998 Total samples: 11907 (0.084%)
    Num accepted samples: 999 Total samples: 11948 (0.084%)
    Num accepted samples: 1000 Total samples: 11957 (0.084%)

    var : lambda
    Probabilities (truncated):
    18.775545251916881: 0.0010000000000000
    17.874723354761826: 0.0010000000000000
    17.808509299442409: 0.0010000000000000
    17.793812332113184: 0.0010000000000000
    .........
    8.399640984553676: 0.0010000000000000
    8.325518025236912: 0.0010000000000000
    8.317045128120597: 0.0010000000000000
    8.313431035873215: 0.0010000000000000
    mean = 12.9986

    HPD intervals:
    HPD interval (0.84): 9.57312252911419..16.62747963314293
    HPD interval (0.9): 9.04394346896743..16.65408255330011
    HPD interval (0.94): 9.04394346896743..17.10800260031037
    HPD interval (0.99): 8.45943817750525..17.58016151356518

  * Data len: 1000, min_accepted_samples=100
    15.1s
    Num accepted samples: 98 Total samples: 1546 (0.063%)
    Num accepted samples: 99 Total samples: 1555 (0.064%)
    Num accepted samples: 100 Total samples: 1561 (0.064%)

    var : lambda
    Probabilities (truncated):
    16.560546318330125: 0.0100000000000000
    16.420269649205856: 0.0100000000000000
    16.377367078968028: 0.0100000000000000
    16.32392174672518: 0.0100000000000000
    .........
    9.699517241073547: 0.0100000000000000
    9.69546335036655: 0.0100000000000000
    9.640795940831675: 0.0100000000000000
    9.46763557776233: 0.0100000000000000
    mean = 12.8975

    HPD intervals:
    HPD interval (0.84): 10.17078194356094..15.76670539647653
    HPD interval (0.9): 9.69546335036655..15.76670539647653
    HPD interval (0.94): 9.64079594083167..16.11095976136204
    HPD interval (0.99): 9.64079594083167..16.56054631833013

  * Data len: 10000, min_accepted_samples=100
    2min23s
    Num accepted samples: 98 Total samples: 1559 (0.063%)
    Num accepted samples: 99 Total samples: 1577 (0.063%)
    Num accepted samples: 100 Total samples: 1590 (0.063%)

    var : lambda
    Probabilities (truncated):
    16.638428309763981: 0.0100000000000000
    16.582529556743118: 0.0100000000000000
    16.578885222589079: 0.0100000000000000
    16.329814400677485: 0.0100000000000000
    .........
    9.574873194366168: 0.0100000000000000
    9.527924293898012: 0.0100000000000000
    9.513417813700354: 0.0100000000000000
    9.433986984861077: 0.0100000000000000
    mean = 12.7762

    HPD intervals:
    HPD interval (0.84): 9.91153803929292..15.57844666930774
    HPD interval (0.9): 9.51341781370035..15.57844666930774
    HPD interval (0.94): 9.43398698486108..16.07128657497060
    HPD interval (0.99): 9.51341781370035..16.63842830976398
*/
go3 ?=>
  % N = 100,
  N = 1000,
  % N = 10000,
  Lambda = 13,
  Dist = $poisson_dist_n(Lambda,N),
  println(dist=Dist),
  Data = Dist.apply,
  println(data=Data),
  println([n=N,mean=Data.mean,stdev=Data.stdev]),
  reset_store,
  run_model(10_000,$model3(Data),[show_probs_trunc,mean,
                                  % show_percentiles,show_histogram,
                                  show_hpd_intervals,hpd_intervals=[0.84,0.9,0.94,0.99],
                                  min_accepted_samples=100,show_accepted_samples=true
                                 ]),
  nl,
  % println(data=Data),
  println([n=N,mean=Data.mean,stdev=Data.stdev]),
  % show_store_lengths,nl,
  % fail,
  nl.
go3 => true.

/*
  Comparing with a "manual" (explicit) version of observe_abc/2 in the
  model, to skip the copying done in the observe_abc/2 procedure. Sometimes
  this is faster than using observe_abc/2, but sometimes it is as slow, or
  even slower.
*/
model3(Data) =>
  Len = Data.len,
  % Uninformed prior
  Lambda = uniform(1,100),
  Y = poisson_dist_n(Lambda,Len),
  observe_abc(Data,Y,1),
  if observed_ok then
    add("lambda",Lambda)
  end.

% The manual version
model3b(Data) =>
  Len = Data.len,
  Mean = Data.mean,
  Stdev = Data.stdev,
  % Uninformed prior
  Lambda = uniform(1,100),
  Y = poisson_dist_n(Lambda,Len),
  YMean = Y.mean,
  YStdev = Y.stdev,
  % "Manual" version of observe_abc(Data,Y,1)
  observe(abs(Mean-YMean) <= Stdev),
  observe(abs(Stdev-YStdev) <= Stdev),
  % observe_abc(Data,Y,1),
  if observed_ok then
    add("lambda",Lambda)
  end.

/*
  Testing random_integer(37).

  * Data len: 100, min_accepted_samples=10_000
    11.7s
    Num accepted samples: 9997 Total samples: 228281 (0.044%)
    Num accepted samples: 9998 Total samples: 228298 (0.044%)
    Num accepted samples: 9999 Total samples: 228300 (0.044%)
    Num accepted samples: 10000 Total samples: 228361 (0.044%)

    var : N
    Probabilities (truncated):
    41: 0.0258000000000000
    45: 0.0257000000000000
    16: 0.0253000000000000
    31: 0.0251000000000000
    .........
    64: 0.0006000000000000
    12: 0.0005000000000000
    66: 0.0002000000000000
    65: 0.0001000000000000
    mean = 35.3573

    HPD intervals:
    HPD interval (0.84): 14.00000000000000..50.00000000000000
    HPD interval (0.9): 14.00000000000000..53.00000000000000
    HPD interval (0.94): 14.00000000000000..55.00000000000000
    HPD interval (0.99): 12.00000000000000..59.00000000000000

  * Data len: 1000, min_accepted_samples=1000
    9.1s
    Num accepted samples: 998 Total samples: 23650 (0.042%)
    Num accepted samples: 999 Total samples: 23652 (0.042%)
    Num accepted samples: 1000 Total samples: 23679 (0.042%)

    var : N
    Probabilities (truncated):
    35: 0.0330000000000000
    18: 0.0300000000000000
    49: 0.0290000000000000
    47: 0.0290000000000000
    .........
    38: 0.0130000000000000
    15: 0.0110000000000000
    59: 0.0060000000000000
    60: 0.0010000000000000
    mean = 36.95

    HPD intervals:
    HPD interval (0.84): 16.00000000000000..52.00000000000000
    HPD interval (0.9): 18.00000000000000..56.00000000000000
    HPD interval (0.94): 16.00000000000000..56.00000000000000
    HPD interval (0.99): 15.00000000000000..58.00000000000000

  * Data len: 10000, min_accepted_samples=100
    8.6s
    Num accepted samples: 98 Total samples: 2211 (0.044%)
    Num accepted samples: 99 Total samples: 2214 (0.045%)
    Num accepted samples: 100 Total samples: 2244 (0.045%)

    var : N
    Probabilities (truncated):
    50: 0.0600000000000000
    45: 0.0600000000000000
    31: 0.0500000000000000
    18: 0.0500000000000000
    .........
    44: 0.0100000000000000
    40: 0.0100000000000000
    38: 0.0100000000000000
    34: 0.0100000000000000
    mean = 35.3

    HPD intervals:
    HPD interval (0.84): 18.00000000000000..51.00000000000000
    HPD interval (0.9): 16.00000000000000..52.00000000000000
    HPD interval (0.94): 16.00000000000000..54.00000000000000
    HPD interval (0.99): 16.00000000000000..57.00000000000000
*/
go4 ?=>
  % N = 100,
  % N = 1000,
  N = 10000,
  Dist = $random_integer1_n(37,N),
  println(dist=Dist),
  Data = Dist.apply,
  println(data=Data),
  println([n=N,mean=Data.mean,stdev=Data.stdev]),
  reset_store,
  run_model(10_000,$model4(Data),[show_probs_trunc,mean,
                                  % show_percentiles,show_histogram,
                                  show_hpd_intervals,hpd_intervals=[0.84,0.9,0.94,0.99],
                                  min_accepted_samples=100,show_accepted_samples=true
                                 ]),
  nl,
  % println(data=Data),
  println([n=N,mean=Data.mean,stdev=Data.stdev]),
  % show_store_lengths,nl,
  % fail,
  nl.
go4 => true.

model4(Data) =>
  Len = Data.len,
  % Uninformed prior
  N = random_integer1(1000),
  Y = random_integer1_n(N,Len),
  observe_abc(Data,Y),
  if observed_ok then
    add("N",N)
  end.
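/*
  Sanity check on the random_integer results: for the discrete uniform
  distribution on 1..N the mean is (N+1)/2 and the stdev is
  sqrt((N^2-1)/12), so a method-of-moments estimate from the sample mean
  alone is N ~= 2*mean - 1, i.e. 2*19 - 1 = 37 here. The posterior means
  around 35-37 above agree with this. In Python:

```python
# Moments of the discrete uniform distribution on 1..n (cf. random_integer1).
def uniform1_moments(n):
    mean = (n + 1) / 2
    stdev = ((n * n - 1) / 12) ** 0.5
    return mean, stdev

mean37, sd37 = uniform1_moments(37)  # mean = 19.0, stdev ~ 10.68
n_hat = 2 * mean37 - 1               # method-of-moments estimate: 37.0
```
*/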
/*
  From "A short introduction to approximate Bayesian computation (ABC)"
  https://www.youtube.com/watch?v=77yECJ8_dyk (around @33:30):

    y(i) ~ Poisson(lambda)
    lambda ~ Gamma(1,1)
    yobs = (0,0,0,0,5)

  Testing with different summary statistics:
  - mean
  - mean + variance
  - variance
  - mean + stdev (the default for observe_abc/2-3)

  [data = [0,0,0,0,5],mean = 1.0,stdev = 2.0,variance = 4.0]

  statistics = [mean]
  min_accepted_samples = 10000

    var : lambda
    Probabilities (truncated):
    5.097658813750965: 0.0001000000000000
    4.982896832856594: 0.0001000000000000
    4.656565351432567: 0.0001000000000000
    4.602325628383237: 0.0001000000000000
    .........
    0.000276108698226: 0.0001000000000000
    0.000263246857859: 0.0001000000000000
    0.000098985197082: 0.0001000000000000
    0.000071859762812: 0.0001000000000000
    mean = 0.846656

    HPD intervals:
    HPD interval (0.84): 0.00007185976281..1.56307691682920
    HPD interval (0.9): 0.00007185976281..1.90904819624898
    HPD interval (0.94): 0.00007185976281..2.25094160600539
    HPD interval (0.99): 0.00007185976281..3.15256654200383

    var : y
    Probabilities (truncated):
    [0,0,0,0,0]: 0.1734000000000000
    [0,0,0,0,1]: 0.0313000000000000
    [0,1,0,0,0]: 0.0309000000000000
    [0,0,0,1,0]: 0.0290000000000000
    .........
    [0,0,0,4,1]: 0.0001000000000000
    [0,0,0,3,4]: 0.0001000000000000
    [0,0,0,3,3]: 0.0001000000000000
    [0,0,0,0,4]: 0.0001000000000000
    Mean (truncated)
    mean = [[0,0,0,0,0] = 0.1734,[0,0,0,0,1] = 0.0313,[0,1,0,0,0] = 0.0309,[0,0,0,1,0] = 0.029,[0,0,1,0,0] = 0.0279,[1,0,0,0,0] = 0.0271,[0,1,1,0,0] = 0.011,[1,0,0,0,1] = 0.0109,[0,0,1,1,0] = 0.0106,[1,0,1,0,0] = 0.0101,[0,0,0,1,1] = 0.0098,[0,0,1,0,1] = 0.0097,[0,1,0,0,1] = 0.0096,[1,0,0,1,0] = 0.0093,[1,1,0,0,0] = 0.0086,[0,1,0,1,0] = 0.008,[0,0,1,1,1] = 0.0065,[0,1,1,1,0] = 0.0062,[0,2,0,0,0] = 0.0057,[1,1,0,0,1] = 0.0054]
    HPD intervals:
    show_hpd_intervals: data is not numeric

    var : p
    Probabilities:
    false: 1.0000000000000000
    mean = [false = 1.0]
    HPD intervals:
    show_hpd_intervals: data is not numeric

  statistics = [mean,variance]
  min_accepted_samples = 10000

    var : lambda
    Probabilities (truncated):
    5.696125876807489: 0.0001000000000000
    5.398010164981287: 0.0001000000000000
    5.255629597105669: 0.0001000000000000
    5.212239684834783: 0.0001000000000000
    .........
    0.232162560357806: 0.0001000000000000
    0.231558580085259: 0.0001000000000000
    0.214998997817617: 0.0001000000000000
    0.190633818209587: 0.0001000000000000
    mean = 1.88352

    HPD intervals:
    HPD interval (0.84): 0.81419718346124..2.84097220350004
    HPD interval (0.9): 0.68851403676865..3.02520421281194
    HPD interval (0.94): 0.57309740187549..3.19740337113145
    HPD interval (0.99): 0.31235010107998..3.83217044862985

    var : y
    Probabilities (truncated):
    [4,0,0,0,1]: 0.0024000000000000
    [1,4,0,0,1]: 0.0023000000000000
    [0,4,0,1,1]: 0.0023000000000000
    [0,1,0,4,1]: 0.0023000000000000
    .........
    [0,0,0,5,3]: 0.0001000000000000
    [0,0,0,4,4]: 0.0001000000000000
    [0,0,0,3,6]: 0.0001000000000000
    [0,0,0,3,4]: 0.0001000000000000
    Mean (truncated)
    mean = [[4,0,0,0,1] = 0.0024,[1,4,0,0,1] = 0.0023,[0,4,0,1,1] = 0.0023,[0,1,0,4,1] = 0.0023,[1,0,0,4,0] = 0.0022,[0,0,4,1,1] = 0.0022,[0,0,1,1,4] = 0.0022,[4,1,0,0,1] = 0.002,[4,1,0,0,0] = 0.002,[1,4,0,1,0] = 0.002,[0,4,0,0,1] = 0.002,[0,1,1,4,0] = 0.002,[0,0,0,4,1] = 0.002,[1,0,0,0,4] = 0.0019,[0,1,0,0,4] = 0.0019,[1,4,0,0,0] = 0.0018,[1,1,0,4,0] = 0.0018,[0,1,4,1,0] = 0.0018,[0,1,4,0,1] = 0.0018,[0,0,0,0,4] = 0.0018]
    HPD intervals:
    show_hpd_intervals: data is not numeric

    var : p
    Probabilities:
    false: 0.9997000000000000
    true: 0.0003000000000000
    mean = [false = 0.9997,true = 0.0003]
    HPD intervals:
    show_hpd_intervals: data is not numeric

  statistics = [variance]
  min_accepted_samples = 10000

    var : lambda
    Probabilities (truncated):
    12.535426592411403: 0.0001000000000000
    10.900216062507798: 0.0001000000000000
    10.192737114514273: 0.0001000000000000
    9.669447345034017: 0.0001000000000000
    .........
    0.280166174254342: 0.0001000000000000
    0.209485969085158: 0.0001000000000000
    0.186346933271526: 0.0001000000000000
    0.169441426006831: 0.0001000000000000
    mean = 2.40893

    HPD intervals:
    HPD interval (0.84): 0.69930656029682..3.66139686972002
    HPD interval (0.9): 0.69736264146407..4.24336752088908
    HPD interval (0.94): 0.52504824909100..4.59761220789931
    HPD interval (0.99): 0.32263516997245..6.28195639964253

    var : y
    Probabilities (truncated):
    [0,4,1,0,0]: 0.0021000000000000
    [1,1,4,0,0]: 0.0020000000000000
    [0,0,0,4,0]: 0.0020000000000000
    [1,0,4,0,1]: 0.0017000000000000
    .........
    [0,0,0,2,5]: 0.0001000000000000
    [0,0,0,1,6]: 0.0001000000000000
    [0,0,0,1,5]: 0.0001000000000000
    [0,0,0,0,5]: 0.0001000000000000
    Mean (truncated)
    mean = [[0,4,1,0,0] = 0.0021,[1,1,4,0,0] = 0.002,[0,0,0,4,0] = 0.002,[1,0,4,0,1] = 0.0017,[4,0,0,0,0] = 0.0016,[0,1,0,4,0] = 0.0016,[0,0,0,1,4] = 0.0016,[4,1,0,0,1] = 0.0015,[4,1,0,0,0] = 0.0015,[4,0,0,0,1] = 0.0015,[1,0,0,4,1] = 0.0015,[1,0,0,1,4] = 0.0015,[1,0,0,0,4] = 0.0015,[1,4,0,1,0] = 0.0014,[1,4,0,0,0] = 0.0014,[1,2,0,4,0] = 0.0014,[0,1,1,0,4] = 0.0014,[1,0,0,4,0] = 0.0013,[0,0,4,1,0] = 0.0013,[0,0,0,0,4] = 0.0013]
    HPD intervals:
    show_hpd_intervals: data is not numeric

    var : p
    Probabilities:
    false: 0.9999000000000000
    true: 0.0001000000000000
    mean = [false = 0.9999,true = 0.0001]
    HPD intervals:
    show_hpd_intervals: data is not numeric

  statistics = [mean,stdev]
  min_accepted_samples = 10000

    var : lambda
    Probabilities (truncated):
    4.703214994695039: 0.0001000000000000
    4.572323000385846: 0.0001000000000000
    4.525228335693912: 0.0001000000000000
    4.503976131857685: 0.0001000000000000
    .........
    0.000476273276798: 0.0001000000000000
    0.00045658899023: 0.0001000000000000
    0.000447232828182: 0.0001000000000000
    0.000108901357588: 0.0001000000000000
    mean = 0.869213

    HPD intervals:
    HPD interval (0.84): 0.00010890135759..1.63719809346579
    HPD interval (0.9): 0.00044723282818..1.97901133008129
    HPD interval (0.94): 0.00010890135759..2.28337576093340
    HPD interval (0.99): 0.00010890135759..3.15023126926591

    var : y
    Probabilities (truncated):
    [0,0,0,0,0]: 0.1697000000000000
    [0,0,0,0,1]: 0.0290000000000000
    [1,0,0,0,0]: 0.0286000000000000
    [0,1,0,0,0]: 0.0266000000000000
    .........
    [0,0,0,3,6]: 0.0001000000000000
    [0,0,0,3,3]: 0.0001000000000000
    [0,0,0,3,2]: 0.0001000000000000
    [0,0,0,1,4]: 0.0001000000000000
    Mean (truncated)
    mean = [[0,0,0,0,0] = 0.1697,[0,0,0,0,1] = 0.029,[1,0,0,0,0] = 0.0286,[0,1,0,0,0] = 0.0266,[0,0,1,0,0] = 0.0265,[0,0,0,1,0] = 0.0263,[1,1,0,0,0] = 0.0118,[0,1,1,0,0] = 0.0115,[0,0,1,0,1] = 0.011,[0,1,0,1,0] = 0.0106,[1,0,0,0,1] = 0.0095,[0,1,0,0,1] = 0.0094,[0,0,1,1,0] = 0.0092,[1,0,0,1,0] = 0.0088,[0,0,0,1,1] = 0.0085,[1,0,1,0,0] = 0.0074,[1,1,0,1,0] = 0.0059,[0,1,1,0,1] = 0.0056,[0,0,1,1,1] = 0.0056,[1,1,1,0,0] = 0.0052]
    HPD intervals:
    show_hpd_intervals: data is not numeric

    var : p
    Probabilities:
    false: 1.0000000000000000
    mean = [false = 1.0]
    HPD intervals:
    show_hpd_intervals: data is not numeric
*/
go5 ?=>
  Data = [0,0,0,0,5],
  println([data=Data,mean=Data.mean,stdev=Data.stdev,variance=Data.variance]),
  member(Statistics,[[mean],[mean,variance],[variance],[mean,stdev]]),
  println(statistics=Statistics),
  reset_store,
  run_model(10_000,$model5(Data,Statistics),[show_probs_trunc,mean,
                                             % show_percentiles,show_histogram,
                                             show_hpd_intervals,hpd_intervals=[0.84,0.9,0.94,0.99],
                                             min_accepted_samples=10_000,show_accepted_samples=false
                                            ]),
  nl,
  % show_store_lengths,nl,
  fail,
  nl.
go5 => true.

model5(Data,Statistics) =>
  Lambda = gamma_dist(1,1),
  Y = poisson_dist_n(Lambda,Data.len),
  observe_abc(Data,Y,1,Statistics),
  P = check(Y==Data),
  if observed_ok then
    add("lambda",Lambda),
    add("y",Y),
    add("p",P)
  end.

/*
  Adding analyse_post(Data,Post).
  data = [105.453,89.9929,106.105,77.968,92.0137,86.7918,98.2084,97.6413,86.0226,81.5336,142.997,71.8199,116.837,119.146,100.682,85.3139,114.232,97.8366,96.9657,76.6922,83.1484,87.2115,119.055,95.2251,89.1641,108.752,76.7078,105.881,104.411,107.241,95.339,94.5178,93.9704,79.0406,84.9234,104.12,91.9911,94.4141,76.2712,111.25,111.209,86.8509,108.036,84.0066,79.9727,97.6344,89.3397,91.1579,81.3125,93.0655,117.694,97.6949,80.0592,105.865,114.398,109.738,118.82,127.073,108.135,84.3835,122.759,82.2611,98.9557,108.749,108.364,123.32,115.536,71.0866,70.5708,71.4958,89.4582,95.0817,127.896,89.9774,97.2017,85.8417,124.483,108.094,133.207,83.1034,84.1876,130.531,100.052,105.137,100.332,103.467,76.9163,113.87,117.973,94.2049,93.0481,104.601,107.61,83.1355,98.8706,89.0603,91.7176,61.2325,95.9057,93.1817]

  Histogram (total 100)
  61.233: 1 ######## (0.010 / 0.010)
  63.277: 0 (0.000 / 0.010)
  65.321: 0 (0.000 / 0.010)
  67.365: 0 (0.000 / 0.010)
  69.409: 0 (0.000 / 0.010)
  71.453: 4 ############################## (0.040 / 0.050)
  73.497: 0 (0.000 / 0.050)
  75.541: 1 ######## (0.010 / 0.060)
  77.585: 4 ############################## (0.040 / 0.100)
  79.630: 3 ####################### (0.030 / 0.130)
  81.674: 3 ####################### (0.030 / 0.160)
  83.718: 6 ############################################# (0.060 / 0.220)
  85.762: 4 ############################## (0.040 / 0.260)
  87.806: 3 ####################### (0.030 / 0.290)
  89.850: 6 ############################################# (0.060 / 0.350)
  91.894: 4 ############################## (0.040 / 0.390)
  93.938: 7 ##################################################### (0.070 / 0.460)
  95.982: 5 ###################################### (0.050 / 0.510)
  98.027: 8 ############################################################ (0.080 / 0.590)
  100.071: 3 ####################### (0.030 / 0.620)
  102.115: 0 (0.000 / 0.620)
  104.159: 5 ###################################### (0.050 / 0.670)
  106.203: 4 ############################## (0.040 / 0.710)
  108.247: 8 ############################################################ (0.080 / 0.790)
  110.291: 3 ####################### (0.030 / 0.820)
  112.335: 0 (0.000 / 0.820)
  114.380: 3 ####################### (0.030 / 0.850)
  116.424: 2 ############### (0.020 / 0.870)
  118.468: 5 ###################################### (0.050 / 0.920)
  120.512: 0 (0.000 / 0.920)
  122.556: 2 ############### (0.020 / 0.940)
  124.600: 1 ######## (0.010 / 0.950)
  126.644: 1 ######## (0.010 / 0.960)
  128.688: 1 ######## (0.010 / 0.970)
  130.732: 1 ######## (0.010 / 0.980)
  132.777: 1 ######## (0.010 / 0.990)
  134.821: 0 (0.000 / 0.990)
  136.865: 0 (0.000 / 0.990)
  138.909: 0 (0.000 / 0.990)
  140.953: 1 ######## (0.010 / 1.000)

  [len = 100,min = 61.2325,mean = 97.8581,median = 96.4357,max = 142.997,variance = 245.689,stdev = 15.6745]

  Scatter plot (ASCII plot omitted here; the column alignment was lost):
  ymax = 142.997, ymin = 61.2325, x = [1,100]

  min_accepted_samples = 100

  var : mu
  Probabilities (truncated):
  107.172189039724046: 0.0100000000000000
  106.902222061949885: 0.0100000000000000
  106.841509845965319: 0.0100000000000000
  106.801693752315686: 0.0100000000000000
  .........
  90.287495574582124: 0.0100000000000000
  90.128487057578042: 0.0100000000000000
  89.209562877290679: 0.0100000000000000
  87.634227664970894: 0.0100000000000000
  mean = 97.9843

  var : sigma
  Probabilities (truncated):
  25.235207247657332: 0.0100000000000000
  24.960892316867081: 0.0100000000000000
  24.305713671867601: 0.0100000000000000
  23.981751718549873: 0.0100000000000000
  .........
  8.943308591816253: 0.0100000000000000
  8.905127769757588: 0.0100000000000000
  8.776420658350187: 0.0100000000000000
  8.570526981060638: 0.0100000000000000
  mean = 16.7197

  var : post
  Probabilities (truncated):
  174.080208953211695: 0.0100000000000000
  135.974489050403918: 0.0100000000000000
  127.629774030463722: 0.0100000000000000
  127.010709526070542: 0.0100000000000000
  .........
  65.673643500442438: 0.0100000000000000
  62.192018581539855: 0.0100000000000000
  55.89943157909898: 0.0100000000000000
  31.410542004565492: 0.0100000000000000
  mean = 97.9014

  Post:
  [len = 100,min = 31.4105,mean = 97.9014,median = 99.3666,max = 174.08,variance = 348.479,stdev = 18.6676]

  Histogram (total 100)
  31.411: 1 #### (0.010 / 0.010)
  34.977: 0 (0.000 / 0.010)
  38.544: 0 (0.000 / 0.010)
  42.111: 0 (0.000 / 0.010)
  45.678: 0 (0.000 / 0.010)
  49.244: 0 (0.000 / 0.010)
  52.811: 0 (0.000 / 0.010)
  56.378: 1 #### (0.010 / 0.020)
  59.944: 0 (0.000 / 0.020)
  63.511: 1 #### (0.010 / 0.030)
  67.078: 3 ############# (0.030 / 0.060)
  70.645: 3 ############# (0.030 / 0.090)
  74.211: 2 ######### (0.020 / 0.110)
  77.778: 5 ##################### (0.050 / 0.160)
  81.345: 3 ############# (0.030 / 0.190)
  84.912: 3 ############# (0.030 / 0.220)
  88.478: 9 ####################################### (0.090 / 0.310)
  92.045: 4 ################# (0.040 / 0.350)
  95.612: 8 ################################## (0.080 / 0.430)
  99.179: 13 ######################################################## (0.130 / 0.560)
  102.745: 14 ############################################################ (0.140 / 0.700)
  106.312: 6 ########################## (0.060 / 0.760)
  109.879: 8 ################################## (0.080 / 0.840)
  113.446: 3 ############# (0.030 / 0.870)
  117.012: 2 ######### (0.020 / 0.890)
  120.579: 3 ############# (0.030 / 0.920)
  124.146: 4 ################# (0.040 / 0.960)
  127.713: 2 ######### (0.020 / 0.980)
  131.279: 0 (0.000 / 0.980)
  134.846: 1 #### (0.010 / 0.990)
  138.413: 0 (0.000 / 0.990)
  141.980: 0 (0.000 / 0.990)
  145.546: 0 (0.000 / 0.990)
  149.113: 0 (0.000 / 0.990)
  152.680: 0 (0.000 / 0.990)
  156.247: 0 (0.000 / 0.990)
  159.813: 0 (0.000 / 0.990)
  163.380: 0 (0.000 / 0.990)
  166.947: 0 (0.000 / 0.990)
  170.513: 1 #### (0.010 / 1.000)

  Scatter plot (ASCII plot omitted here; the column alignment was lost):
  ymax = 174.08, ymin = 31.4105, x = [1,100]

  normalized_rmse = 0.221029
  Good match; only small differences across the distribution.
  abc_fit_score_explained = [0.316787,Moderate mismatch; distributions differ in shape.]
*/
go6 ?=>
  N = 100,
  % N = 1000,
  % N = 10000,
  Data = normal_dist_n(100,15,N),
  println(data=Data),
  show_histogram(Data),
  show_simple_stats(Data),
  show_scatter_plot(Data),
  reset_store,
  run_model(100,$model(Data),[show_probs_trunc,mean,
                              % show_percentiles,show_histogram,
                              % show_hpd_intervals,hpd_intervals=[0.84],
                              min_accepted_samples=100
                              % ,show_accepted_samples=true
                             ]),
  analyse_post(Data,get_store().get("post")),
  nl,
  % show_store_lengths,nl,
  % fail,
  nl.
go6 => true.
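/*
  On the fit scores printed by analyse_post above: a common way to compute
  such a normalized RMSE is the RMSE between the sorted observed data and
  the sorted posterior-predictive sample (a quantile-quantile distance),
  divided by the range of the observed data. Whether analyse_post uses
  exactly this definition is an assumption; the actual implementation is
  in ppl_utils. A sketch in Python:

```python
def normalized_rmse(data, post):
    """RMSE between sorted samples of equal length, normalized by the
    observed data's range. A hypothetical reading of analyse_post's
    score, not its actual implementation."""
    xs, ys = sorted(data), sorted(post)
    assert len(xs) == len(ys)
    mse = sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
    return mse ** 0.5 / (max(data) - min(data))

same = normalized_rmse([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0])     # 0.0
shifted = normalized_rmse([1.0, 2.0, 3.0, 4.0], [2.0, 3.0, 4.0, 5.0])  # 1/3
```

  A perfect posterior-predictive match gives 0, and values well below 1
  mean the two samples occupy the same range with a similar shape.
*/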