/*

  GEV for maximum wind speeds in Picat.

  From Mathematica ExtremeValueDistribution:
  """
  ExtremeValueDistribution can be used to model monthly maximum wind speeds.
  Recorded monthly maxima of wind speeds in km/h for Boston, MA,
  from January 1950 till December 2009:
  ...
  Fit the distribution to the data:
    edist = EstimatedDistribution(maxWinds, ExtremeValueDistribution(a, b))
    -> ExtremeValueDistribution(47.2374, 9.1497)
  ...
  Find the probability of the monthly maximum wind speed exceeding 60 mph:
    Probability(x > Quantity(60, "mph"), x ~ edist)
    -> 0.00454842
  """

  The original data is in km/h but the last query is in mph, so let's convert
  60 mph to km/h:
  """
  UnitConvert[Quantity[60, ("Miles")/("Hours")], "km/h"] // N
  -> Quantity[96.5606, ("Kilometers")/("Hours")]

  Probability[x > QuantityMagnitude@%, x \[Distributed] edist]
  -> 0.00454842
  """

  Note: Here we use max_stable_dist/3 instead.
  Cf. ppl_generalized_extreme_value_maximum_wind_speed.pi

  (A closed-form sanity check of the quoted Mathematica probability is
   sketched in gumbel_check/0 at the end of this file.)

  This program was created by Hakan Kjellerstrand, hakank@gmail.com
  See also my Picat page: http://www.hakank.org/picat/

*/
import ppl_distributions, ppl_utils.
import util.
% import ordset.

main => go.

/*
  [len = 72,min = 55.44,mean = 73.56,median = 72.36,max = 138.96,variance = 192.254,stdev = 13.8656]

  Histogram (total 72)
   55.440:  3 ############## (0.042 / 0.042)
   58.423:  7 ################################ (0.097 / 0.139)
   61.406:  2 ######### (0.028 / 0.167)
   64.389: 13 ############################################################ (0.181 / 0.347)
   67.371:  9 ########################################## (0.125 / 0.472)
   70.354:  1 ##### (0.014 / 0.486)
   73.337:  9 ########################################## (0.125 / 0.611)
   76.320:  9 ########################################## (0.125 / 0.736)
   79.303:  2 ######### (0.028 / 0.764)
   82.286:  4 ################## (0.056 / 0.819)
   85.269:  3 ############## (0.042 / 0.861)
   88.251:  3 ############## (0.042 / 0.903)
   91.234:  0  (0.000 / 0.903)
   94.217:  1 ##### (0.014 / 0.917)
   97.200:  3 ############## (0.042 / 0.958)
  100.183:  1 ##### (0.014 / 0.972)
  103.166:  0  (0.000 / 0.972)
  106.149:  1 ##### (0.014 / 0.986)
  109.131:  0  (0.000 / 0.986)
  112.114:  0  (0.000 / 0.986)
  115.097:  0  (0.000 / 0.986)
  118.080:  0  (0.000 / 0.986)
  121.063:  0  (0.000 / 0.986)
  124.046:  0  (0.000 / 0.986)
  127.029:  0  (0.000 / 0.986)
  130.011:  0  (0.000 / 0.986)
  132.994:  0  (0.000 / 0.986)
  135.977:  1 ##### (0.014 / 1.000)

  HPD intervals:
  HPD interval (0.84): 55.44000000000000..85.31999999999999
  HPD interval (0.9): 55.44000000000000..88.92000000000000
  HPD interval (0.99): 55.44000000000000..138.96000000000001
  HPD interval (0.999): 55.44000000000000..138.96000000000001
  HPD interval (0.9999): 55.44000000000000..138.96000000000001
  HPD interval (0.99999): 55.44000000000000..138.96000000000001

  methods = [median,iqr,mad,q10,q90,kurtosis]
  explain = Data appears strongly skewed and heavy-tailed. Using robust shape statistics.

  min_accepted_samples = 1000

  var : mu
  Probabilities (truncated):
  80.773080749877039: 0.0010000000000000
  80.535767829200537: 0.0010000000000000
  80.050144075451158: 0.0010000000000000
  79.356876192789187: 0.0010000000000000
  .........
  55.102536632465387: 0.0010000000000000
  54.988928556041884: 0.0010000000000000
  54.288801155560733: 0.0010000000000000
  51.724032808763198: 0.0010000000000000
  mean = 68.4486

  HPD intervals:
  HPD interval (0.84): 60.45899197468806..76.33121530444370
  HPD interval (0.9): 58.50076707437429..76.20073325637593
  HPD interval (0.99): 55.29411751293104..78.84090150562234
  HPD interval (0.999): 54.28880115556073..80.77308074987704
  HPD interval (0.9999): 51.72403280876320..80.77308074987704
  HPD interval (0.99999): 51.72403280876320..80.77308074987704

  var : sigma
  Probabilities (truncated):
  9.989033783780892: 0.0010000000000000
  9.951219302579396: 0.0010000000000000
  9.937530565046487: 0.0010000000000000
  9.90007549053993: 0.0010000000000000
  .........
  0.795296747607783: 0.0010000000000000
  0.744934529412973: 0.0010000000000000
  0.68267711004367: 0.0010000000000000
  0.665930197884296: 0.0010000000000000
  mean = 5.80909

  HPD intervals:
  HPD interval (0.84): 2.94496929875806..9.71004895852415
  HPD interval (0.9): 2.26608116750889..9.68525070216751
  HPD interval (0.99): 1.07056238272719..9.95121930257940
  HPD interval (0.999): 0.66593019788430..9.95121930257940
  HPD interval (0.9999): 0.66593019788430..9.98903378378089
  HPD interval (0.99999): 0.66593019788430..9.98903378378089

  var : xi
  Probabilities (truncated):
  0.998997526708523: 0.0010000000000000
  0.983558936502579: 0.0010000000000000
  0.983433911569153: 0.0010000000000000
  0.976014462754137: 0.0010000000000000
  .........
  0.002272295300976: 0.0010000000000000
  0.00195681071: 0.0010000000000000
  0.001915838104634: 0.0010000000000000
  0.000505435746398: 0.0010000000000000
  mean = 0.350708

  HPD intervals:
  HPD interval (0.84): 0.00191583810463..0.61677823616973
  HPD interval (0.9): 0.00475988816692..0.71475234381610
  HPD interval (0.99): 0.00050543574640..0.95232514150083
  HPD interval (0.999): 0.00050543574640..0.98355893650258
  HPD interval (0.9999): 0.00050543574640..0.99899752670852
  HPD interval (0.99999): 0.00050543574640..0.99899752670852

  var : post
  Probabilities (truncated):
  339.848827750994644: 0.0010000000000000
  208.080672564219753: 0.0010000000000000
  165.808837245082799: 0.0010000000000000
  157.93130491471311: 0.0010000000000000
  .........
  49.822165513163256: 0.0010000000000000
  49.246031308666858: 0.0010000000000000
  48.046680010663543: 0.0010000000000000
  43.747029413657607: 0.0010000000000000
  mean = 74.1642

  HPD intervals:
  HPD interval (0.84): 57.81286996715846..85.10119364747915
  HPD interval (0.9): 56.25414064167550..91.98007995472297
  HPD interval (0.99): 48.04668001066354..121.87051093601130
  HPD interval (0.999): 43.74702941365761..208.08067256421975
  HPD interval (0.9999): 43.74702941365761..339.84882775099464
  HPD interval (0.99999): 43.74702941365761..339.84882775099464

  var : p
  Probabilities:
  false: 0.9440000000000000
  true: 0.0560000000000000
  mean = 0.056

  HPD intervals:
  show_hpd_intervals: data is not numeric

  var : p2
  Probabilities:
  false: 0.9940000000000000
  true: 0.0060000000000000
  mean = 0.006

  HPD intervals:
  show_hpd_intervals: data is not numeric

  Post:
  [len = 1000,min = 43.747,mean = 74.1642,median = 72.2508,max = 339.849,variance = 246.633,stdev = 15.7046]

  Histogram (total 1000)
   43.747:   1  (0.001 / 0.001)
   51.150:  19 ### (0.019 / 0.020)
   58.552: 116 #################### (0.116 / 0.136)
   65.955: 246 ########################################### (0.246 / 0.382)
   73.357: 347 ############################################################ (0.347 / 0.729)
   80.760: 138 ######################## (0.138 / 0.867)
   88.162:  59 ########## (0.059 / 0.926)
   95.565:  31 ##### (0.031 / 0.957)
  102.967:  18 ### (0.018 / 0.975)
  110.370:  10 ## (0.010 / 0.985)
  117.772:   5 # (0.005 / 0.990)
  125.175:   2  (0.002 / 0.992)
  132.578:   1  (0.001 / 0.993)
  139.980:   1  (0.001 / 0.994)
  147.383:   2  (0.002 / 0.996)
  154.785:   1  (0.001 / 0.997)
  162.188:   1  (0.001 / 0.998)
  169.590:   0  (0.000 / 0.998)
  176.993:   0  (0.000 / 0.998)
  184.395:   0  (0.000 / 0.998)
  191.798:   0  (0.000 / 0.998)
  199.200:   0  (0.000 / 0.998)
  206.603:   1  (0.001 / 0.999)
  214.006:   0  (0.000 / 0.999)
  221.408:   0  (0.000 / 0.999)
  228.811:   0  (0.000 / 0.999)
  236.213:   0  (0.000 / 0.999)
  243.616:   0  (0.000 / 0.999)
  251.018:   0  (0.000 / 0.999)
  258.421:   0  (0.000 / 0.999)
  265.823:   0  (0.000 / 0.999)
  273.226:   0  (0.000 / 0.999)
  280.628:   0  (0.000 / 0.999)
  288.031:   0  (0.000 / 0.999)
  295.434:   0  (0.000 / 0.999)
  302.836:   0  (0.000 / 0.999)
  310.239:   0  (0.000 / 0.999)
  317.641:   0  (0.000 / 0.999)
  325.044:   0  (0.000 / 0.999)
  332.446:   1  (0.001 / 1.000)

  HPD intervals:
  HPD interval (0.84): 57.81286996715846..85.10119364747915
  HPD interval (0.9): 56.25414064167550..91.98007995472297
  HPD interval (0.99): 48.04668001066354..121.87051093601130
  HPD interval (0.999): 43.74702941365761..208.08067256421975
  HPD interval (0.9999): 43.74702941365761..339.84882775099464
  HPD interval (0.99999): 43.74702941365761..339.84882775099464

*/
go ?=>
  % Annual maximum wind speed in Boston 1938..2009
  % From Mathematica (FrechetDistribution, see above)
  Data = [72.36,64.8,72.36,68.4,77.76,55.44,96.48,96.48,66.6,72.36,
          61.2,55.8,96.48,74.16,77.76,88.92,138.96,77.76,85.32,66.6,
          70.56,63,81.72,66.6,74.16,77.76,64.8,57.6,66.6,77.76,63,
          64.8,74.16,74.16,79.56,77.76,68.76,59.4,86.76,84.96,92.88,
          81.72,79.56,81.72,64.8,105.48,64.8,85.32,64.8,63,72.36,
          100.08,59.4,75.96,81.72,87.12,77.76,68.76,74.16,68.4,75.96,
          63,63,64.8,57.6,61.2,59.4,66.6,57.24,63,55.44,59.4],
  show_simple_stats(Data),
  show_histogram(Data),
  show_hpd_intervals(Data,[0.84,0.9,0.99,0.999,0.9999,0.99999]),
  [Methods,Explain] = recommend_abc_stats_explained(Data),
  println(methods=Methods),
  println(explain=Explain),
  reset_store,
  run_model(10_000,$model(Data),[show_probs_trunc,mean,
                                 % show_percentiles,
                                 show_hpd_intervals,hpd_intervals=[0.84,0.9,0.99,0.999,0.9999,0.99999],
                                 % show_histogram,
                                 min_accepted_samples=1000
                                 ,show_accepted_samples=true
                                ]),
  println("Post:"),
  Post = get_store().get("post"),
  show_simple_stats(Post),
  show_histogram(Post),
  show_hpd_intervals(Post,[0.84,0.9,0.99,0.999,0.9999,0.99999]),
  nl,
  % show_store_lengths,nl,
  % fail,
  nl.
go => true.

%
% ABC model: GEV (max-stable) distribution for the maximum wind speeds.
%
model(Data) =>
  % Priors for the GEV location, scale, and shape parameters
  Mu    = normal_dist(Data.mean,Data.stdev),
  Sigma = uniform(0,10),
  Xi    = uniform(0,1),
  % Simulate Data.len maxima from the max-stable (GEV) distribution
  Y = max_stable_dist_n(Mu,Sigma,Xi,Data.len),
  % observe_abc(Data,Y,1/10),
  % ABC: compare robust summary statistics of simulated vs. observed data
  observe_abc(Data,Y,1,[median,iqr,mad,q10,q90,kurtosis]),
  % Posterior predictive draw
  Post = max_stable_dist(Mu,Sigma,Xi),
  P  = check(Post >= 96.5606),  % P(maximum >= 96.5606 km/h), i.e. >= 60 mph
  P2 = check(Post >= Data.max), % P(maximum >= largest observed value)
  if observed_ok then
    add("mu",Mu),
    add("sigma",Sigma),
    add("xi",Xi),
    add("post",Post),
    add("p",P),
    add("p2",P2)
  end.
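
%
% Closed-form sanity check (an added sketch, not part of the original model).
% Mathematica's ExtremeValueDistribution(a,b) is the Gumbel distribution for
% maxima, with CDF exp(-exp(-(x-a)/b)), so the exceedance probability quoted
% in the header can be recomputed directly from the fitted parameters
% a = 47.2374 and b = 9.1497. The predicate name gumbel_check/0 and the
% conversion factor 1.609344 km per mile are the only things introduced here;
% everything else uses plain Picat arithmetic.
%
gumbel_check =>
  A = 47.2374,                  % fitted location (from the Mathematica quote)
  B = 9.1497,                   % fitted scale
  X = 60*1.609344,              % 60 mph in km/h (= 96.56064)
  P = 1 - exp(-exp(-(X-A)/B)),  % Gumbel survival function P(X > x)
  println(x_kmh=X),
  println(p_exceed=P).          % ~ 0.00455, matching the quoted 0.00454842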