nohup: ignoring input

l2sgd-batch-size = 30
test-partial-annotation-one-by-one = 0
parameter-path = .
thread-num = 20
get-cpostag-from-pdeprel = 0
english = 0
inst-max-num-eval = -500
l2sgd-c2 = 0.1
use-addtional-pos = 1
inst-max-len-to-throw = 100
fcutoff = 0
display-interval = 1000
constrained-tag-test = 0
param-num-for-eval = 18
param-tmp-num = 0
inst-max-num-train = -1000
test-batch-size = 100000
test-partial-annotation = 0
train-method = l2sgd
test = 0
copy-cpostag-from-postag = 0
iter-num = 30
parameter-exist = 1
addtitional-pos = URL_ON_X
dictionary-path = .
dev-file = ../../data/ctb7/dev.conll
test-file = ../../data/ctb7/test.conll
output-file = test.out.conll07
train = 1
test-partial-annotation-ratio = 0.5
dictionary-exist = 1
train-file = ../../data/ctb7/train.conll

*** l2sgd train options: 30-0.1-10-1e-06-0.1-2-1000-10-20

Get all instances from ../../data/ctb7/train.conll [Fri May 6 14:24:49 2016]
instance num: 46427  Done! [Fri May 6 14:24:51 2016]
Get all instances from ../../data/ctb7/dev.conll [Fri May 6 14:24:51 2016]
instance num: 2061   Done! [Fri May 6 14:24:52 2016]

FGen : loading feature dictionaries from "." [Fri May 6 14:24:52 2016]
FeatureDictionary : loading from "./word.dict.gz"     55324 features
FeatureDictionary : loading from "./pos.dict.gz"      36 features
FeatureDictionary : loading from "./pos.features.gz"  1411761 features
word number: 55324
postag number: 36
pos feature dimensionality: 1411761
total feature dimensionality: 50823396
pos feature start offset: 0
done!
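The dimensionalities the loader prints are mutually consistent if every POS feature is paired with every candidate tag. A quick check under that assumption (the factorization itself is my reading of the numbers, not something the tool states):

```python
# Values printed by the feature loader above.
pos_feature_dim = 1411761  # "pos feature dimensionality"
num_postags = 36           # "postag number"

# Assumed factorization: one weight per (feature, tag) pair.
total_dim = pos_feature_dim * num_postags
print(total_dim)  # 50823396 == "total feature dimensionality"
```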
instance num from train1: 46427 [Fri May 6 14:24:53 2016]

Stochastic Gradient Descent (SGD) [Fri May 6 14:24:53 2016]
batch size: 30
c2: 0.1
max-iterations: 30
period: 10
delta: 0.0

Calibrating the learning rate (eta)
calibration.eta: 0.1
calibration.rate: 2.0
calibration.samples: 1000
calibration.candidates: 10
calibration.max_trials: 20
Initial loss: 82492.6
Trial #1   (eta = 0.1)            Loss: 65643.96371
Trial #2   (eta = 0.2)            Loss: 126210.22247 (worse)
Trial #3   (eta = 0.05)           Loss: 35076.55521
Trial #4   (eta = 0.025)          Loss: 28839.40231
Trial #5   (eta = 0.0125)         Loss: 32314.67582
Trial #6   (eta = 0.00625)        Loss: 39296.65886
Trial #7   (eta = 0.003125)       Loss: 47020.30087
Trial #8   (eta = 0.0015625)      Loss: 55040.70824
Trial #9   (eta = 0.00078125)     Loss: 63123.97673
Trial #10  (eta = 0.000390625)    Loss: 70651.33816
Trial #11  (eta = 0.0001953125)   Loss: 76083.10732
Trial #12  (eta = 0.00009765625)  Loss: 79197.33858
Best learning rate (eta): 0.025 [Fri May 6 14:26:59 2016]
t0: 9285400

***** Iteration #1 ***** [Fri May 6 14:26:59 2016]
Loss: 311017.74707   Feature L2-norm: 98.25892   Learning rate (eta): 0.02488   Total number of feature updates: 46427
save parameters to "./parameters.001.gz"  [nnz=48797642/50823396]
dev (2061 instances):  POS Precision: 53401/57858 = 92.29666   POS Prec(OOV): 2222/2816 = 78.90625
mbr decoding:          POS Precision: 53462/57858 = 92.40209   POS Prec(OOV): 2230/2816 = 79.19034

***** Iteration #2 ***** [Fri May 6 14:34:44 2016]
Loss: 182751.57146   Feature L2-norm: 121.95773   Learning rate (eta): 0.02475   Total number of feature updates: 92854
save parameters to "./parameters.002.gz"
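The calibration loop above samples 1000 training instances, starts from calibration.eta = 0.1, doubles or halves eta by calibration.rate = 2.0 depending on whether the sampled loss improves, and keeps the winning trial (eta = 0.025, trial #4). The printed t0 and the per-iteration eta values are consistent with the CRFsuite-style l2sgd decay schedule sketched below; the exact formula is inferred from the numbers, not quoted from this tool:

```python
# Sketch of the l2sgd learning-rate schedule implied by the log
# (CRFsuite-style; treating the formula itself as an assumption).
N = 46427       # training instances
c2 = 0.1        # l2sgd-c2 option
eta0 = 0.025    # best calibrated learning rate
lam = 2.0 * c2 / N           # per-instance L2 strength

t0 = 1.0 / (lam * eta0)      # chosen so that eta(0) == eta0
print(round(t0))             # 9285400, the "t0:" line above

def eta(t):
    """Decayed learning rate after t feature updates."""
    return 1.0 / (lam * (t0 + t))

# One epoch later (t ~= 46427 updates) eta has decayed to ~0.02488,
# matching the rate reported at the end of Iteration #1.
print(round(eta(46427), 5))  # 0.02488
```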
parameters::save [nnz=48797646/50823396]
dev (2061 instances):  POS Precision: 53935/57858 = 93.21961   POS Prec(OOV): 2280/2816 = 80.96591
mbr decoding:          POS Precision: 53960/57858 = 93.26282   POS Prec(OOV): 2287/2816 = 81.21449

***** Iteration #3 ***** [Fri May 6 14:43:15 2016]
Loss: 154727.50189   Feature L2-norm: 139.08517   Learning rate (eta): 0.02463   Total number of feature updates: 139281
save parameters to "./parameters.003.gz"  [nnz=48797667/50823396]
dev (2061 instances):  POS Precision: 54067/57858 = 93.44775   POS Prec(OOV): 2286/2816 = 81.17898
mbr decoding:          POS Precision: 54097/57858 = 93.49960   POS Prec(OOV): 2292/2816 = 81.39205

***** Iteration #4 ***** [Fri May 6 14:51:49 2016]
Loss: 138183.75012   Feature L2-norm: 153.12960   Learning rate (eta): 0.02451   Total number of feature updates: 185708
save parameters to "./parameters.004.gz"
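Each checkpoint is evaluated twice on the dev set: once with the default decoder (presumably Viterbi, the single highest-scoring tag sequence) and once under "*** mbr decoding ***", which in sequence labeling usually means choosing each token's tag by its posterior marginal. A per-token sketch of that idea follows; the marginals are made-up inputs, and whether this matches the tool's exact MBR criterion is an assumption:

```python
def mbr_decode(marginals):
    """Pick, for each token, the tag with the largest posterior marginal.

    marginals: list of per-token lists, marginals[i][t] = P(tag t | sentence),
    e.g. from forward-backward. This minimizes expected per-token (Hamming)
    loss, unlike Viterbi, which maximizes whole-sequence probability.
    """
    return [max(range(len(row)), key=row.__getitem__) for row in marginals]

# Toy example: three tokens, two candidate tags.
print(mbr_decode([[0.6, 0.4], [0.3, 0.7], [0.55, 0.45]]))  # [0, 1, 0]
```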
parameters::save [nnz=48797667/50823396]
dev (2061 instances):  POS Precision: 54307/57858 = 93.86256   POS Prec(OOV): 2322/2816 = 82.45739
mbr decoding:          POS Precision: 54331/57858 = 93.90404   POS Prec(OOV): 2317/2816 = 82.27983

***** Iteration #5 ***** [Fri May 6 15:00:22 2016]
Loss: 126280.60874   Feature L2-norm: 165.26083   Learning rate (eta): 0.02439   Total number of feature updates: 232135
save parameters to "./parameters.005.gz"  [nnz=48797679/50823396]
dev (2061 instances):  POS Precision: 54122/57858 = 93.54281   POS Prec(OOV): 2294/2816 = 81.46307
mbr decoding:          POS Precision: 54159/57858 = 93.60676   POS Prec(OOV): 2301/2816 = 81.71165

***** Iteration #6 ***** [Fri May 6 15:09:07 2016]
Loss: 117222.11477   Feature L2-norm: 176.12213   Learning rate (eta): 0.02427   Total number of feature updates: 278562
save parameters to "./parameters.006.gz"  [nnz=48797690/50823396]
dev (2061 instances):  POS Precision: 54313/57858 = 93.87293   POS Prec(OOV): 2313/2816 = 82.13778
mbr decoding:          POS Precision: 54329/57858 = 93.90058   POS Prec(OOV): 2312/2816 = 82.10227

***** Iteration #7 ***** [Fri May 6 15:17:41 2016]
Loss: 110166.10848   Feature L2-norm: 186.02040   Learning rate (eta): 0.02415   Total number of feature updates: 324989
save parameters to "./parameters.007.gz"  [nnz=48797693/50823396]
dev (2061 instances):  POS Precision: 54341/57858 = 93.92132   POS Prec(OOV): 2318/2816 = 82.31534
mbr decoding:          POS Precision: 54395/57858 = 94.01466   POS Prec(OOV): 2315/2816 = 82.20881

***** Iteration #8 ***** [Fri May 6 15:25:09 2016]
Loss: 104114.94522   Feature L2-norm: 195.20527   Learning rate (eta): 0.02404   Total number of feature updates: 371416
save parameters to "./parameters.008.gz"  [nnz=48797693/50823396]
dev (2061 instances):  POS Precision: 54459/57858 = 94.12527   POS Prec(OOV): 2318/2816 = 82.31534
mbr decoding:          POS Precision: 54471/57858 = 94.14601   POS Prec(OOV): 2316/2816 = 82.24432

***** Iteration #9 ***** [Fri May 6 15:32:39 2016]
Loss: 98964.49310   Feature L2-norm: 203.71096   Learning rate (eta): 0.02392   Total number of feature updates: 417843
save parameters to "./parameters.009.gz"  [nnz=48797693/50823396]
dev (2061 instances):  POS Precision: 54499/57858 = 94.19441   POS Prec(OOV): 2321/2816 = 82.42188
mbr decoding:          POS Precision: 54493/57858 = 94.18404   POS Prec(OOV): 2323/2816 = 82.49290

***** Iteration #10 ***** [Fri May 6 15:40:12 2016]
Loss: 94746.41014   Feature L2-norm: 211.75802   Learning rate (eta): 0.02381   Total number of feature updates: 464270
save parameters to "./parameters.010.gz"  [nnz=48797693/50823396]
dev (2061 instances):  POS Precision: 54404/57858 = 94.03021   POS Prec(OOV): 2329/2816 = 82.70597
mbr decoding:          POS Precision: 54457/57858 = 94.12182   POS Prec(OOV): 2333/2816 = 82.84801

***** Iteration #11 ***** [Fri May 6 15:47:46 2016]
Loss: 90856.14816   Improvement ratio: 2.42319   Feature L2-norm: 219.32712   Learning rate (eta): 0.02370   Total number of feature updates: 510697
save parameters to "./parameters.011.gz"
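The "Improvement ratio" first appears at Iteration #11, i.e. once losses from `period` (10) earlier iterations exist, and the printed values are consistent with comparing against the loss from `period` iterations back. A check against the logged numbers, with the formula itself inferred rather than quoted; note that with `delta: 0.0` a stopping test of the usual form (ratio < delta) can never trigger, so all 30 iterations would run:

```python
# Losses copied from Iterations #1, #2, #11 and #12 of this log.
loss = {1: 311017.747070159704890, 2: 182751.57146,
        11: 90856.14816, 12: 87376.05892}
period = 10

def improvement_ratio(t):
    # Inferred formula: relative improvement over the last `period` iterations.
    return (loss[t - period] - loss[t]) / loss[t]

print(round(improvement_ratio(11), 5))  # 2.42319, as logged at Iteration #11
print(round(improvement_ratio(12), 5))  # 1.09155, as logged at Iteration #12
```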
parameters::save [nnz=48797693/50823396] 0 [Fri May 6 15:55:10 2016] 1000 2000 instance num: 2061 [Fri May 6 15:55:13 2016] POS Precision: 54519/57858 = 94.22897 POS Prec(OOV): 2330/2816 = 82.74148 *** mbr decoding *** 0 [Fri May 6 15:55:13 2016] 1000 2000 instance num: 2061 [Fri May 6 15:55:16 2016] POS Precision: 54529/57858 = 94.24626 POS Prec(OOV): 2326/2816 = 82.59943 ***** Iteration #12 ***** [Fri May 6 15:55:16 2016] instance num from train1: 46427 0 [Fri May 6 15:55:16 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 15:56:44 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 15:58:13 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 15:59:42 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:01:16 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:02:23 2016] Loss: 87376.05892 Improvement ratio: 1.09155 Feature L2-norm: 226.50376 Learning rate (eta): 0.02358 Total number of feature updates: 557124.00000 save parameters to "./parameters.012.gz" .............................................. 
parameters::save [nnz=48797693/50823396] 0 [Fri May 6 16:02:57 2016] 1000 2000 instance num: 2061 [Fri May 6 16:02:59 2016] POS Precision: 54463/57858 = 94.13219 POS Prec(OOV): 2324/2816 = 82.52841 *** mbr decoding *** 0 [Fri May 6 16:02:59 2016] 1000 2000 instance num: 2061 [Fri May 6 16:03:02 2016] POS Precision: 54478/57858 = 94.15811 POS Prec(OOV): 2326/2816 = 82.59943 ***** Iteration #13 ***** [Fri May 6 16:03:02 2016] instance num from train1: 46427 0 [Fri May 6 16:03:02 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 16:04:37 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 16:06:12 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 16:07:47 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:09:20 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:10:19 2016] Loss: 84277.07318 Improvement ratio: 0.83594 Feature L2-norm: 233.35216 Learning rate (eta): 0.02347 Total number of feature updates: 603551.00000 save parameters to "./parameters.013.gz" .............................................. 
parameters::save [nnz=48797693/50823396] 0 [Fri May 6 16:10:53 2016] 1000 2000 instance num: 2061 [Fri May 6 16:10:55 2016] POS Precision: 54493/57858 = 94.18404 POS Prec(OOV): 2326/2816 = 82.59943 *** mbr decoding *** 0 [Fri May 6 16:10:55 2016] 1000 2000 instance num: 2061 [Fri May 6 16:10:58 2016] POS Precision: 54541/57858 = 94.26700 POS Prec(OOV): 2331/2816 = 82.77699 ***** Iteration #14 ***** [Fri May 6 16:10:58 2016] instance num from train1: 46427 0 [Fri May 6 16:10:58 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 16:12:37 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 16:14:20 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 16:16:03 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:17:46 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:18:52 2016] Loss: 81641.11132 Improvement ratio: 0.69258 Feature L2-norm: 239.88767 Learning rate (eta): 0.02336 Total number of feature updates: 649978.00000 save parameters to "./parameters.014.gz" .............................................. 
parameters::save [nnz=48797693/50823396] 0 [Fri May 6 16:19:26 2016] 1000 2000 instance num: 2061 [Fri May 6 16:19:28 2016] POS Precision: 54506/57858 = 94.20651 POS Prec(OOV): 2336/2816 = 82.95455 *** mbr decoding *** 0 [Fri May 6 16:19:28 2016] 1000 2000 instance num: 2061 [Fri May 6 16:19:31 2016] POS Precision: 54543/57858 = 94.27046 POS Prec(OOV): 2334/2816 = 82.88352 ***** Iteration #15 ***** [Fri May 6 16:19:31 2016] instance num from train1: 46427 0 [Fri May 6 16:19:31 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 16:21:04 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 16:22:42 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 16:24:24 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:26:06 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:27:12 2016] Loss: 79209.93776 Improvement ratio: 0.59425 Feature L2-norm: 246.14040 Learning rate (eta): 0.02326 Total number of feature updates: 696405.00000 save parameters to "./parameters.015.gz" .............................................. 
parameters::save [nnz=48797693/50823396] 0 [Fri May 6 16:27:48 2016] 1000 2000 instance num: 2061 [Fri May 6 16:27:50 2016] POS Precision: 54561/57858 = 94.30157 POS Prec(OOV): 2342/2816 = 83.16761 *** mbr decoding *** 0 [Fri May 6 16:27:50 2016] 1000 2000 instance num: 2061 [Fri May 6 16:27:53 2016] POS Precision: 54595/57858 = 94.36033 POS Prec(OOV): 2336/2816 = 82.95455 ***** Iteration #16 ***** [Fri May 6 16:27:53 2016] instance num from train1: 46427 0 [Fri May 6 16:27:53 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 16:29:34 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 16:31:16 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 16:32:58 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:34:41 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:35:47 2016] Loss: 76942.25410 Improvement ratio: 0.52351 Feature L2-norm: 252.15842 Learning rate (eta): 0.02315 Total number of feature updates: 742832.00000 save parameters to "./parameters.016.gz" .............................................. 
parameters::save [nnz=48797694/50823396] 0 [Fri May 6 16:36:21 2016] 1000 2000 instance num: 2061 [Fri May 6 16:36:24 2016] POS Precision: 54560/57858 = 94.29984 POS Prec(OOV): 2336/2816 = 82.95455 *** mbr decoding *** 0 [Fri May 6 16:36:24 2016] 1000 2000 instance num: 2061 [Fri May 6 16:36:26 2016] POS Precision: 54593/57858 = 94.35687 POS Prec(OOV): 2342/2816 = 83.16761 ***** Iteration #17 ***** [Fri May 6 16:36:26 2016] instance num from train1: 46427 0 [Fri May 6 16:36:27 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 16:38:01 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 16:39:34 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 16:41:06 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:42:38 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:43:39 2016] Loss: 74949.08287 Improvement ratio: 0.46988 Feature L2-norm: 257.92991 Learning rate (eta): 0.02304 Total number of feature updates: 789259.00000 save parameters to "./parameters.017.gz" .............................................. 
parameters::save [nnz=48797694/50823396] 0 [Fri May 6 16:44:13 2016] 1000 2000 instance num: 2061 [Fri May 6 16:44:15 2016] POS Precision: 54447/57858 = 94.10453 POS Prec(OOV): 2340/2816 = 83.09659 *** mbr decoding *** 0 [Fri May 6 16:44:15 2016] 1000 2000 instance num: 2061 [Fri May 6 16:44:17 2016] POS Precision: 54469/57858 = 94.14256 POS Prec(OOV): 2336/2816 = 82.95455 ***** Iteration #18 ***** [Fri May 6 16:44:17 2016] instance num from train1: 46427 0 [Fri May 6 16:44:18 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 16:45:50 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 16:47:17 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 16:48:45 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:50:13 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:51:10 2016] Loss: 73022.06131 Improvement ratio: 0.42580 Feature L2-norm: 263.50142 Learning rate (eta): 0.02294 Total number of feature updates: 835686.00000 save parameters to "./parameters.018.gz" .............................................. 
parameters::save [nnz=48797694/50823396] 0 [Fri May 6 16:51:44 2016] 1000 2000 instance num: 2061 [Fri May 6 16:51:46 2016] POS Precision: 54640/57858 = 94.43811 POS Prec(OOV): 2346/2816 = 83.30966 *** mbr decoding *** 0 [Fri May 6 16:51:46 2016] 1000 2000 instance num: 2061 [Fri May 6 16:51:49 2016] POS Precision: 54650/57858 = 94.45539 POS Prec(OOV): 2339/2816 = 83.06108 ***** Iteration #19 ***** [Fri May 6 16:51:49 2016] instance num from train1: 46427 0 [Fri May 6 16:51:49 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 16:53:17 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 16:54:44 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 16:56:12 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 16:57:40 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 16:58:37 2016] Loss: 71277.27320 Improvement ratio: 0.38844 Feature L2-norm: 268.86445 Learning rate (eta): 0.02283 Total number of feature updates: 882113.00000 save parameters to "./parameters.019.gz" .............................................. 
parameters::save [nnz=48797696/50823396] 0 [Fri May 6 16:59:11 2016] 1000 2000 instance num: 2061 [Fri May 6 16:59:13 2016] POS Precision: 54552/57858 = 94.28601 POS Prec(OOV): 2331/2816 = 82.77699 *** mbr decoding *** 0 [Fri May 6 16:59:13 2016] 1000 2000 instance num: 2061 [Fri May 6 16:59:16 2016] POS Precision: 54562/57858 = 94.30329 POS Prec(OOV): 2335/2816 = 82.91903 ***** Iteration #20 ***** [Fri May 6 16:59:16 2016] instance num from train1: 46427 0 [Fri May 6 16:59:16 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:00:44 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:02:12 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:03:44 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 17:05:12 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 17:06:10 2016] Loss: 69620.09383 Improvement ratio: 0.36091 Feature L2-norm: 274.05702 Learning rate (eta): 0.02273 Total number of feature updates: 928540.00000 save parameters to "./parameters.020.gz" .............................................. 
parameters::save [nnz=48797696/50823396] 0 [Fri May 6 17:06:44 2016] 1000 2000 instance num: 2061 [Fri May 6 17:06:46 2016] POS Precision: 54506/57858 = 94.20651 POS Prec(OOV): 2331/2816 = 82.77699 *** mbr decoding *** 0 [Fri May 6 17:06:46 2016] 1000 2000 instance num: 2061 [Fri May 6 17:06:49 2016] POS Precision: 54531/57858 = 94.24971 POS Prec(OOV): 2337/2816 = 82.99006 ***** Iteration #21 ***** [Fri May 6 17:06:49 2016] instance num from train1: 46427 0 [Fri May 6 17:06:49 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:08:17 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:09:45 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:11:13 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 17:12:41 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 17:13:38 2016] Loss: 68263.44973 Improvement ratio: 0.33096 Feature L2-norm: 279.06396 Learning rate (eta): 0.02262 Total number of feature updates: 974967.00000 save parameters to "./parameters.021.gz" .............................................. 
parameters::save [nnz=48797696/50823396] 0 [Fri May 6 17:14:12 2016] 1000 2000 instance num: 2061 [Fri May 6 17:14:14 2016] POS Precision: 54487/57858 = 94.17367 POS Prec(OOV): 2338/2816 = 83.02557 *** mbr decoding *** 0 [Fri May 6 17:14:14 2016] 1000 2000 instance num: 2061 [Fri May 6 17:14:17 2016] POS Precision: 54520/57858 = 94.23070 POS Prec(OOV): 2334/2816 = 82.88352 ***** Iteration #22 ***** [Fri May 6 17:14:17 2016] instance num from train1: 46427 0 [Fri May 6 17:14:17 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:15:52 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:17:25 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:18:59 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 17:20:31 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 17:21:30 2016] Loss: 66908.53173 Improvement ratio: 0.30590 Feature L2-norm: 283.91417 Learning rate (eta): 0.02252 Total number of feature updates: 1021394.00000 save parameters to "./parameters.022.gz" .............................................. 
parameters::save [nnz=48797696/50823396] 0 [Fri May 6 17:22:06 2016] 1000 2000 instance num: 2061 [Fri May 6 17:22:08 2016] POS Precision: 54543/57858 = 94.27046 POS Prec(OOV): 2341/2816 = 83.13210 *** mbr decoding *** 0 [Fri May 6 17:22:08 2016] 1000 2000 instance num: 2061 [Fri May 6 17:22:11 2016] POS Precision: 54560/57858 = 94.29984 POS Prec(OOV): 2339/2816 = 83.06108 ***** Iteration #23 ***** [Fri May 6 17:22:11 2016] instance num from train1: 46427 0 [Fri May 6 17:22:11 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:23:37 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:25:05 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:26:43 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 17:28:24 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 17:29:30 2016] Loss: 65651.33735 Improvement ratio: 0.28371 Feature L2-norm: 288.61323 Learning rate (eta): 0.02242 Total number of feature updates: 1067821.00000 save parameters to "./parameters.023.gz" .............................................. 
parameters::save [nnz=48797696/50823396] 0 [Fri May 6 17:30:06 2016] 1000 2000 instance num: 2061 [Fri May 6 17:30:08 2016] POS Precision: 54618/57858 = 94.40008 POS Prec(OOV): 2344/2816 = 83.23864 *** mbr decoding *** 0 [Fri May 6 17:30:08 2016] 1000 2000 instance num: 2061 [Fri May 6 17:30:11 2016] POS Precision: 54642/57858 = 94.44156 POS Prec(OOV): 2338/2816 = 83.02557 ***** Iteration #24 ***** [Fri May 6 17:30:11 2016] instance num from train1: 46427 0 [Fri May 6 17:30:11 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:31:44 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:33:16 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:34:49 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 17:36:21 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 17:37:25 2016] Loss: 64505.79448 Improvement ratio: 0.26564 Feature L2-norm: 293.17488 Learning rate (eta): 0.02232 Total number of feature updates: 1114248.00000 save parameters to "./parameters.024.gz" .............................................. 
parameters::save [nnz=48797696/50823396] 0 [Fri May 6 17:38:00 2016] 1000 2000 instance num: 2061 [Fri May 6 17:38:02 2016] POS Precision: 54602/57858 = 94.37243 POS Prec(OOV): 2340/2816 = 83.09659 *** mbr decoding *** 0 [Fri May 6 17:38:02 2016] 1000 2000 instance num: 2061 [Fri May 6 17:38:05 2016] POS Precision: 54629/57858 = 94.41910 POS Prec(OOV): 2346/2816 = 83.30966 ***** Iteration #25 ***** [Fri May 6 17:38:05 2016] instance num from train1: 46427 0 [Fri May 6 17:38:05 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:39:43 2016] 11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:41:25 2016] 21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:43:07 2016] 31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 17:44:49 2016] 41000 42000 43000 44000 45000 46000 instance num: 46427 [Fri May 6 17:45:56 2016] Loss: 63368.15101 Improvement ratio: 0.25000 Feature L2-norm: 297.60375 Learning rate (eta): 0.02222 Total number of feature updates: 1160675.00000 save parameters to "./parameters.025.gz" .............................................. 
parameters::save [nnz=48797696/50823396]
0 [Fri May 6 17:46:30 2016] 1000 2000
instance num: 2061 [Fri May 6 17:46:32 2016]
POS Precision: 54598/57858 = 94.36552
POS Prec(OOV): 2342/2816 = 83.16761
*** mbr decoding ***
0 [Fri May 6 17:46:32 2016] 1000 2000
instance num: 2061 [Fri May 6 17:46:35 2016]
POS Precision: 54623/57858 = 94.40872
POS Prec(OOV): 2341/2816 = 83.13210
***** Iteration #26 ***** [Fri May 6 17:46:35 2016]
instance num from train1: 46427
0 [Fri May 6 17:46:35 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:48:13 2016]
11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:49:55 2016]
21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:51:38 2016]
31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 17:53:20 2016]
41000 42000 43000 44000 45000 46000
instance num: 46427 [Fri May 6 17:54:24 2016]
Loss: 62384.91290
Improvement ratio: 0.23335
Feature L2-norm: 301.89808
Learning rate (eta): 0.02212
Total number of feature updates: 1207102.00000
save parameters to "./parameters.026.gz"
..............................................
parameters::save [nnz=48797696/50823396]
0 [Fri May 6 17:54:58 2016] 1000 2000
instance num: 2061 [Fri May 6 17:55:00 2016]
POS Precision: 54590/57858 = 94.35169
POS Prec(OOV): 2333/2816 = 82.84801
*** mbr decoding ***
0 [Fri May 6 17:55:00 2016] 1000 2000
instance num: 2061 [Fri May 6 17:55:03 2016]
POS Precision: 54621/57858 = 94.40527
POS Prec(OOV): 2331/2816 = 82.77699
***** Iteration #27 ***** [Fri May 6 17:55:03 2016]
instance num from train1: 46427
0 [Fri May 6 17:55:03 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 17:56:29 2016]
11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 17:57:57 2016]
21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 17:59:25 2016]
31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 18:00:52 2016]
41000 42000 43000 44000 45000 46000
instance num: 46427 [Fri May 6 18:01:49 2016]
Loss: 61391.22513
Improvement ratio: 0.22084
Feature L2-norm: 306.07591
Learning rate (eta): 0.02203
Total number of feature updates: 1253529.00000
save parameters to "./parameters.027.gz"
..............................................
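The "Total number of feature updates" counter grows by exactly the training-set size (46427 instances) each iteration, i.e. one SGD update per instance per pass, so after iteration k it equals k x 46427. A small consistency check against the figures printed in the blocks above (the function name is mine):

```python
TRAIN_INSTANCES = 46427  # "instance num from train1" in each iteration block

def total_updates(iteration):
    """Cumulative SGD updates after `iteration` full passes over the data."""
    return iteration * TRAIN_INSTANCES

# Values logged at iterations 24, 25, and 27 respectively:
for iteration, logged in [(24, 1114248), (25, 1160675), (27, 1253529)]:
    assert total_updates(iteration) == logged
```

This also explains why the learning rate (eta) decays smoothly across iterations: it is a function of this global update count, not of the iteration number alone.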
parameters::save [nnz=48797696/50823396]
0 [Fri May 6 18:02:23 2016] 1000 2000
instance num: 2061 [Fri May 6 18:02:25 2016]
POS Precision: 54612/57858 = 94.38971
POS Prec(OOV): 2347/2816 = 83.34517
*** mbr decoding ***
0 [Fri May 6 18:02:25 2016] 1000 2000
instance num: 2061 [Fri May 6 18:02:28 2016]
POS Precision: 54626/57858 = 94.41391
POS Prec(OOV): 2346/2816 = 83.30966
***** Iteration #28 ***** [Fri May 6 18:02:28 2016]
instance num from train1: 46427
0 [Fri May 6 18:02:28 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 18:03:55 2016]
11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 18:05:23 2016]
21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 18:06:51 2016]
31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 18:08:33 2016]
41000 42000 43000 44000 45000 46000
instance num: 46427 [Fri May 6 18:09:39 2016]
Loss: 60531.06121
Improvement ratio: 0.20636
Feature L2-norm: 310.14477
Learning rate (eta): 0.02193
Total number of feature updates: 1299956.00000
save parameters to "./parameters.028.gz"
..............................................
parameters::save [nnz=48797696/50823396]
0 [Fri May 6 18:10:13 2016] 1000 2000
instance num: 2061 [Fri May 6 18:10:15 2016]
POS Precision: 54579/57858 = 94.33268
POS Prec(OOV): 2345/2816 = 83.27415
*** mbr decoding ***
0 [Fri May 6 18:10:15 2016] 1000 2000
instance num: 2061 [Fri May 6 18:10:18 2016]
POS Precision: 54599/57858 = 94.36724
POS Prec(OOV): 2341/2816 = 83.13210
***** Iteration #29 ***** [Fri May 6 18:10:18 2016]
instance num from train1: 46427
0 [Fri May 6 18:10:18 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 18:11:45 2016]
11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 18:13:13 2016]
21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 18:14:42 2016]
31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 18:16:10 2016]
41000 42000 43000 44000 45000 46000
instance num: 46427 [Fri May 6 18:17:06 2016]
Loss: 59724.57244
Improvement ratio: 0.19343
Feature L2-norm: 314.10408
Learning rate (eta): 0.02183
Total number of feature updates: 1346383.00000
save parameters to "./parameters.029.gz"
..............................................
parameters::save [nnz=48797696/50823396]
0 [Fri May 6 18:17:42 2016] 1000 2000
instance num: 2061 [Fri May 6 18:17:44 2016]
POS Precision: 54565/57858 = 94.30848
POS Prec(OOV): 2342/2816 = 83.16761
*** mbr decoding ***
0 [Fri May 6 18:17:44 2016] 1000 2000
instance num: 2061 [Fri May 6 18:17:47 2016]
POS Precision: 54572/57858 = 94.32058
POS Prec(OOV): 2339/2816 = 83.06108
***** Iteration #30 ***** [Fri May 6 18:17:47 2016]
instance num from train1: 46427
0 [Fri May 6 18:17:47 2016] 1000 2000 3000 4000 5000 6000 7000 8000 9000 10000 [Fri May 6 18:19:15 2016]
11000 12000 13000 14000 15000 16000 17000 18000 19000 20000 [Fri May 6 18:20:43 2016]
21000 22000 23000 24000 25000 26000 27000 28000 29000 30000 [Fri May 6 18:22:12 2016]
31000 32000 33000 34000 35000 36000 37000 38000 39000 40000 [Fri May 6 18:23:40 2016]
41000 42000 43000 44000 45000 46000
instance num: 46427 [Fri May 6 18:24:37 2016]
Loss: 58908.25326
Improvement ratio: 0.18184
Feature L2-norm: 317.95624
Learning rate (eta): 0.02174
Total number of feature updates: 1392810.00000
save parameters to "./parameters.030.gz"
..............................................
parameters::save [nnz=48797696/50823396]
0 [Fri May 6 18:25:14 2016] 1000 2000
instance num: 2061 [Fri May 6 18:25:16 2016]
POS Precision: 54612/57858 = 94.38971
POS Prec(OOV): 2341/2816 = 83.13210
*** mbr decoding ***
0 [Fri May 6 18:25:16 2016] 1000 2000
instance num: 2061 [Fri May 6 18:25:19 2016]
POS Precision: 54625/57858 = 94.41218
POS Prec(OOV): 2343/2816 = 83.20312
SGD terminated with the maximum number of iterations
waiting for 20 thread(s) to exit
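The run ends with "SGD terminated with the maximum number of iterations": iter-num = 30 was reached before the loss flattened out enough to trigger convergence. The exact stopping rule is not shown in the log; the sketch below assumes a common L2SGD-style criterion in which an "Improvement ratio" (relative loss decrease over a trailing window) is compared against a small threshold, with the iteration cap as a fallback. All names, the window length, and the threshold are my assumptions, not taken from this tool.

```python
def improvement_ratio(losses, period=10):
    """Relative loss decrease over the last `period` iterations --
    one plausible reading of the log's "Improvement ratio" line."""
    if len(losses) <= period:
        period = len(losses) - 1
    return losses[-1 - period] / losses[-1] - 1.0

def stop_reason(losses, max_iter=30, epsilon=1e-6, period=10):
    """Return why training should stop after these per-iteration losses,
    or None to keep going. Assumed rule, with the iteration cap as fallback."""
    if len(losses) >= max_iter:
        return "maximum number of iterations"  # the branch taken in this run
    if len(losses) > period and improvement_ratio(losses, period) < epsilon:
        return "converged"
    return None
```

Here the ratio was still around 0.18 at iteration 30 (the dev precision had already plateaued near 94.4 several iterations earlier), so under any reasonable threshold the iteration cap fired first.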