ALE clustering Gutenberg Children Books 2019-03-26 MWC=1

"Gutenberg Children Books" corpus, new "LG-E-noQuotes" dataset (GC_LGEnglish_noQuotes_fullyParsed.ull),
trash filter off: min_word_count = 1, max_sentence_length off; 2000/1000/500/50 clusters; Link Grammar 5.5.1.

This notebook is shared as static ALE-GCB-LG-E-noQuotes-MWC=1-2019-03-26.html.
Output data is shared via the ALE-GCB-LG-E-noQuotes-MWC=1-2019-03-26 directory.

Basic settings

In [1]:
import os, sys, time
module_path = os.path.abspath(os.path.join('..'))
if module_path not in sys.path: sys.path.append(module_path)
from src.grammar_learner.utl import UTC, test_stats
from src.grammar_learner.read_files import check_dir, check_corpus
from src.grammar_learner.write_files import list2file
from src.grammar_learner.widgets import html_table
from src.grammar_learner.pqa_table import table_rows, params, wide_rows
tmpath = module_path + '/tmp/'
check_dir(tmpath, True, 'none')
start = time.time()
runs = (1,1)
print(UTC(), ':: module_path:', module_path)
2019-03-26 08:16:34 UTC :: module_path: /home/obaskov/94/language-learning

Corpus test settings

In [2]:
corpus = 'GCB' # 'Gutenberg-Children-Books-Caps' 
dataset = 'LG-E-noQuotes'  # 'LG-E-clean'
kwargs = {
    # 'max_sentence_length' :   25  ,
    # 'max_unparsed_words'  :   0   ,
    'left_wall'     :   ''          ,
    'period'        :   False       ,
    'context'       :   1           ,
    'min_word_count':   1           ,
    'word_space'    :   'sparse'    ,
    'clustering'    :   ['agglomerative', 'ward'],
    'clustering_metric' : ['silhouette', 'cosine'],
    'cluster_range' :   2000        ,   # 2000/1000/500/50/20
    'top_level'     :   0.01        ,
    'grammar_rules' :   2           ,
    'max_disjuncts' :   1000000     ,   # off
    'stop_words'    :   []          ,
    'tmpath'        :   tmpath      ,
    'verbose'       :   'log+'      ,
    'template_path' :   'poc-turtle',
    'linkage_limit' :   1000        }
rp = module_path + '/data/' + corpus + '/LG-E-noQuotes/'
cp = rp  # corpus path = reference_path
runs = (1,1)
out_dir = module_path + '/output/' + 'ALE-GCB-LG-E-noQuotes-MWC=1-' + str(UTC())[:10]
if check_corpus(rp, 'min'): print(UTC(), '\n', out_dir)
2019-03-26 08:16:34 UTC 
 /home/obaskov/94/language-learning/output/ALE-GCB-LG-E-noQuotes-MWC=1-2019-03-26
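
For orientation only: the 'clustering': ['agglomerative', 'ward'] and 'clustering_metric': ['silhouette', 'cosine'] settings correspond to agglomerative (Ward-linkage) clustering of the word space with a cosine silhouette index. A minimal, self-contained sketch of that combination with scikit-learn follows; the matrix X and the cluster count are placeholders, not the learner's actual word space or code path.

# Illustrative sketch only -- not the grammar learner's implementation.
# X stands in for a hypothetical dense word-vector matrix (words x features).
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.random((5000, 100))                      # placeholder word vectors
ale = AgglomerativeClustering(n_clusters=500,    # cf. kwargs['cluster_range']
                              linkage='ward')    # Ward linkage uses Euclidean affinity
labels = ale.fit_predict(X)                      # one cluster label per word
si = silhouette_score(X, labels, metric='cosine')  # cf. ['silhouette', 'cosine']
print(round(si, 3))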

Tests: min_word_count = 1; 2000/1000/500/50 clusters

In [3]:
%%capture
table = []
kwargs['cluster_range'] = 2000
line = [['ALE2000', corpus, dataset, 0, 0, 'none']]
a, _, header, log, rules = wide_rows(line, out_dir, cp, rp, runs, **kwargs)
header[0] = 'Cell'
table.extend(a)
In [4]:
display(html_table([header] + a)); print(test_stats(log))
Cell | Corpus | Parsing | Space | Linkage | Affinity | G12n | Threshold | Rules | MWC | NN | SI | PA | PQ | F1 | Top 5 cluster sizes
ALE2000 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 2000 | 1 | --- | 0.0 | 63% | 60% | 0.66 | [3315, 601, 588, 521, 481]
Cleaned dictionary: 22641 words, grammar learn time: 02:29:01, grammar test time: 00:22:56
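
The %%capture cells that run the ALE1000, ALE500, and ALE50 tests (In [5], In [7], In [9] and onward) are not reproduced in this static export; presumably they repeat the In [3] pattern with a different cluster_range. A sketch of that pattern, using only names already defined above:

# Presumed pattern of the omitted %%capture cells: only the cell label
# and kwargs['cluster_range'] change between runs.
for n_clusters in (1000, 500, 50):
    kwargs['cluster_range'] = n_clusters
    line = [['ALE' + str(n_clusters), corpus, dataset, 0, 0, 'none']]
    a, _, header, log, rules = wide_rows(line, out_dir, cp, rp, runs, **kwargs)
    table.extend(a)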
In [6]:
display(html_table([header] + a)); print(test_stats(log))
Cell | Corpus | Parsing | Space | Linkage | Affinity | G12n | Threshold | Rules | MWC | NN | SI | PA | PQ | F1 | Top 5 cluster sizes
ALE1000 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 1000 | 1 | --- | 0.0 | 65% | 62% | 0.68 | [4151, 1007, 686, 653, 398]
Cleaned dictionary: 22641 words, grammar learn time: 02:25:52, grammar test time: 00:22:03
In [8]:
display(html_table([header] + a)); print(test_stats(log))
Cell | Corpus | Parsing | Space | Linkage | Affinity | G12n | Threshold | Rules | MWC | NN | SI | PA | PQ | F1 | Top 5 cluster sizes
ALE500 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 500 | 1 | --- | 0.0 | 68% | 63% | 0.69 | [5292, 1438, 823, 821, 649]
Cleaned dictionary: 22641 words, grammar learn time: 02:15:41, grammar test time: 00:26:47
In [12]:
display(html_table([header] + a)); print(test_stats(log))
Cell | Corpus | Parsing | Space | Linkage | Affinity | G12n | Threshold | Rules | MWC | NN | SI | PA | PQ | F1 | Top 5 cluster sizes
ALE50 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 50 | 1 | --- | 0.0 | 89% | 61% | 0.64 | [10050, 7610, 1683, 1370, 539]
Cleaned dictionary: 22641 words, grammar learn time: 02:09:50, grammar test time: 01:10:48

Save results

In [13]:
display(html_table([header] + table))
Cell | Corpus | Parsing | Space | Linkage | Affinity | G12n | Threshold | Rules | MWC | NN | SI | PA | PQ | F1 | Top 5 cluster sizes
ALE2000 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 2000 | 1 | --- | 0.0 | 63% | 60% | 0.66 | [3315, 601, 588, 521, 481]
ALE1000 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 1000 | 1 | --- | 0.0 | 65% | 62% | 0.68 | [4151, 1007, 686, 653, 398]
ALE500 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 500 | 1 | --- | 0.0 | 68% | 63% | 0.69 | [5292, 1438, 823, 821, 649]
ALE50 | GCB | LG-E-noQuotes | cALWEd | ward | euclidean | none | --- | 50 | 1 | --- | 0.0 | 89% | 61% | 0.64 | [10050, 7610, 1683, 1370, 539]
In [14]:
print(UTC(), ':: 2000/1000/500/50 finished, elapsed', str(round((time.time()-start)/3600.0, 1)), 'hours')
table_str = list2file(table, out_dir + '/all_tests_table.txt')
print('Results saved to', out_dir + '/all_tests_table.txt')
2019-03-26 19:59:38 UTC :: 2000/1000/500/50 finished, elapsed 11.7 hours
Results saved to /home/obaskov/94/language-learning/output/ALE-GCB-LG-E-noQuotes-MWC=1-2019-03-26/all_tests_table.txt
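
The saved table is plain text and can be reloaded later to compare runs; a minimal sketch, assuming list2file writes one row per line (the exact field delimiter is not shown in this notebook):

# Sketch only: reload the saved results table as raw lines.
path = out_dir + '/all_tests_table.txt'
with open(path) as f:
    rows = [ln.rstrip('\n') for ln in f]
print(len(rows), 'rows; first row:', rows[0])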