From d70b96295d768aa0c80bf66639ad7a56bdef92a8 Mon Sep 17 00:00:00 2001
From: Terry Truong
Date: Sun, 2 Oct 2022 12:23:19 +1100
Subject: Add gen_pop_data.py

---
 backend/hist_data/README.md | 25 ++++++++++++++++++-------
 1 file changed, 18 insertions(+), 7 deletions(-)

diff --git a/backend/hist_data/README.md b/backend/hist_data/README.md
index c55549e..5b64462 100644
--- a/backend/hist_data/README.md
+++ b/backend/hist_data/README.md
@@ -18,27 +18,38 @@ This directory holds files used to generate the history database data.db.
   - If 3, same as 1, but 'end' and 'end_upper' are 'preferably Gregorian'.
     For example, Galileo Galilei's birth date appears 'preferably Julian', but his death date does not.
   - If 0, they denote a number of years CE (if positive) or BCE (if negative).
+- `pop`:
+	Format: `id INT PRIMARY KEY, pop INT`
+	Associates each event with a popularity measure (currently an average monthly viewcount)
 
 # Generating the Database
+## Environment
+Some of the scripts use third-party packages:
+- `jdcal`: For date conversion
+- `indexed_bzip2`: For parallelised bzip2 processing.
+- `mwxml`, `mwparserfromhell`: For parsing Wikipedia dumps.
+- `requests`: For downloading data.
+
 ## Generate Event Data
 1. Obtain a Wikidata JSON dump in wikidata/, as specified in it's README.
 1. Run `gen_events_data.py`, which creates `data.db`, and adds the `events` table.
-## Generate Description Data
-1. Obtain an enwiki dump in enwiki/, as specified in the README.
-1. In enwiki/, run `gen_dump_index.db.py`, which generates a database for indexing the dump.
-1. In enwiki/, run `gen_desc_data.py`, which extracts page descriptions into a database.
-1. Run
-
 ## Generate Popularity Data
 1. Obtain 'page view files' in enwiki/, as specified in it's README.
-1. Run
+1. Run `gen_pop_data.py`, which adds the `pop` table, using data in enwiki/ and the `events` table.
 
 ## Generate Image Data and Popularity Data
 1. In enwiki/, run `gen_img_data.py` which looks at pages in the dump that match entries in `events`, looks for infobox image names, and stores them in an image database.
+	Uses popularity data in enwiki/ to find the top N events in each event category.
 1. In enwiki/, run `download_img_license_info.py`, which downloads licensing info for found images, and adds them to the image database.
 1. In enwiki/, run `download_imgs.py`, which downloads images into enwiki/imgs/.
+1. Run
+
+## Generate Description Data
+1. Obtain an enwiki dump in enwiki/, as specified in the README.
+1. In enwiki/, run `gen_dump_index.db.py`, which generates a database for indexing the dump.
+1. In enwiki/, run `gen_desc_data.py`, which extracts page descriptions into a database.
 1. Run
--
cgit v1.2.3