From 7bd6631e33b38fca6229ec950d3d902f9e15d003 Mon Sep 17 00:00:00 2001
From: Terry Truong
Date: Sat, 4 Feb 2023 22:28:53 +1100
Subject: Use requirements.txt for backend dependencies

---
 backend/requirements.txt   | 15 +++++++++++++++
 backend/tol_data/README.md |  7 -------
 2 files changed, 15 insertions(+), 7 deletions(-)
 create mode 100644 backend/requirements.txt
(limited to 'backend')

diff --git a/backend/requirements.txt b/backend/requirements.txt
new file mode 100644
index 0000000..b269027
--- /dev/null
+++ b/backend/requirements.txt
@@ -0,0 +1,15 @@
+# For encoding data to send from server
+jsonpickle==3.0.1
+
+# For parsing Wikipedia dumps
+mwxml==0.3.3
+mwparserfromhell==0.6.4
+
+# For parallelised bzip2 processing
+indexed-bzip2==1.4.0
+
+# For downloading data
+requests==2.28.2
+
+# For image processing
+Pillow==9.4.0
diff --git a/backend/tol_data/README.md b/backend/tol_data/README.md
index f1bbf6b..3830b19 100644
--- a/backend/tol_data/README.md
+++ b/backend/tol_data/README.md
@@ -56,13 +56,6 @@
 As a warning, the whole process takes a lot of time and file space.
 The tree will probably have about 2.6 million nodes.
 Downloading the images takes several days, and occupies over 200 GB.
-## Environment
-Some of the scripts use third-party packages:
-- `indexed_bzip2`: For parallelised bzip2 processing.
-- `requests`: For downloading data.
-- `Pillow`: For image processing.
-- `mwxml`, `mwparserfromhell`: For parsing Wikipedia dumps.
-
 ## Generate Tree Structure Data
 1. Obtain 'tree data files' in otol/, as specified in it's README.
 2. Run `gen_otol_data.py`, which creates data.db, and adds the `nodes` and `edges` tables,
--
cgit v1.2.3
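Not part of the commit itself, but as a usage sketch: after installing the pins above with `pip install -r backend/requirements.txt`, a small check can confirm the environment matches the file. The `check_pins` helper below is hypothetical (not from the repository) and assumes Python 3.8+ for `importlib.metadata`:

```python
# Hypothetical helper: compare installed package versions against
# "name==version" pins like those in backend/requirements.txt.
from importlib import metadata

def check_pins(lines):
    """Return {name: installed_version_or_None} for every pin that
    does not match the installed environment."""
    mismatches = {}
    for line in lines:
        line = line.strip()
        # Skip blank lines and comments, as pip does.
        if not line or line.startswith("#"):
            continue
        name, _, wanted = line.partition("==")
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            installed = None
        if installed != wanted:
            mismatches[name] = installed
    return mismatches

# A package that is not installed shows up with version None:
print(check_pins(["no-such-package-zz==1.0"]))  # {'no-such-package-zz': None}
```

In practice one would pass `open("backend/requirements.txt")` as the `lines` argument; comments and blank lines in the file are ignored, matching pip's own parsing of requirement files.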