Py-3rdparty-mediawiki

From BITPlan mediawiki-japi Wiki
 
= What is it =

Extended functionality for
# [https://www.mediawiki.org/wiki/Manual:Pywikibot/de pywikibot]
# mwclient
  
 
= Installation =

== via pip ==
<source lang='bash'>
pip install py-3rdparty-mediawiki
# alternatively if your pip is not a python3 pip
pip3 install py-3rdparty-mediawiki
</source>
 
=== upgrade ===
<source lang='bash'>
pip install py-3rdparty-mediawiki -U
# alternatively if your pip is not a python3 pip
pip3 install py-3rdparty-mediawiki -U
</source>
== Via Source code ==
<source lang='bash'>
git clone https://github.com/WolfgangFahl/py-3rdparty-mediawiki
./install
</source>
  
= wikipush / wikibackup / wikiedit / wikinuke / wikirestore / wikiquery / wikiupload / wikiuser scripts =

== Setup method ==
If you installed with the method above, the console scripts will have been added to your environment. You can check e.g.
<source lang='bash' highlight='1'>
which wikipush
/Users/wf/Library/Python/3.8/bin/wikipush
</source>
and there should be a wikipush script in your path.

== shared script ==
This script is the base script called "wikipush", which can be invoked under different names via hard links. This approach is deprecated and therefore incomplete as of 2020-12.
 
<source lang='bash'>
#!/bin/bash
# WF 2020-10-31
# wrapper for wikipush python
script=$(python -c "import os;import sys;print (os.path.realpath(sys.argv[1]))" $BASH_SOURCE)
scriptname=$(basename $script)
scriptdir=$(dirname $script)
base=$scriptdir/..
export PYTHONPATH="${PYTHONPATH}:$base"
case $scriptname in
   "wikipush")
      python -m wikibot.wikipush "$@"
      ;;
   "wikiedit")
      python -m wikibot.wikiedit "$@"
      ;;
   "wikiuser")
      python -m wikibot.wikiuser "$@"
      ;;
   "wikinuke")
      python -m wikibot.wikinuke "$@"
      ;;
   "wikiupload")
      python -m wikibot.wikiupload "$@"
      ;;
   *)
      echo "undefined wikipush script behavior:  $scriptname"
      ;;
esac
</source>
  
 
= WikiPush =
WikiPush allows copying pages from one wiki to another, including the images on the page.
To identify yourself you use the credential property files created with the wikiuser script (using python) or the Mediawiki-Japi {{Link|target=CommandLine}}.
== usage ==
<source lang='bash'>
wikipush -h
family and mylang are not set.
Defaulting to family='test' and mylang='test'.
usage: wikipush.py [-h] [-d] [-V] [-l] [-f] [-i] [-q QUERY] -s SOURCE -t
                   TARGET [-p PAGES [PAGES ...]]

Created on 2020-10-29

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  -l, --login           login to source wiki for access permission
  -f, --force           force to overwrite existing pages
  -i, --ignore          ignore upload warnings e.g. duplicate images
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  -s SOURCE, --source SOURCE
                        source wiki id
  -t TARGET, --target TARGET
                        target wiki id
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed
</source>
== Example ==
<source lang='bash' highlight='1'>
wikipush -s smw -t test2 -q "[[Category:City]]|limit=5"
family and mylang are not set.
Defaulting to family='test' and mylang='test'.
copying 4 pages from smw to test2
copying Demo:Tokyo ...✅
copying image File:SMW-Info-button.png ...✅
copying image File:Tokyo-Tsukishima-0011.jpg ...✅
copying Vienna ...✅
copying Warsaw ...✅
copying image File:6140285934 02e81b845f z.jpg ...✅
copying Demo:Würzburg ...✅
</source>
= wikiquery =
wikiquery allows sending SMW ask queries via the command line and getting the results in json or csv format. With the query division parameter the limits that SMW imposes on the maximum number of displayed results can be overcome. E.g. if you set
<pre>
$smwgQMaxInlineLimit=1500;
$smwgQMaxInlineLimitSets=1500;
$smwgQMaxLimit = 5000;
</pre>
you'll be able to get more than 1500/5000 results.
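The effect of the query division parameter ('''-qd''') can be illustrated with a small sketch: a numeric constraint such as a year range is split into equidistant subintervals, one ask query is issued per subinterval, and the partial results are concatenated. This is an illustrative sketch only, not the actual wikiquery implementation; the helper name and the way the range constraint is appended to the ask query are assumptions.

```python
def divide_query(ask_query: str, prop: str, start: int, end: int, divisions: int):
    """Sketch: split [start,end) on a numeric property into equidistant
    subintervals and emit one range-constrained ask query per subinterval."""
    step = (end - start) / divisions
    queries = []
    for i in range(divisions):
        lo = start + i * step
        hi = start + (i + 1) * step
        # append the range constraint in SMW ask syntax
        queries.append(f"{ask_query}[[{prop}::>{lo:g}]][[{prop}::<{hi:g}]]")
    return queries

queries = divide_query("[[IsA::Event]]", "start date", 2018, 2019, 4)
for q in queries:
    print(q)
```

Each subinterval stays below the wiki's result limit, so the union of the partial results can exceed it.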
== usage ==
<source lang='bash' highlight='1'>
wikiquery -h
usage: wikiquery [-h] [-d] [-V] [-l] -s SOURCE [--format FORMAT]
                 [--entityName ENTITYNAME] [--limit LIMIT] [--progress]
                 [-q QUERY] [--queryFile QUERYFILE] [-qf QUERYFIELD]
                 [-p PAGES [PAGES ...]] [-ui] [-qd QUERYDIVISION]

wikipush

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: False]
  -V, --version         show program's version number and exit
  -l, --login           login to source wiki for access permission
  -s SOURCE, --source SOURCE
                        source wiki id
  --format FORMAT       format to use for query result csv,json,xml,ttl or
                        wiki
  --entityName ENTITYNAME
                        name of the entites that are queried - only needed for
                        some output formats - default is 'data'
  --limit LIMIT         limit for query
  --progress            shows progress for query
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  --queryFile QUERYFILE
                        file the query should be read from
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed
  -ui, --withGUI        Pop up GUI for selection
  -qd QUERYDIVISION, --queryDivision QUERYDIVISION
                        divide query into equidistant subintervals to limit
                        the result size of the individual queries
</source>
== Examples ==
=== query1.ask ===
<source lang='bash'>
{{#ask: [[IsA::Event]][[Acronym::~ES*]][[start date::>2018]][[start date::<2019]]
| mainlabel=pageTitle
| ?Title = title
| ?Event in series = series
| ?ordinal=ordinal
| ?Homepage = homepage
| format=table
}}
</source>
 
=== csv ===
<source lang='bash' highlight='1'>
wikiquery -s or --queryFile query1.ask --format csv
pageTitle;title;series;ordinal;homepage
ESA 2018;26th Annual European Symposium on Algorithms;ESA;None;http://algo2018.hiit.fi/esa/
ESEC/FSE 2018;26th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE);ESEC/FSE;None;https://2018.fseconference.org/
ESOP 2018;27th European Symposium on Programming;ESOP;None;https://etaps.org/2018/esop
ESORICS 2018;23rd European Symposium on Research in Computer Security,;ESORICS;None;None
ESSCIRC 2018;44th European Solid-State Circuits Conference;ESSCIRC;None;None
ESWC 2018;15th European Semantic Web Symposium (ESWS);ESWC;None;http://2018.eswc-conferences.org/
</source>
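The csv output uses ';' as delimiter, so it can be post-processed e.g. with Python's standard csv module; a minimal sketch using two of the result rows above:

```python
import csv
import io

# two sample rows from the wikiquery csv output above
sample = """pageTitle;title;series;ordinal;homepage
ESA 2018;26th Annual European Symposium on Algorithms;ESA;None;http://algo2018.hiit.fi/esa/
ESOP 2018;27th European Symposium on Programming;ESOP;None;https://etaps.org/2018/esop
"""

# the first line is the header, so DictReader gives named access per row
reader = csv.DictReader(io.StringIO(sample), delimiter=';')
rows = list(reader)
for row in rows:
    print(row["pageTitle"], "->", row["homepage"])
```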
=== json ===
<source lang='bash'>
wikiquery -s or --queryFile query1.ask --format json
</source>
<source lang='json'>
{
   "data": [
      {
         "pageTitle": "ESA 2018",
         "title": "26th Annual European Symposium on Algorithms",
         "series": "ESA",
         "ordinal": null,
         "homepage": "http://algo2018.hiit.fi/esa/"
      },
      {
         "pageTitle": "ESEC/FSE 2018",
         "title": "26th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE)",
         "series": "ESEC/FSE",
         "ordinal": null,
         "homepage": "https://2018.fseconference.org/"
      },
      {
         "pageTitle": "ESOP 2018",
         "title": "27th European Symposium on Programming",
         "series": "ESOP",
         "ordinal": null,
         "homepage": "https://etaps.org/2018/esop"
      },
      {
         "pageTitle": "ESORICS 2018",
         "title": "23rd European Symposium on Research in Computer Security,",
         "series": "ESORICS",
         "ordinal": null,
         "homepage": null
      },
      {
         "pageTitle": "ESSCIRC 2018",
         "title": "44th European Solid-State Circuits Conference",
         "series": "ESSCIRC",
         "ordinal": null,
         "homepage": null
      },
      {
         "pageTitle": "ESWC 2018",
         "title": "15th European Semantic Web Symposium (ESWS)",
         "series": "ESWC",
         "ordinal": null,
         "homepage": "http://2018.eswc-conferences.org/"
      }
   ]
}
</source>
= wikibackup =
== usage ==
<source lang='bash'>
wikibackup -h
usage: wikibackup [-h] [-d] [-V] [-g] [-l] -s SOURCE [-wi]
                  [--backupPath BACKUPPATH] [--limit LIMIT] [--progress]
                  [-q QUERY] [--queryFile QUERYFILE] [-qf QUERYFIELD]
                  [-p PAGES [PAGES ...]] [-ui] [-qd QUERYDIVISION]

wikipush

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: False]
  -V, --version         show program's version number and exit
  -g, --git             use git for version control
  -l, --login           login to source wiki for access permission
  -s SOURCE, --source SOURCE
                        source wiki id
  -wi, --withImages     copy images on the given pages
  --backupPath BACKUPPATH
                        path where the backup should be stored
  --limit LIMIT         limit for query
  --progress            shows progress for query
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  --queryFile QUERYFILE
                        file the query should be read from
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed
  -ui, --withGUI        Pop up GUI for selection
  -qd QUERYDIVISION, --queryDivision QUERYDIVISION
                        divide query into equidistant subintervals to limit
                        the result size of the individual queries
</source>
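Conceptually a backup is just the wiki markup of each selected page stored as a file below the backupPath; a minimal sketch of that idea (the .wiki extension and the title-to-filename mapping are assumptions here, not necessarily what wikibackup does):

```python
from pathlib import Path
import tempfile

def backup_page(backup_path: str, title: str, wiki_markup: str) -> Path:
    """Sketch: store one page's wiki markup as a file below backup_path.
    The '<title>.wiki' naming is an assumption for illustration."""
    target = Path(backup_path) / f"{title.replace('/', '_')}.wiki"
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(wiki_markup, encoding="utf-8")
    return target

with tempfile.TemporaryDirectory() as tmp:
    saved = backup_page(tmp, "ESA 2018", "{{Event|Acronym=ESA 2018}}")
    print(saved.name)
```

With '''-g''' the real tool additionally puts the backup folder under git version control.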
= WikiNuke =
wikinuke.py allows mass deletion of pages.
== usage ==
<source lang='bash'>
usage: wikinuke.py [-h] [-d] [-V] [-f] [-q QUERY] [-qf QUERYFIELD] -t TARGET [-p PAGES [PAGES ...]]

Created on 2020-11-12

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  -f, --force           force to delete pages - default is 'dry' run only listing pages
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -t TARGET, --target TARGET
                        target wiki id
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed
</source>
== Example ==
The default behavior is a dry run that only lists whether the pages exist:
<source lang='bash' highlight='1'>
wikinuke -t test -p deleteMe1 deleteMe2 deleteMe3
deleting 3 pages in test (dry run)
1/3 (  33%): deleting deleteMe1 ...👍
2/3 (  67%): deleting deleteMe2 ...👍
3/3 ( 100%): deleting deleteMe3 ...👍
</source>
After checking you might want to (carefully) use the "-f" option to actually force the deletion:
<source lang='bash' highlight='1'>
wikinuke -t test -p deleteMe1 deleteMe2 deleteMe3 -f
deleting 3 pages in test (forced)
1/3 (  33%): deleting deleteMe1 ...✅
2/3 (  67%): deleting deleteMe2 ...✅
3/3 ( 100%): deleting deleteMe3 ...✅
</source>
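The dry-run-by-default behavior shown above can be sketched as follows (an illustrative sketch, not the actual wikinuke implementation):

```python
def nuke(pages, force=False):
    """Sketch of wikinuke's dry-run-by-default behavior: only delete
    when force is True, otherwise just report what would happen."""
    mode = "forced" if force else "dry run"
    log = [f"deleting {len(pages)} pages ({mode})"]
    for i, page in enumerate(pages, start=1):
        log.append(f"{i}/{len(pages)}: deleting {page} ...")
        if force:
            pass  # here the real tool would call the MediaWiki delete API
    return log

log = nuke(["deleteMe1", "deleteMe2"])
print("\n".join(log))
```

Making destructive operations opt-in via '''-f''' keeps an accidental invocation harmless.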
= WikiEdit =
wikiedit.py allows mass editing of pages using python regular expressions.
== usage ==
<source lang='bash'>
wikiedit -h
usage: wikiedit.py [-h] [-d] [-V] --search SEARCH --replace REPLACE [-f] [-q QUERY] [-qf QUERYFIELD] -t TARGET
                   [-p PAGES [PAGES ...]]

Created on 2020-11-12

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  --search SEARCH       search pattern
  --replace REPLACE     replace pattern
  -f, --force           force to edit pages - default is 'dry' run only listing pages
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -t TARGET, --target TARGET
                        target wiki id
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed
</source>
== example ==
<source lang='bash' highlight='1'>
wikiedit -t test -q "[[isA::CFP]]" --search "CALL FOR PAPER" --replace "CFP"
editing 1 pages in test (dry run)
1/1 ( 100%): editing CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ) ...👍 |isA=CFP
-|Acronym=CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ)
-|Title=CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ)
+|Acronym=CFP Journal: Advances in Multimedia - An International Journal (AMIJ)
+|Title=CFP Journal: Advances in Multimedia - An International Journal (AMIJ)
|Start date=2010/11/01
}}
-CALL FOR PAPER
+CFP
Journal: Advances in Multimedia - An International Journal (AMIJ)
</source>
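The mechanism behind the example above is a plain python regular expression substitution of the search pattern by the replace pattern; a minimal sketch using the page title from the example:

```python
import re

# page text fragment from the wikiedit example above
page_text = "CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ)"
search = "CALL FOR PAPER"
replace = "CFP"

# wikiedit-style edit: apply the search pattern as a regex substitution
edited = re.sub(search, replace, page_text)
print(edited)
```

Since the search argument is a regular expression, patterns like "CALL FOR PAPERS?" would also match plural variants.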
= wikirestore =
Tool to restore wiki pages from a local backup, created with wikibackup, to a destination wiki.

== Arguments ==
{| class="wikitable"
|-
!| Argument
!| Description
|-
| -s
| Source wiki - only used to query page names. The queried page names will then be looked up in the backup.
|-
| -t
| Target wiki - the backup is restored into this wiki
|-
| -q
| SMW query to select the pages to be restored. Note that the query is only used to select the page names; the actual content is then restored from the local backup.
|-
| -p
| Names of the pages to be restored
|-
| --backupPath
| define the location of the backup. Default is the default backup location of the target wiki.
|}
If argument '''-s''' is used a page query is executed, therefore all arguments related to a page query can be used, such as '''-ui''' and '''--limit'''.
== Examples ==
=== --backupPath ===
Use this argument to define a different backup folder.
====wikibackup====
<syntaxhighlight lang="shell" line='line'>
$ wikibackup -s orth --backupPath "/home/user/wikibackup/orth_copy" -q "[[isA::Event]]" --limit 10

downloading 10 pages from orth to /home/user/wikibackup/orth_copy
1/10 (  10%): downloading " DBKDA 2021" ...✅
2/10 (  20%): downloading "ENERGY 2021" ...✅
3/10 (  30%): downloading "ICAS 2021" ...✅
4/10 (  40%): downloading "ICNS 2021" ...✅
5/10 (  50%): downloading 2021 ICIMP ...✅
6/10 (  60%): downloading 3DUI 2020 ...✅
7/10 (  70%): downloading 3IA 2009 ...✅
8/10 (  80%): downloading 3PGIC 2010 ...✅
9/10 (  90%): downloading 4S4D 2017 ...✅
10/10 ( 100%): downloading 5GU 2017 ...✅
</syntaxhighlight>

====wikirestore====
<syntaxhighlight lang="shell">
$ wikirestore -t orth --backupPath "/home/user/wikibackup/orth_copy"

restoring 10 pages from /home/user/wikibackup/orth_copy to orth
1/10 (  10%): restore 2021 ICIMP ...✅
2/10 (  20%): restore "ICNS 2021" ...✅
3/10 (  30%): restore 3PGIC 2010 ...✅
4/10 (  40%): restore 4S4D 2017 ...✅
5/10 (  50%): restore "ENERGY 2021" ...✅
6/10 (  60%): restore 3DUI 2020 ...✅
7/10 (  70%): restore " DBKDA 2021" ...✅
8/10 (  80%): restore 3IA 2009 ...✅
9/10 (  90%): restore "ICAS 2021" ...✅
10/10 ( 100%): restore 5GU 2017 ...✅
</syntaxhighlight>
=== Scenario: Restore triangle ===
<syntaxhighlight lang="shell">
$ wikirestore -s or -q "[[isA::Event]]" -t orth --backupPath "/home/user/wikibackup/orth_copy"
</syntaxhighlight>
With this command we query all page names that are an Event from the wiki '''or''' and restore them in the wiki '''orth''' with the version of each page that is stored in '''/home/user/wikibackup/orth_copy'''.
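The page names obtained from the source wiki only serve as keys into the local backup; a minimal sketch of that lookup (assuming one <title>.wiki file per page, which is an assumption about the backup layout):

```python
from pathlib import Path
import tempfile

def restore_pages(backup_path: str, titles):
    """Sketch: for each queried title, read the page content from the
    local backup; titles without a backup file are skipped."""
    restored = {}
    for title in titles:
        f = Path(backup_path) / f"{title}.wiki"  # assumed backup layout
        if f.exists():
            restored[title] = f.read_text(encoding="utf-8")
    return restored

with tempfile.TemporaryDirectory() as tmp:
    # prepare one fake backup file, then look up two titles
    (Path(tmp) / "ESA 2018.wiki").write_text("{{Event}}", encoding="utf-8")
    restored = restore_pages(tmp, ["ESA 2018", "NoBackup 2021"])
    print(sorted(restored))
```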
= wikiupload =
wikiupload.py allows mass uploading of files.
== usage ==
<source lang='bash'>
wikiupload -h
usage: wikiupload.py [-h] [-d] [-V] --files FILES [FILES ...] [-f] -t TARGET

Created on 2020-11-12

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  --files FILES [FILES ...]
                        list of files to be uploaded
  -f, --force           force to (re)upload existing files - default is false
  -t TARGET, --target TARGET
                        target wiki id
</source>
== example ==
<source lang='bash' highlight='1'>
wikiupload -t test --files car.png
uploading 1 files to test
1/1 ( 100%): uploading car.png ...✅
</source>
= wikiuser =
wikiuser.py creates credential files and assigns a WikiId under which you can then operate. This simplifies access to your wiki.
The credential file is compatible with the Java Mediawiki-Japi, see {{Link|target=CommandLine#Credential_mode}}.
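A credential file is an ini-style property file; a minimal sketch of reading one with Python's configparser (the field names mirror the interactive prompts below, but the exact storage format - e.g. whether and how the password is protected - is an assumption here):

```python
import configparser
import io

# hypothetical credential property file content; field names mirror
# the interactive wikiuser prompts, the exact format is an assumption
sample = """[DEFAULT]
user=jd
email=john@doe.com
url=http://www.semantic-mediawiki.org
scriptPath=/w
wikiId=smw
version=Mediawiki 1.33
"""

config = configparser.ConfigParser()
config.read_string(sample)
creds = config["DEFAULT"]
print(creds["wikiId"], creds["url"])
```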
== usage ==
<source lang='bash'>
wikiuser -h
usage: wikiuser.py [-h] [-d] [-V] [-e EMAIL] [-f FILEPATH] [-l URL]
                   [-s SCRIPTPATH] [-p PASSWORD] [-u USER] [-v VERSION]
                   [-w WIKIID] [-y]

WikiUser credential handling

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  -e EMAIL, --email EMAIL
                        email of the user
  -f FILEPATH, --file FILEPATH
                        ini-file path
  -l URL, --url URL     url of the wiki
  -s SCRIPTPATH, --scriptPath SCRIPTPATH
                        script path
  -p PASSWORD, --password PASSWORD
                        password
  -u USER, --user USER  os user id
  -v VERSION, --wikiVersion VERSION
                        version of the wiki
  -w WIKIID, --wikiId WIKIID
                        wiki Id
  -y, --yes             immediately store without asking
</source>
=== Example ===
E.g. if you have an account on www.semantic-mediawiki.org you can start wikiuser in interactive mode.

<source lang='bash' highlight='1'>
wikiuser
email: john@doe.com
scriptPath: /w
user: jd
url: http://www.semantic-mediawiki.org
version: Mediawiki 1.33
wikiId: smw
password: *****
shall i store jd smw? yes/no y/ny
</source>

Now you can e.g. use "smw" as the wikiId for this wiki when using wikipush.
  
 
= Prerequisites =
You might want to prepare some credential ini files with the wikiuser script or the Mediawiki-Japi [[CommandLine]].

== user-config.py ==
 
# 'put_throttle' seconds.
put_throttle = 0
# avoid warnings ...
family='bitplan'
mylang='en'
</source>
The easiest way is to put it at $HOME/.pywikibot/user-config.py

Latest revision as of 10:17, 9 November 2021

Click here to comment

What is it

Extended functionality for

  1. pywikibot
  2. mwclient

Github

Installation

via pip

pip install py-3rdparty-mediawiki
# alternatively if your pip is not a python3 pip
pip3 install py-3rdparty-mediawiki

upgrade

pip install py-3rdparty-mediawiki -U
# alternatively if your pip is not a python3 pip
pip3 install py-3rdparty-mediawiki -U

Via Source code

git clone https://github.com/WolfgangFahl/py-3rdparty-mediawiki
./install

wikipush / wikibackup / wikiedit / wiknuke / wikirestore/ wikiquery / wikiupload / wikiuser scripts

Setup method

If you installed with the method above console_script will have been added to your environment. You can e.g. check

which wikipush
/Users/wf/Library/Python/3.8/bin/wikipush

and there should be a wikipush script in your path.

shared script

This script is the base script called "wikipush" which can be used using different names by hard linking as outlined above. This approach is deprecated and therefore incomplete as of 2020-12

#!/bin/bash
# WF 2020-10-31
# wrapper for wikipush python
script=$(python -c "import os;import sys;print (os.path.realpath(sys.argv[1]))" $BASH_SOURCE)
scriptname=$(basename $script)
scriptdir=$(dirname $script)
base=$scriptdir/..
export PYTHONPATH="${PYTHONPATH}:$base"
case $scriptname in
   "wikipush")
	python -m wikibot.wikipush "$@"
	;;
   "wikiedit")
   	python -m wikibot.wikiedit "$@"
   	;;
   "wikiuser")
	python -m wikibot.wikiuser "$@"
	;;
   "wikinuke")
   	python -m wikibot.wikinuke "$@"
   	;;
   "wikiupload")
   	python -m wikibot.wikiupload "$@"
   	;;
   *)
	echo "undefined wikipush script behavior:  $scriptname"
	;;
esac

WikiPush

WikiPush allows to copy pages from one wiki to another including the images on the page. To identify yourself you use the credential property files created with the wikiuser script (using python) or the Mediawiki-Japi CommandLine

usage

wikipush -h
family and mylang are not set.
Defaulting to family='test' and mylang='test'.
usage: wikipush.py [-h] [-d] [-V] [-l] [-f] [-i] [-q QUERY] -s SOURCE -t
                   TARGET [-p PAGES [PAGES ...]]

Created on 2020-10-29

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  -l, --login           login to source wiki for access permission
  -f, --force           force to overwrite existing pages
  -i, --ignore          ignore upload warnings e.g. duplicate images
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  -s SOURCE, --source SOURCE
                        source wiki id
  -t TARGET, --target TARGET
                        target wiki id
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed

Example

wikipush -s smw -t test2 -q "[[Category:City]]|limit=5"
family and mylang are not set.
Defaulting to family='test' and mylang='test'.
copying 4 pages from smw to test2
copying Demo:Tokyo ...✅
copying image File:SMW-Info-button.png ...✅
copying image File:Tokyo-Tsukishima-0011.jpg ...✅
copying Vienna ...✅
copying Warsaw ...✅
copying image File:6140285934 02e81b845f z.jpg ...✅
copying Demo:Würzburg ...✅

wikiquery

wikiquery allows to send SMW ask-query via commandline and get the results in json or csv format. With the query division parameter the limits of SMW for the maximum amount of displayed results can be overcome. E.g. if you set

$smwgQMaxInlineLimit=1500;
$smwgQMaxInlineLimitSets=1500;
$smwgQMaxLimit = 5000;

You'll be able to get more than 1500/5000 results.

usage

wikiquery -h
usage: wikiquery [-h] [-d] [-V] [-l] -s SOURCE [--format FORMAT]
                 [--entityName ENTITYNAME] [--limit LIMIT] [--progress]
                 [-q QUERY] [--queryFile QUERYFILE] [-qf QUERYFIELD]
                 [-p PAGES [PAGES ...]] [-ui] [-qd QUERYDIVISION]

wikipush

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: False]
  -V, --version         show program's version number and exit
  -l, --login           login to source wiki for access permission
  -s SOURCE, --source SOURCE
                        source wiki id
  --format FORMAT       format to use for query result csv,json,xml,ttl or
                        wiki
  --entityName ENTITYNAME
                        name of the entites that are queried - only needed for
                        some output formats - default is 'data'
  --limit LIMIT         limit for query
  --progress            shows progress for query
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  --queryFile QUERYFILE
                        file the query should be read from
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed
  -ui, --withGUI        Pop up GUI for selection
  -qd QUERYDIVISION, --queryDivision QUERYDIVISION
                        divide query into equidistant subintervals to limit
                        the result size of the individual queries

Examples

query1.ask

{{#ask: [[IsA::Event]][[Acronym::~ES*]][[start date::>2018]][[start date::<2019]] 
| mainlabel=pageTitle
| ?Title = title 
| ?Event in series = series 
| ?ordinal=ordinal 
| ?Homepage = homepage 
| format=table 
}}

csv

wikiquery -s or --queryFile query1.ask --format csv
pageTitle;title;series;ordinal;homepage
ESA 2018;26th Annual European Symposium on Algorithms;ESA;None;http://algo2018.hiit.fi/esa/
ESEC/FSE 2018;26th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE);ESEC/FSE;None;https://2018.fseconference.org/
ESOP 2018;27th European Symposium on Programming;ESOP;None;https://etaps.org/2018/esop
ESORICS 2018;23rd European Symposium on Research in Computer Security,;ESORICS;None;None
ESSCIRC 2018;44th European Solid-State Circuits Conference;ESSCIRC;None;None
ESWC 2018;15th European Semantic Web Symposium (ESWS);ESWC;None;http://2018.eswc-conferences.org/

json

wikiquery -s or --queryFile query1.ask --format json
{
   "data": [
      {
         "pageTitle": "ESA 2018",
         "title": "26th Annual European Symposium on Algorithms",
         "series": "ESA",
         "ordinal": null,
         "homepage": "http://algo2018.hiit.fi/esa/"
      },
      {
         "pageTitle": "ESEC/FSE 2018",
         "title": "26th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE)",
         "series": "ESEC/FSE",
         "ordinal": null,
         "homepage": "https://2018.fseconference.org/"
      },
      {
         "pageTitle": "ESOP 2018",
         "title": "27th European Symposium on Programming",
         "series": "ESOP",
         "ordinal": null,
         "homepage": "https://etaps.org/2018/esop"
      },
      {
         "pageTitle": "ESORICS 2018",
         "title": "23rd European Symposium on Research in Computer Security,",
         "series": "ESORICS",
         "ordinal": null,
         "homepage": null
      },
      {
         "pageTitle": "ESSCIRC 2018",
         "title": "44th European Solid-State Circuits Conference",
         "series": "ESSCIRC",
         "ordinal": null,
         "homepage": null
      },
      {
         "pageTitle": "ESWC 2018",
         "title": "15th European Semantic Web Symposium (ESWS)",
         "series": "ESWC",
         "ordinal": null,
         "homepage": "http://2018.eswc-conferences.org/"
      }
   ]
}
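Since the JSON format is machine readable, the wikiquery output can be post-processed directly. A minimal Python sketch, using a fragment of the output shown above (the filtering step is just an illustration, not part of wikiquery itself):

```python
import json

# a fragment of the wikiquery --format json output shown above
output = '''
{
   "data": [
      {"pageTitle": "ESA 2018", "series": "ESA", "homepage": "http://algo2018.hiit.fi/esa/"},
      {"pageTitle": "ESORICS 2018", "series": "ESORICS", "homepage": null}
   ]
}
'''
result = json.loads(output)
# collect the page titles of all records that have a homepage
withHomepage = [record["pageTitle"] for record in result["data"] if record["homepage"]]
print(withHomepage)  # → ['ESA 2018']
```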

wikibackup

usage

wikibackup -h
usage: wikibackup [-h] [-d] [-V] [-g] [-l] -s SOURCE [-wi]
                  [--backupPath BACKUPPATH] [--limit LIMIT] [--progress]
                  [-q QUERY] [--queryFile QUERYFILE] [-qf QUERYFIELD]
                  [-p PAGES [PAGES ...]] [-ui] [-qd QUERYDIVISION]

wikipush

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: False]
  -V, --version         show program's version number and exit
  -g, --git             use git for version control
  -l, --login           login to source wiki for access permission
  -s SOURCE, --source SOURCE
                        source wiki id
  -wi, --withImages     copy images on the given pages
  --backupPath BACKUPPATH
                        path where the backup should be stored
  --limit LIMIT         limit for query
  --progress            shows progress for query
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  --queryFile QUERYFILE
                        file the query should be read from
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed
  -ui, --withGUI        Pop up GUI for selection
  -qd QUERYDIVISION, --queryDivision QUERYDIVISION
                        divide query into equidistant subintervals to limit
                        the result size of the individual queries

WikiNuke

wikinuke.py allows mass deletion of pages

usage

usage: wikinuke.py [-h] [-d] [-V] [-f] [-q QUERY] [-qf QUERYFIELD] -t TARGET [-p PAGES [PAGES ...]]

Created on 2020-11-12

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  -f, --force           force to delete pages - default is 'dry' run only listing pages
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -t TARGET, --target TARGET
                        target wiki id
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed

Example

The default behavior is a dry run that only lists whether the pages exist:

wikinuke -t test -p deleteMe1 deleteMe2 deleteMe3
deleting 3 pages in test (dry run)
1/3 (  33%): deleting deleteMe1 ...👍
2/3 (  67%): deleting deleteMe2 ...👍
3/3 ( 100%): deleting deleteMe3 ...👍

After checking you might want to (carefully) use the "-f" option to actually force the deletion:

wikinuke -t test -p deleteMe1 deleteMe2 deleteMe3 -f
deleting 3 pages in test (forced)
1/3 (  33%): deleting deleteMe1 ...✅
2/3 (  67%): deleting deleteMe2 ...✅
3/3 ( 100%): deleting deleteMe3 ...✅
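The dry run/force workflow shown above follows a simple gating pattern; as an illustrative Python sketch (the function and message format are assumptions for illustration, not the actual wikinuke code):

```python
# sketch of a dry-run/force gate: the loop always reports,
# but only acts when force is True
def delete_pages(pages, force=False):
    mode = "forced" if force else "dry run"
    log = [f"deleting {len(pages)} pages ({mode})"]
    for i, page in enumerate(pages, start=1):
        log.append(f"{i}/{len(pages)}: deleting {page} ...")
        if force:
            # this is where the real page deletion call would go
            pass
    return log

for line in delete_pages(["deleteMe1", "deleteMe2", "deleteMe3"]):
    print(line)
```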

WikiEdit

wikiedit.py allows mass editing of pages using Python regular expressions

usage

wikiedit -h
usage: wikiedit.py [-h] [-d] [-V] --search SEARCH --replace REPLACE [-f] [-q QUERY] [-qf QUERYFIELD] -t TARGET
                   [-p PAGES [PAGES ...]]

Created on 2020-11-12

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  --search SEARCH       search pattern
  --replace REPLACE     replace pattern
  -f, --force           force to edit pages - default is 'dry' run only listing pages
  -q QUERY, --query QUERY
                        select pages with given SMW ask query
  -qf QUERYFIELD, --queryField QUERYFIELD
                        query result field which contains page
  -t TARGET, --target TARGET
                        target wiki id
  -p PAGES [PAGES ...], --pages PAGES [PAGES ...]
                        list of page Titles to be pushed

example

wikiedit -t test -q "[[isA::CFP]]"  --search "CALL FOR PAPER" --replace "CFP"
editing 1 pages in test (dry run)
1/1 ( 100%): editing CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ) ...👍 |isA=CFP
-|Acronym=CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ) 
-|Title=CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ) 
+|Acronym=CFP Journal: Advances in Multimedia - An International Journal (AMIJ) 
+|Title=CFP Journal: Advances in Multimedia - An International Journal (AMIJ) 
 |Start date=2010/11/01
 }}
-CALL FOR PAPER
+CFP
 Journal: Advances in Multimedia - An International Journal (AMIJ)
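wikiedit's search and replace is based on Python regular expressions; the substitution in the example above corresponds roughly to the following sketch (the markup line is illustrative, this is not the actual wikiedit code):

```python
import re

# hypothetical page markup resembling the example above
markup = "|Acronym=CALL FOR PAPER Journal: Advances in Multimedia - An International Journal (AMIJ)"
# --search and --replace are applied as a regular expression substitution
edited = re.sub("CALL FOR PAPER", "CFP", markup)
print(edited)  # → |Acronym=CFP Journal: Advances in Multimedia - An International Journal (AMIJ)
```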

wikirestore

Tool to restore wiki pages from a local backup, created with wikibackup, to a destination wiki.

Arguments

Argument Description
-s Source wiki; only used to query page names. The queried page names are then looked up in the backup.
-t Target wiki; the backup is restored into this wiki.
-q SMW ask query to select the pages to be restored. Note that the query is only used to select the page names; the page content is restored from the local backup.
-p Names of the pages to be restored.
--backupPath Defines the location of the backup. Default is the default backup location of the target wiki.

If the -s argument is used, a page query is executed; therefore all arguments related to a page query, such as -ui and --limit, can be used.

Examples

--backupPath

Use this argument to define a different backup folder

wikibackup

$ wikibackup -s orth --backupPath "/home/user/wikibackup/orth_copy" -q "[[isA::Event]]" --limit 10

downloading 10 pages from orth to /home/user/wikibackup/orth_copy
1/10 (  10%): downloading " DBKDA 2021" ...✅
2/10 (  20%): downloading "ENERGY 2021" ...✅
3/10 (  30%): downloading "ICAS 2021" ...✅
4/10 (  40%): downloading "ICNS 2021" ...✅
5/10 (  50%): downloading 2021 ICIMP ...✅
6/10 (  60%): downloading 3DUI 2020 ...✅
7/10 (  70%): downloading 3IA 2009 ...✅
8/10 (  80%): downloading 3PGIC 2010 ...✅
9/10 (  90%): downloading 4S4D 2017 ...✅
10/10 ( 100%): downloading 5GU 2017 ...✅

wikirestore

$ wikirestore -t orth --backupPath "/home/user/wikibackup/orth_copy"

restoring 10 pages from /home/user/wikibackup/orth_copy to orth
1/10 (  10%): restore 2021 ICIMP ...✅
2/10 (  20%): restore "ICNS 2021" ...✅
3/10 (  30%): restore 3PGIC 2010 ...✅
4/10 (  40%): restore 4S4D 2017 ...✅
5/10 (  50%): restore "ENERGY 2021" ...✅
6/10 (  60%): restore 3DUI 2020 ...✅
7/10 (  70%): restore " DBKDA 2021" ...✅
8/10 (  80%): restore 3IA 2009 ...✅
9/10 (  90%): restore "ICAS 2021" ...✅
10/10 ( 100%): restore 5GU 2017 ...✅

Scenario: Restore triangle

$ wikirestore -s or -q "[[isA::Event]]" -t orth --backupPath "/home/user/wikibackup/orth_copy"

With this command we query all page names that are an Event from the wiki "or" and restore them in the wiki "orth" with the version of each page that is stored in /home/user/wikibackup/orth_copy.

wikiupload

wikiupload.py allows mass uploading of files

usage

wikiupload -h
usage: wikiupload.py [-h] [-d] [-V] --files FILES [FILES ...] [-f] -t TARGET

Created on 2020-11-12

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  --files FILES [FILES ...]
                        list of files to be uploaded
  -f, --force           force to (re)upload existing files - default is false
  -t TARGET, --target TARGET
                        target wiki id

example

wikiupload -t test --files car.png
uploading 1 files to test
1/1 ( 100%): uploading car.png ...✅

wikiuser

wikiuser.py creates credential files and assigns a wikiId under which you can then operate. This simplifies access to your wiki. The credential file is compatible with the Java Mediawiki-Japi; see CommandLine#Credential_mode

usage

wikiuser -h
usage: wikiuser.py [-h] [-d] [-V] [-e EMAIL] [-f FILEPATH] [-l URL]
                   [-s SCRIPTPATH] [-p PASSWORD] [-u USER] [-v VERSION]
                   [-w WIKIID] [-y]

WikiUser credential handling

  Created by Wolfgang Fahl on 2020-10-31.
  Copyright 2020 Wolfgang Fahl. All rights reserved.

  Licensed under the Apache License 2.0
  http://www.apache.org/licenses/LICENSE-2.0

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           set debug level [default: None]
  -V, --version         show program's version number and exit
  -e EMAIL, --email EMAIL
                        email of the user
  -f FILEPATH, --file FILEPATH
                        ini-file path
  -l URL, --url URL     url of the wiki
  -s SCRIPTPATH, --scriptPath SCRIPTPATH
                        script path
  -p PASSWORD, --password PASSWORD
                        password
  -u USER, --user USER  os user id
  -v VERSION, --wikiVersion VERSION
                        version of the wiki
  -w WIKIID, --wikiId WIKIID
                        wiki Id
  -y, --yes             immediately store without asking

Example

E.g. if you have an account on www.semantic-mediawiki.org, you can start wikiuser in interactive mode:

wikiuser
email: john@doe.com
scriptPath: /w
user: jd
url: http://www.semantic-mediawiki.org
version: Mediawiki 1.33
wikiId: smw
password: *****
shall i store jd smw? yes/no y/n y

Now you can e.g. use "smw" as the wikiId for this wiki when using wikipush.

Prerequisites

You might want to prepare some credential ini files with the wikiuser script or Mediawiki-Japi CommandLine.
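Credential files are plain ini-style key/value files; a minimal sketch of reading one with Python's configparser (the section and key names here are illustrative assumptions mirroring the wikiuser prompts, not the exact Mediawiki-Japi format):

```python
import configparser

# hypothetical credential ini content; real files are created by
# wikiuser or the Mediawiki-Japi CommandLine and may be encrypted
ini = """
[DEFAULT]
url=http://www.semantic-mediawiki.org
scriptPath=/w
user=jd
wikiId=smw
"""
config = configparser.ConfigParser()
config.read_string(ini)
print(config["DEFAULT"]["wikiId"])  # → smw
```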

user-config.py

pywikibot expects a user-config.py file. The minimum recommended file for intranet use cases is:

# https://stackoverflow.com/a/60885381/1497139
# Slow down the robot such that it never makes a second page edit within
# 'put_throttle' seconds.
put_throttle = 0
# avoid warnings ...
family='bitplan'
mylang='en'

The easiest way is to put it at $HOME/.pywikibot/user-config.py.

Features

Encrypted credential handling

Py-3rdparty-mediawiki allows using pywikibot by simply giving each wiki an id and reusing the credential information created by MediaWiki-Japi. The needed family file is automatically created and registered. If you'd like a pure Python solution for credential handling, please file an issue on GitHub - it's no big deal, but I personally don't need it yet since I'm fine with the recently added CommandLine feature.

Semantic MediaWiki API support

see https://github.com/WolfgangFahl/py-3rdparty-mediawiki/issues/1

Example

from wikibot.wikibot import WikiBot

# get a bot for the wiki registered under the wikiId "test2"
wikibot = WikiBot.ofWikiId("test2")
# the underlying pywikibot site object is now available
site = wikibot.site