Bug 23073 - wiki.koha-community.org needs updating to a later version
Summary: wiki.koha-community.org needs updating to a later version
Status: CLOSED FIXED
Alias: None
Product: Koha
Classification: Unclassified
Component: OPAC
Version: unspecified
Hardware: PC Windows
Importance: P2 normal
Assignee: Thomas Dukleth
QA Contact: Testopia
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2019-06-06 21:41 UTC by David Nind
Modified: 2023-12-28 20:43 UTC
CC List: 9 users

See Also:
Change sponsored?: ---
Patch complexity: ---
Documentation contact:
Documentation submission:
Text to go in the release notes:
Version(s) released in:



Description David Nind 2019-06-06 21:41:12 UTC
The version of MediaWiki used for https://wiki.koha-community.org needs updating.

Current version information: https://wiki.koha-community.org/wiki/Special:Version
Comment 1 David Nind 2019-06-06 21:45:58 UTC
See discussion on IRC:
- http://irc.koha-community.org/koha/2019-06-06#i_2149184 and
- http://irc.koha-community.org/koha/2019-06-06#i_2149465

A blocker for updating in the past seems to have been the CategoryTree extension and other extensions.
Comment 2 David Nind 2019-06-06 21:48:19 UTC
I am happy to help with testing, including doing a test upgrade locally to see what the issues are, and mapping out a plan for updating.

This would require a database dump and the site files.  I am not sure what the security implications of this are, as the database would include username and password information.
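A minimal sketch of one way to reduce that risk, assuming a throwaway copy of the database and that the names below match the real schema (MediaWiki on Postgres names its user table mwuser): blank out credentials in the copy before dumping it for testers.

# Hypothetical names throughout; verify against the actual schema first.
psql -U postgres -d mediawiki_copy <<'SQL'
UPDATE mediawiki.mwuser
   SET user_password = '',   -- drop password hashes
       user_token    = '',   -- drop session tokens
       user_email    = '';   -- drop email addresses
SQL
pg_dump -U postgres -Fc mediawiki_copy > koha-wiki-sanitized.dump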
Comment 3 Thomas Dukleth 2019-07-11 01:18:32 UTC
Perhaps the trend of consensus points to starting over with a new instance of MediaWiki or DokuWiki, which may simplify the problem, even if that choice would be an unfortunate consequence of my time for Koha having become more limited in recent years.

Updating the version of the Koha MediaWiki instance while retaining the current Postgres database implementation may prevent the best improvements to the Koha MediaWiki implementation, as too many extensions, scripts, etc. do not work properly with Postgres.  If MediaWiki is updated and new content is added, there may be no good way to migrate to MySQL from that point.  See https://wiki.koha-community.org/wiki/Proposal_for_Wiki_Curator_17.05_Thomas_Dukleth#Migrating_to_MySQL .

I only ever found one set of scripts for migrating a MediaWiki instance from Postgres to MySQL, http://www.winterrodeln.org/trac/wiki/MediaWikiPostgresqlToMysql . 

First migrating to MySQL and then updating MediaWiki would allow taking advantage of extensions such as Semantic MediaWiki, which do not work properly and are not supported under Postgres, even if it has been possible to install the extension at one point.  We could have a good system of faceted categories, which would be easier and more useful than the currently implemented, more hierarchical categories.  Those were modelled on the finding aid in the previous DokuWiki implementation, which only supported hierarchical namespaces in a browsable manner and thus lacked the flexibility provided by an extension such as Semantic MediaWiki.

Possibly breaking some installed extensions, which can happen with any update, is not a significant issue holding back upgrading but merely a reason to test updates before putting them into production.  The Koha MediaWiki instance did briefly break in the past when it was updated without testing; some testing and modification of installed extensions was required, although those extensions are not necessarily at issue going forward.  However, anything not core to the base software might lead to breakage in untested updates.

[While the details of issues relating to DokuWiki should be out of scope for this bug, it is worth noting the importance of not allowing an historical mistake in MediaWiki database choice, and the subsequent neglect of the Koha MediaWiki instance, to become a permanent mistake preventing the use of features dependent on MySQL.  In the practise of users on the previous Koha Wiki implemented in DokuWiki, most pages seemed to be frustratingly lost to anything but guessing query terms in a search.  People did not have the habit of tagging pages at all when creating them, and when people did tag them it was too often in a manner which did not aid findability.  DokuWiki also did not provide any easy means for a wiki maintainer to find untagged pages and then add a tag or tags for findability, although some people in the DokuWiki community have been raising the issue for a while.  See https://github.com/dokufreaks/plugin-tag/issues/77 and https://forum.dokuwiki.org/thread/9473 .]
Comment 4 Thomas Dukleth 2019-08-07 21:28:12 UTC
[Correcting my hurried mistaken reading, I had misread the units for the Koha wiki database dump.  It is only about 100 MB, not GB, irrespective of the verbosity of the dump format.  I had incorrectly reported the units in the 1 August Documentation meeting, https://meetings.koha-community.org/2019/documentation_irc_meeting_1_august_2019.2019-08-01-13.02.log.html .]

In preparing to test migrating the MediaWiki database from Postgres to MySQL, I am now reverting to Postgres 9.6 in Debian Squeeze LTS for migration testing, as postgres-common and postgresql-11 in Debian Buster currently have some bugs which we do not need to trip over or otherwise pollute the results with.  "pg_upgradecluster writes data_directory to postgresql.auto.conf, and gets confused by it on the next upgrade [DATA LOSS]", https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=931635 .  "ALTER TABLE statements causing "relation already exists" errors when some indexes exist", https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=932247 .  It is possible to work around those bugs in Debian Buster, but it seems preferable to avoid issues where things are less widely tested, and where staying closer to the old version with an old migration script may work better.

After migrating to MySQL, Debian Buster may be less problematic.  The MariaDB MySQL packages in Buster, for use after migrating to MySQL, are probably much more widely tested than the Postgres packages have apparently been.
Comment 5 Thomas Dukleth 2019-08-08 02:41:45 UTC
s/Squeeze LTS/Stretch LTS/g # above

[In my previous comment I mistakenly referred to Squeeze LTS when Stretch LTS was intended as would be evident from contextual clues.  I should have also included the numeric Debian version number 9 for better clarity.]
Comment 6 Thomas Dukleth 2019-08-14 12:59:39 UTC
I am preparing scripts to automate the conversion of the Koha MediaWiki database from Postgres to MySQL.

Multiple methods for reproducing the MediaWiki Postgres database from different forms of dump files are working well.  Using the full dump gives me more confidence, with the caveat that more preparation of database users and associated username databases is necessary to avoid errors.  Restoring the Postgres database from the separate SQL structure and data dump files requires constraint options such as --data-only --disable-triggers --superuser to avoid errors with pg_restore.
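A hedged example of that two-file restoration; the file names and database name are placeholders.  The structure is loaded first with psql, then pg_restore loads the data only, with triggers disabled, running as superuser.

psql -U postgres -d mediawiki -f mediawiki-structure.sql
pg_restore -U postgres -d mediawiki \
    --data-only --disable-triggers --superuser=postgres \
    mediawiki-data.dump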

Philipp Spitzer's pgsqltomysql.py is written in a very generic, abstracted manner and is easy to modify: a few changes for his database names and usernames, and adjustments for where, in Postgres, we do not follow the conventional form of the username being the same as the database name, http://www.winterrodeln.org/trac/browser/servermediawiki/trunk/pgsqltomysql/pgsqltomysql.py .

There are a few MediaWiki Postgres database tables which are problematic for migration to MySQL, where an obvious generic or MySQL-centric SQL structure file is not present.  It may be necessary to partly convert from the Postgres SQL structure dump.

Table anonymized_mwuser is owned by one particular database username which might be a legacy from the very first day of testing MediaWiki for Koha for which I started over the next day.  All other tables have a different common username for the owner.  I have not found any proper source for the anonymized_mwuser table.

I have not found an easy to use MySQL structure file for the Semantic MediaWiki tables, which have been essentially unused in consequence of Semantic MediaWiki not being well supported with Postgres.  Semantic MediaWiki was released with SMW_Postgres_Schema.sql but no corresponding MySQL-centric file, unlike other extensions which add SQL tables.  The MySQL Semantic MediaWiki tables seem to be built by the Semantic MediaWiki installation script, but that cannot be run on an empty database for the structure, and I have not found an obvious database structure component in the includes files in the code.

A workaround for preserving any existing data is to retrieve the structure from a MySQL SQL structure dump after installing Semantic MediaWiki on an otherwise empty MySQL based MediaWiki installation, being careful of course to match the old versions we have for successful database migration.  A similar workaround might be to temporarily drop the Semantic MediaWiki tables from the temporary Postgres database restoration; reinstall Semantic MediaWiki on the working MySQL conversion where no Semantic MediaWiki tables have been converted; dump the MySQL SQL structure; and then completely restart the database conversion using a newly restored temporary Postgres database and the newly obtained MySQL SQL structure dump.

We might instead choose not to preserve Semantic MediaWiki data, under the presumption that no significant work was done to make use of Semantic MediaWiki under Postgres, where it would not work well: drop the Semantic MediaWiki tables from database conversion; uninstall Semantic MediaWiki; and then start again by installing Semantic MediaWiki with no pre-existing Semantic MediaWiki data after upgrading MediaWiki to the current version in MySQL.  The non-preservation option might be best given the known limitations of Semantic MediaWiki under Postgres, the peculiar dichotomy in how Postgres is treated, and the potential importance of Semantic MediaWiki for a flexible faceted system for finding content.
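A sketch of the structure-only capture described above, with assumed database and user names: install the matching old Semantic MediaWiki version on an otherwise empty MySQL based MediaWiki instance, then dump just the structure of its tables, which share the smw_ prefix.

TABLES=$(mysql -N -e "SHOW TABLES LIKE 'smw\_%';" smw_scratch)
mysqldump --no-data smw_scratch $TABLES > smw-mysql-structure.sql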
Comment 7 Thomas Dukleth 2019-09-04 13:21:59 UTC
I successfully changed variables and added some additional conditions to avoid NULL values as necessary from the original form of the Postgres to MySQL conversion script, http://www.winterrodeln.org/trac/browser/servermediawiki/trunk/pgsqltomysql/pgsqltomysql.py .  I also added the Postgres schema for the wiki which is "mediawiki", the same as the database name in the case of the Koha Wiki, and not the "public" schema.

I should now be reporting that it takes X minutes to run a script to convert the wiki database on my test system; however, there is an error similar to what was found to be an SQL statement quotation or comma separation problem, addressed at https://stackoverflow.com/questions/41475309/psycopg-error-column-does-not-exist and https://stackoverflow.com/questions/41804213/psycopg2-programmingerror-column-your-name-does-not-exist .

I am printing the generated SQL statements to try to trace the problem.  I may try using the generated SQL statements directly, with the suggested quotation, to avoid problems where the SQL statements as stored in a variable elude whatever quotation may be needed.
Comment 8 Thomas Dukleth 2019-09-05 12:37:57 UTC
From printing variables, I seem to have confirmed that the problem with the Postgres to MySQL conversion script, http://www.winterrodeln.org/trac/browser/servermediawiki/trunk/pgsqltomysql/pgsqltomysql.py , is a failure to use proper quotation of the data and/or the SQL statement as a whole.  It had been evident at a glance that quoting had originally been considered for column names.  Nothing seems to be magically quoting the data, so the originator of the script presumably had some well trusted content.

Consequently, I seem to need to escape any quotes within the column data; quote the data from each column; possibly avoid quoting numeric values to avoid a data type problem; and possibly quote the entire SQL statement.
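A minimal sketch of the safer alternative to hand-built quoting, using parameterized queries so that the MySQLdb driver escapes and quotes each value itself; the connection details, table, and columns here are illustrative only, not the real migration code.

import MySQLdb

conn = MySQLdb.connect(db="mediawiki", user="wikiuser",
                       passwd="secret", charset="utf8")
cur = conn.cursor()

row = (42, "A title with 'quotes' in it")
# %s placeholders are filled by the driver with properly escaped and
# quoted values; numeric values are passed through without quoting.
cur.execute("INSERT INTO page (page_id, page_title) VALUES (%s, %s)", row)
conn.commit()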
Comment 9 Thomas Dukleth 2019-09-11 20:53:32 UTC
Database migration is working without any Python or MySQL errors in my current revisions to Philipp Spitzer's Postgres to MySQL conversion script, http://www.winterrodeln.org/trac/browser/servermediawiki/trunk/pgsqltomysql/pgsqltomysql.py .  There are still some 'data truncated' warnings for some columns, which I am working through to correct for issues such as whether default values should be NULL, '', or 0.

The "psycopg2.ProgrammingError: column Host does not exist" error for non-existent column Host when retrieving database values from Python was not related to any lack of quotation for SQL statements in the original script from Philipp Spitzer.  Lack of quotation may be a concern especially for string values in the input data which I have endeavoured to correct to the extent permitted by the data structure required by the Python module MySQLdb.  However, if the data values could be trusted, avoiding string quotation had avoided some "data truncated" warnings.

A lack of database constraint, which I subsequently added, had led the SQL statement which collects column names for each wiki database table, as reported in information_schema.COLUMNS, to mistakenly collect columns from a MySQL system user table when collecting column names for the wiki user table.  Evidently, at the time the script was originally written, a MySQL system user table was not an issue, or the database constraint in the MySQLdb connection was sufficient.
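A sketch of the constrained column-name query: without the TABLE_SCHEMA condition, a wiki table named user also matches the MySQL system user table.  The schema name here is an assumption.

SELECT COLUMN_NAME
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'mediawiki'
  AND TABLE_NAME = 'user'
ORDER BY ORDINAL_POSITION;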
Comment 10 Thomas Dukleth 2019-10-03 18:50:16 UTC
Database migration from MySQL now functions with not only no errors but also with no warnings.

I have resolved all 'data truncated' warnings for various columns.  In some cases, setting a default value appropriate to a particular MySQL column as NULL, 0, or '' was sufficient.  In a few cases, I have altered the MySQL column sizes to accommodate a larger data size than the defaults set by MediaWiki for MySQL.  The difference which has allowed Postgres columns to hold larger data than the size defined for the same column in MySQL is that the Postgres columns at issue have type text, without any size constraint, while the same column in MySQL often has a small size constraint.

No longer having 'data truncated' warnings is very good but a little more checking should be done.

For the cases of default value issues, it will be better to recheck my particular choices in the problematic columns for whether NULL, 0, or '' correctly align with the default values set for the columns in MySQL.  Despite now avoiding all warnings, I initially took a trial and error approach to solve as much as I could in a sprint without sleep shortly after eliminating all errors, and it is possible that a choice or choices might not be quite correctly aligned; at least, after my course of jury duty, I cannot remember perfectly well how thoroughly I had checked beforehand.

For the cases of Postgres having a larger data size than defined for the column in MySQL, all the alterations I made to the matching MySQL column size limits were made by trial and error and should at least be more closely aligned with what is actually necessary.  Unlike the earlier errors, the 'data truncated' warnings do not give helpful information to identify offending rows.  Particular MediaWiki implementations might need modification of the column size defaults to accommodate actual use, as Wikipedia has in many cases, but the modifications should not be unnecessary: it is possible that some particular rows have buggy data from a code bug or a buggy user action.

The columns for which I altered the maximum size may not generally have great consequence, but there is one key column or key-like column affected: categorylinks.cl_sortkey, used for sorting titles properly and efficiently within a category.  The cause of exceeding the size limit could be how we use the wiki, with some informative verbose titles; however, it may be that any extra long page titles are an accident, or traces left over from spammers where the content has presumably been deleted but traces are left in the database.  Running updateCollation.php to recompute sort keys for all pages would allow setting the column size back to VARCHAR(70) without an associated sorting problem.
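For reference, recomputing the sort keys is a standard MediaWiki maintenance run from the wiki root; the --force option recomputes keys even when the collation appears unchanged.

php maintenance/updateCollation.php --force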

In two other cases, the column size limits which I found to work are well more than an order of magnitude greater than the column size defined for MediaWiki.

I will try to query for rows which have excessive data size in the particular columns if I can find an efficient method to do so.  It may be helpful to use some combination of aligning the column size better with the actual data and/or choosing to truncate some column data, without triggering any warning, where it is obvious that excessive data was created by some obvious accident, such as leaning on the keyboard without intent.
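A sketch of one such query, here checking cl_sortkey against its default VARCHAR(70); the threshold and columns would vary per table.

SELECT cl_from, cl_to, LENGTH(cl_sortkey) AS len
FROM categorylinks
WHERE LENGTH(cl_sortkey) > 70
ORDER BY len DESC;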
Comment 11 Thomas Dukleth 2019-10-03 19:00:20 UTC
The first sentence of my previous comment should have read as follows.

Database migration from Postgres to MySQL now functions with not only no errors but also with no warnings.
Comment 12 David Nind 2019-10-08 23:11:37 UTC
That's awesome! Thanks Thomas for all your work on this.
Comment 13 Thomas Dukleth 2019-10-10 17:16:08 UTC
I have successfully tested for the following changes in converting the Koha Wiki database from Postgres to MySQL.  Postgres MediaWiki defaults had text types without any size constraint while the same columns in MySQL had strong default size constraints.

We have the example of actual use in Postgres where it is evident that there are some circumstances where perfectly reasonable and correct use exceeds the MySQL size constraints and there seems to be no problem with altering the column size in MySQL to accommodate legitimate existing data use.  At the same time, it seems appropriate to truncate the data for the few accidents where someone pasted wiki page content mistakenly into the comments field.

Some cases where the column size default is too small in MySQL but the data is automatically generated or essentially automatically generated indicate that the MySQL column size constraint was not well planned.

In one case, where the comment in a comment column is legitimate but exceeds the MySQL column size by a few characters which can be truncated without loss of understanding, it seems appropriate to truncate the column for the one row which goes over the size limit.

In the SQL statements below the comment "Original" immediately precedes an SQL statement for the original structure of the column.  If the original structure statement is not commented out it is restored, otherwise the next SQL statement after comments provides the structure for the column with an increased size.

For the three columns with a size increase from the default, it will be necessary in MediaWiki upgrades to check whether the size may be decreased by the upgrade if that would ever happen.
     
-- Original
-- ALTER TABLE `account_credentials` MODIFY COLUMN `acd_comment` varchar(255) NOT NULL DEFAULT '';
-- Allow for sufficiently greater than largest legitimate found data
-- length of 472.  An explanatory greeting message to a user about a
-- login problem seems necessary and appropriate
ALTER TABLE `account_credentials` MODIFY COLUMN `acd_comment` varchar(511) NOT NULL DEFAULT '';

-- Original
ALTER TABLE `archive` MODIFY COLUMN `ar_comment` tinyblob NOT NULL;
-- Truncated in database migration Python script for greater than largest
-- length for TINYBLOB, 255 bytes including one extra byte for data size.
--
-- Wrong field use mistake with illegitimate found data length 5446.
-- Column truncation affecting 3 rows.

-- Original
-- ALTER TABLE `categorylinks` MODIFY COLUMN `cl_sortkey` varchar(70) CHARACTER SET utf8mb4 COLLATE utf8mb4_bin NOT NULL DEFAULT '';
-- Allow for sufficiently greater than largest legitimate found data
-- length of 165.  Column data is automatically generated from page
-- titles and used as a key for page title sort order which would
-- obviously not work properly if shorter than the longest page title.
-- Template titles are the longest but there are many others over the
-- default.
ALTER TABLE `categorylinks` MODIFY COLUMN `cl_sortkey` varchar(200) CHARACTER SET utf8mb4 COLLATE utf8mb4_bin NOT NULL DEFAULT '';

-- Original
ALTER TABLE `filearchive` MODIFY COLUMN `fa_description` tinyblob;
-- Truncated in database migration Python script for greater than largest
-- length for default size TINYBLOB, 255 bytes including one extra byte
-- for data size.
--
-- Wrong field use mistake with illegitimate found data length 5447.
-- Column truncation affecting 3 rows.

-- Original
-- ALTER TABLE `logging` MODIFY COLUMN `log_comment` varchar(255) NOT NULL DEFAULT '';
-- Truncated in database migration Python script for sufficiently greater
-- than largest legitimate found data length of 278.  Many essentially
-- automatically filled uses which slightly exceed 255 characters.
--
-- An alternative to increasing the size of the column to accommodate
-- legitimate use could be changing some essentially automatically filled
-- text patterns in the data to some more abbreviated form.
--
-- Wrong field use mistake with illegitimate found data length 5447.
-- Column truncation affecting 3 rows.
ALTER TABLE `logging` MODIFY COLUMN `log_comment` varchar(300) NOT NULL DEFAULT '';

-- Original
ALTER TABLE `revision` MODIFY COLUMN `rev_comment` tinyblob NOT NULL;
-- Truncated in database migration Python script for greater than the
-- largest length for default size TINYBLOB, 255 bytes including one
-- extra byte for data size.
-- 
-- Largest legitimate found data length of 259.
--
-- Column truncation affecting 1 row cutting off the end of a word but not
-- affecting understandability of comment.  At the end of the comment the
-- word "breadcrumbs."  would be truncated as "breadcr" truncating four
-- characters and the full stop at the end of the sentence.
Comment 14 Thomas Dukleth 2019-11-12 23:46:47 UTC
There is much to report about MediaWiki encoding data for MySQL in binary form, even in current versions of MediaWiki, to compensate for the historic lack of proper UTF8 support in MySQL.  I have some half-started reports from weeks ago about how I needed to build a Debian old version time machine to create the MediaWiki binary tables properly for the version of MediaWiki from which we are upgrading.  I will try to finish those reports when I am more awake.

Meanwhile, I have been pressing on with automating the whole process and I should report something before falling asleep again. 

The following is the successful output from a script which I wrote to manage the steps of database migration with a friendly configuration file, much error checking, etc.  There are still a few elements to add, such as the initial pg_restore step and reading the configuration file from the Python script which does the data migration, as distinct from the Perl script which manages most of the steps preparing for data migration and checks the results.

"If you have reached this point without errors and you
subsequently confirm that no errors or warnings have been logged,
then the Koha MediaWiki database should have been successfully
migrated.If you have reached this point without errors and you
subsequently confirm that no errors or warnings have been logged,
then the Koha MediaWiki database should have been successfully
migrated.

Remember that before any use MediaWiki and installed extensions
must be migrated to a version supported by your PHP version andand MySQL version.

Remember that before any use, MediaWiki and installed extensions
must be upgraded to a version supported by your current PHP and
MySQL versions."

Depending on how awake I am later, I will test upgrading the test installation of MediaWiki and extensions to their current versions.
Comment 15 Thomas Dukleth 2020-02-06 22:44:04 UTC
As is well known, databases such as MySQL and Postgres do not have any magical diff-friendly dump with granularity appropriate for an automated comparison, one which could find any actual problem without being overwhelmed by noise from necessary differences between the databases, where the essential data is the same for the relevant columns but there are database specific differences such as an extra indexing column, etc.  Similar to what I had done for some part of debugging the Python database migration script, I am working on an API based automated select of every column and every row, writing out the row-equivalent data from both databases so that actual data is compared with appropriate granularity and not whole rows.  Such an approach should help to automatically check that the migrated data is complete and to spot any unlikely but possible issues, such as with the encoding of names from CJK languages which some users have.  Encoding would never be a problem for true UTF8 in Postgres, but the MediaWiki solution to the historic 3 byte limit of the MySQL utf8 character set encoding is to use binary data types for MySQL columns.

[Encoding differences have been well addressed by insertion to MediaWiki supported binary data type columns for use in MySQL some time ago as will be explained retrospectively in a more detailed comment complete with references to an old version Debian time machine allowing the MySQL database to be created correctly for where we landed with MediaWiki when apprehending the database choice mistake.]

The issue here is about an automated test for data comparison in which data has been read from both the originating Postgres database and the migrated MySQL database, then granularly saved in full UTF8 for diff comparison.
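A rough sketch of that granular comparison, with hypothetical connection details and an illustrative table: read the same rows from both databases, normalise every value to UTF8 text, and write one line per column so that diff points at individual values rather than whole rows.

import io
import MySQLdb
import psycopg2

def dump_rows(cursor, query, path):
    # One line per column value, normalised to UTF8 text.
    cursor.execute(query)
    with io.open(path, "w", encoding="utf-8") as out:
        for row in cursor.fetchall():
            for value in row:
                if isinstance(value, bytes):  # MySQL binary columns
                    value = value.decode("utf-8")
                out.write(u"%s\n" % (value,))

query = "SELECT page_id, page_title FROM page ORDER BY page_id"

pg = psycopg2.connect(dbname="mediawiki", user="wikiuser").cursor()
dump_rows(pg, query, "page.pg.txt")

my = MySQLdb.connect(db="mediawiki", user="wikiuser", passwd="secret")
dump_rows(my.cursor(), query, "page.my.txt")

# Then compare: diff page.pg.txt page.my.txt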
Comment 16 Thomas Dukleth 2021-04-08 15:24:01 UTC
Some not quite complete code to test migrating the database and upgrading the wiki, in an overly large 475 MiB archive with some significant file redundancy, is currently available at https://test01.agogme.com/koha_migrate_mwiki_db_and_upgrade_test.tgz .  I will reduce redundancy in the course of updated versions.  Anyone interested in testing would need to separately obtain a dump of the Koha Wiki to avoid publicly compromising Koha Wiki usernames and passwords.
Comment 17 Thomas Dukleth 2021-05-05 15:43:51 UTC
A much improved version of the code and Koha Wiki files to test migrating the database and upgrading the wiki, in a now much more manageable 123 MiB archive, including automated support for copying wiki files to configured locations which eliminates the redundancy of the previous version, is currently available at https://test01.agogme.com/koha_migrate_mwiki_db_and_upgrade_test.tgz .  The migration and update code is still a work in progress for having every detail both correct and known to be correct.  Migration to a proper MediaWiki supported MySQL binary character encoding database runs without errors or warnings, as it had at the end of 2019, but with some minor 2020 improvements and major 2021 improvements in managing database column types.  The wiki upgrade code from 2021 works, but the upgrade instance needs some evident configuration, etc. improvements to take advantage of features in the current MediaWiki version, whether outdated content is maintained as a static archive or, as I am preparing to test, as a dynamic archive with mass tagging of outdated pages in the same wiki along with current pages, with a distinct search query page for accessing old content.

All changes from the previously posted version are improvements to automation and sharing in a useful manner.  Improvements for the resulting migrated upgrade are forthcoming after some more changes improving automation of the installation.

Anyone interested in testing would need to separately obtain a full dump of the Koha Wiki to avoid publicly compromising Koha Wiki usernames and passwords, which are not contained in the published archive.  Similarly, as I should have mentioned previously, the issue of resetting wgSecretKey in LocalSettings.php could be avoided by obtaining a copy of the Koha Wiki LocalSettings.php .
Comment 18 David Nind 2021-05-13 11:30:33 UTC
Thanks Thomas for your continued work on this!
Comment 19 Yolanda Marcos 2021-05-18 13:37:14 UTC
Hi all,

We have updated to version 20.11 of Koha and in one of our programs the items (holding) are not displayed from the "MARC view" of the OPAC, however we can see the items from the "Normal view". Can someone tell us why it may be?

Thanks in advance

Best regards
Comment 20 David Nind 2021-05-18 19:44:35 UTC
Hi Yolanda.

I don't think your query relates to this bug, which is about updating the version of the software used for wiki.koha-community.org.

I don't know the answer to your query, but others on the general mailing list (https://koha-community.org/support/koha-mailing-lists/), or IRC (https://koha-community.org/get-involved/irc/) may be able to help. 

It does sound odd...

David
Comment 21 Thomas Dukleth 2021-06-02 13:22:54 UTC
Some important bug fixes, and some improvements with support for excluding old archived content from simple search, are included in this version of the code and Koha Wiki files to test migrating the database and upgrading the wiki, currently available at https://test01.agogme.com/koha_migrate_mwiki_db_and_upgrade_test.tgz .  More changes improving automation of the installation and validation are needed.

The changes support a dynamic archive, with mass tagging of outdated pages in the same wiki which also holds current pages, and with a distinct search query page for accessing old content.

Anyone interested in testing would need to separately obtain a full dump of the Koha Wiki to avoid publicly compromising Koha Wiki usernames and passwords, which are not contained in the published archive.  Similarly, as I should have mentioned previously, the issue of resetting wgSecretKey in LocalSettings.php could be avoided by obtaining a copy of the Koha Wiki LocalSettings.php .

From the current changelog.

In README.txt
-------------

Specified required system packages somewhat more completely.

Added RESTBase section.

In apt configuration example files
----------------------------------

Added support for MediaWiki VisualEditor extension which relies upon the
MediaWiki RESTBase API service which needs nodejs version 6+ and uses
Apache Cassandra.  nodejs 6+ needs the stretch-backports repository
for installation in Debian Stretch.  No successful RESTBase configuration
found for Apache and thus may require forthcoming NGINX configuration.

In webserver configuration example files
----------------------------------

Added support for MediaWiki VisualEditor extension which relies upon the
MediaWiki RESTBase API service which needs nodejs version 6+ and uses
Apache Cassandra.  nodejs 6+ needs the stretch-backports repository
for installation in Debian Stretch.  No successful RESTBase configuration
found for Apache and thus may require forthcoming NGINX configuration.

In koha_migrate_mwiki_db_and_upgrade.pl
---------------------------------------

Uncommented a few lines for apt update and downloading files which had
been temporarily commented out for testing to avoid requesting
network resources too frequently with a rapid testing cycle.

Specified required system packages somewhat more completely.

Fixed escaping characters for setting Debian version in special PHP
apt repository.

Added use of more variables to control file locations.

Excluded Semantic MediaWiki tables from the database dump used in building
the particular MySQL MediaWiki database for upgrading to the current
MediaWiki version thus discarding populated data before upgrade to save
unnecessary extra work.  Semantic MediaWiki had never functioned properly
in Postgres and failed command line tests.  Attempting to properly
preserve unused Semantic MediaWiki data would require additional work to
first upgrade to MediaWiki 1.24 before upgrading to the current version
with no actual benefit as we have not been using the broken extension in
Postgres.  In any case, the data is recreated with reinstallation of
Semantic MediaWiki.

Added support for MediaWiki VisualEditor extension which relies upon the
MediaWiki RESTBase API service which needs nodejs version 6+ and uses
Apache Cassandra.  nodejs 6+ needs the stretch-backports repository
for installation in Debian Stretch.

koha_mediawiki_pgsqltomysql.py
------------------------------

Fixed treatment of NULL values which had left some NULLs not converting
properly and led to bad strftime conversion for columns with NULL values
in database migration from Postgres to MySQL.

koha_mwiki_instances_postinstall.sh
-----------------------------------

Added installation of Semantic MediaWiki extension.

Added specification of Vector skin default.

Added MassEditRegex extension.

Added installation of MessageBox templates and needed modules.

SearchBox.mustache
-----------------------------------

Added JavaScript to exclude Content_Old category from simple search box
queries with Vector skin allowing archived content to be excluded from
searches.
Comment 22 Thomas Dukleth 2021-10-20 12:53:34 UTC
I am currently correcting some minor bugs revealed when replacing archive files, now distinguishing between version controlled files and binary files or files otherwise inappropriate for version control.  Previous testing, which reran script changes without replacing the files excluded from version control, had missed such bugs.  The issue reinforces my previous conclusion, reached over several iterations of testing version control file inclusion, that more of the unmodified non-binary files in our existing installation archive are inappropriate for inclusion in version control.
Comment 23 Thomas Dukleth 2022-02-02 11:29:59 UTC
Unmodified text files which are not part of the code used for MediaWiki database migration and updating are now included in the "binary" [non-version controlled] files archive, allowing version controlled code for database migration and wiki updating not to be lost in a larger set of unrelated files.  Such unmodified text files, along with the binary files and symlinks previously excluded from version control as inappropriate for version control, are now included in https://test01.agogme.com/koha_migrate_mwiki_db_and_upgrade_test.tgz .  [Symlinks are mostly built by the migration and update code but we have also started with some created for the legacy wiki.]

A link to the version control repository with the unmodified text files excluded will be rebuilt and posted after another run of tests.
Comment 24 Thomas Dukleth 2022-06-22 09:22:26 UTC
An important bug fix for installing a MediaWiki compliant stable version of PHP Composer was completed a couple of months ago, and there are finally enough incremental changes for excluding files inappropriate for version control through present and past commits.  Carefully examining past versions for uniformity of version control exclusion was tedious, but there should be no significant distractions from a file disappearing as outside the scope of version control and then reappearing in a later version where exclusion was missed.  Files excluded as inappropriate for version control are currently available at https://test01.agogme.com/koha_migrate_mwiki_db_and_upgrade_test.tgz .  Code and configuration files to test migrating the MediaWiki database and upgrading the wiki are currently available at git://test01.agogme.com/koha-migrate-mwiki-db-and-upgrade-test.git .

Anyone interested in testing the code would need to separately obtain a full dump of the Koha Wiki to avoid publicly compromising Koha Wiki usernames and passwords which are not contained in the published archive.  The issue of resetting wgSecretKey in LocalSettings.php could be avoided by obtaining a copy of the Koha Wiki LocalSettings.php .  I have prepared a special nonpublic tar archive for Koha developers interested in testing. 

Mason James has been working on code to automatically validate the database migration.

Test instances have been mostly up for over a year.  They may go down or become slow when regenerated for code testing, or when excess RAM or CPU use comes from memory intensive processes unrelated to Koha on the VPS.  Non-Koha use of the VPS starting too many processes still occasionally leads the OOM killer to stop mysqld, for which I should add a monitoring and restart service for when I forget to check after such excess RAM usage.
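A minimal sketch of such a restart service, assuming a systemd managed mysql unit (the unit name may differ): a drop-in override telling systemd to restart mysqld after the OOM killer stops it.

# systemctl edit mysql   -- then add:
[Service]
Restart=on-failure
RestartSec=10s
# followed by: systemctl daemon-reload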

Reinstalled test instance of the Koha Wiki using Postgres at https://koha-mw-pg-test01.agogme.com/ .

Koha test wiki migrated to MySQL at https://koha-mw-my-test01.agogme.com/ .

Koha MySQL test wiki upgraded to MediaWiki 1.35 LTS at https://koha-mw-my-test01-upgr.agogme.com/ .

Please note that some adjustment of the upgraded wiki, such as the superseded language change template at the bottom of the home page, may be outside the ordinary scope of an automated upgrade process and require manual correction.  The job queue may be full for an extended period after reinstalling or upgrading a test wiki instance because reindexing is necessary.  Ordering of search query result set improves with actual use but appears to be ordered by page creation initially after reindexing rather than page modification or other potentially more useful relevancy ranking which develops over time as the wiki instance is used.

From the current changelog.
In README.txt
-------------

Clarified language and instruction variously.

Added an explanation in installation instructions about obtaining files
from a current git repository, an archive of files inappropriate for
version control, database dump, etc.

Added guidance in installation instructions to clarify what files
currently need to be copied manually and what are examples needing
modification to previous guidance about what is installed automatically.

Added clarification for webserver configuration including testing Apache
and Nginx at the same time with different webserver ports.

Added the importance of starting the Perl script
koha_migrate_mwiki_db_and_upgrade.pl from the shell script
koha_migrate_mwiki_db_and_upgrade_startup.sh to provide better management
and needed logging.

Added a note to the configuration section noting that each of the main
scripts has minimal internal configuration variables for finding the main
configuration and other scripts.

Clarified function of MediaWiki RESTBase in relation to the VisualEditor
extension.  VisualEditor works fine without any of the subtle difficulties
of configuring RESTBase to respond when proxied through the webserver.
The core extension Parsoid is sufficient to run VisualEditor with no
installation or configuration needed.  The only disadvantage without
RESTBase is that diffs may have some extra whitespace or such for
imperfect diffs if switching back and forth between the VisualEditor and
WikiText editors before saving.  The difficulties of configuring RESTBase
seem to elude many people.

Added a section for configuration sharing security giving guidance
and a script for substituting example values for security sensitive values
in some configuration files when committing to public source code
repositories.

In koha_migrate_mwiki_db_and_upgrade_startup.sh
---------------------------------------

Added bash script to control startup of migration and installation to log
STDOUT and STDERR.

In koha_migrate_mwiki_db_and_upgrade.pl
---------------------------------------

Corrected for temporary file inadvertently created when creating download
directory for upgraded MediaWiki installation archive.

Corrections for moving binary files and other files not suitable for
version control to a separate archive from version control files.

Improved elapsed time reporting.

Standardised date and time reporting.

In LocalSettings.php
--------------------

Commented out RESTBase configuration which has subtle configuration
difficulties and substituted Parsoid configuration without RESTBase as
sufficient for VisualEditor extension.

Added localhost specification for page editing as the editor runs on
localhost.  Configuring page editing to specify the IP address from
which the editor runs corrects for a problem otherwise preventing
editing of spam protected pages with editing restrictions such as we have
on the home page to prevent newly created accounts from immediately
editing the home page.

SearchBox.mustache
------------------

Changed the category excluded from the SimpleSearch form for dynamically
archived content to Obsolete which has already been in use.

koha_mwiki_instances_postinstall.sh
-----------------------------------

Corrected installation of PHP composer to function more resiliently in a
script and set version to a stable point which avoids errors with
MediaWiki.

In etc/apache2
--------------

Changed listen port numbers for example Apache 2 configuration files to
allow testing along with testing Nginx.

In etc/nginx
------------

Example nginx configuration files.

In koha_migrate_mwiki_db_and_upgrade_FOREXPORT.sh
---------------------------------------

Added bash script to remove nonpublic elements from config files
substituting public configuration examples for each of the nonpublic
values in the following configuration line.  The preceding line with
a public configuration example starts with "# FOREXPORT # ".
Comment 25 Victor Grousset/tuxayo 2022-06-23 02:16:34 UTC
Issue found.

1. https://koha-mw-my-test01-upgr.agogme.com/wiki/Community_Facebook
2. click on a image
3. File not found.
Comment 26 Victor Grousset/tuxayo 2022-06-23 02:18:55 UTC
Why do we need «-"[[Category:Obsolete]]" AND» added automatically on searches again?

I already heard about that a long time ago but forgot.
Comment 27 Thomas Dukleth 2022-07-06 07:17:29 UTC
The intent of adding an automatic exclusion for obsolete pages from the simple search box with -"[[Category:Obsolete]]" is for the simplest searches to exclude old obsolete pages, so that they do not confuse users or appear inappropriately in the most basic search result set.  Old obsolete content pages can still be found using the advanced search, including explicitly by searching for "[[Category:Obsolete]]" without the exclusion operator prefix; the wiki thus acts as an archive for such content.

Old obsolete pages are noted with a prominent notice via a template if they are encountered.  Such pages should be updated if they can be, but are otherwise available to consult most importantly for valuable information they often contain which is not yet present in current pages.

There should be a more elegant method to address the issue but migrating the database and upgrading has priority.
Comment 28 Thomas Dukleth 2022-07-06 07:43:42 UTC
I am investigating why the full view of at least some image files, in the form of /wiki/File:some_file, is not displaying, even for the direct restore of the Koha MediaWiki Postgres instance.  The images are actually all present.  There may be a missing dependency or a permission problem.
Comment 29 Thomas Dukleth 2022-07-06 08:09:50 UTC
Preserving version control changes after changing to the convention for hyphens or dashes instead of underscores in program directories and filenames has required replacing the Git archive with filenames in the history adjusted for dashes.  Otherwise, moving files loses version control.

Code and configuration files to test migrating the MediaWiki database and upgrading the wiki are currently available at the same location which was originally created with hyphens but previously contained directories and filenames with underscores, git://test01.agogme.com/koha-migrate-mwiki-db-and-upgrade-test.git .  Files excluded as inappropriate for version control are currently available with hyphens having replaced underscores at https://test01.agogme.com/koha-migrate-mwiki-db-and-upgrade-test.tgz .

Anyone interested in testing the code would need to separately obtain a full dump of the Koha Wiki to avoid publicly compromising Koha Wiki usernames and passwords which are not contained in the published archive.  The issue of resetting wgSecretKey in LocalSettings.php could be avoided by obtaining a copy of the Koha Wiki LocalSettings.php .  I have prepared a special nonpublic tar archive for Koha developers interested in testing. 

Test instances were down for some extended periods in the past few days while I was fixing bugs and regenerating the test instances.  Please note as before that newly reinstalled MediaWiki instances take time for indexing etc. to catch up over multiple days, and indexing requires actual use to weight the result set properly.  They may go down or become slow when regenerated for code testing, or when excess RAM or CPU use comes from memory intensive processes unrelated to Koha on the VPS.  Non-Koha use of the VPS starting too many processes still occasionally leads the OOM killer to stop mysqld, for which I should add a monitoring and restart service for when I forget to check after such excess RAM usage.

Reinstalled test instance of the Koha Wiki using Postgres at https://koha-mw-pg-test01.agogme.com/ .

Koha test wiki migrated to MySQL at https://koha-mw-my-test01.agogme.com/ .

Koha MySQL test wiki upgraded to MediaWiki 1.35 LTS at https://koha-mw-my-test01-upgr.agogme.com/ .

Please note that some adjustment of the upgraded wiki, such as the superseded language change template at the bottom of the home page, may be outside the ordinary scope of an automated upgrade process and require manual correction.  The job queue may be full for an extended period after reinstalling or upgrading a test wiki instance because reindexing is necessary.  Ordering of search query result set improves with actual use but appears to be ordered by page creation initially after reindexing rather than page modification or other potentially more useful relevancy ranking which develops over time as the wiki instance is used.

From the current changelog.

In README.txt
-------------

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

Clarified language and instruction variously.

In koha_migrate_mwiki_db_and_upgrade.ini
---------------------------------------

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

Corrected location of debugging output logs for
koha-mediawiki-pgsqltomysql.py .

Added operating system version configuration for improving installation.

In koha-migrate-mwiki-db-and-upgrade-startup.sh
-----------------------------------------------

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

Decreased verbosity of bash startup script when calling without
arguments for which koha-migrate-mwiki-db-and-upgrade.pl responds with a
usage reminder leaving response from the bash startup script unnecessary.

In koha-migrate-mwiki-db-and-upgrade.pl
---------------------------------------

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

Corrected for temporary file inadvertently created when creating download
directory for upgraded MediaWiki installation archive.

Corrected for setting permissions of pgpass file too early for newly
created pgpass file which led to failure of pg_restore.

Added removal of pgpass file after use by pg_restore.

Corrected typo for binary in mysql command option
--default-character-set .

Corrections for excessive copying of files from the production archive to
the MediaWiki upgrade test instance when scope of binary files and other
files not suitable for version control had been moved to a separate
archive from version control files.

Added additional section heading comments for demarcating more code
sections for better readability of code and logs.

Added operating system version configuration for improving installation.

Removed redundant code section previously included with the intention
of changing the section for additional MediaWiki testing instances.

In koha-mediawiki-pgsqltomysql.py
---------------------------------

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

Corrected location of debugging logs.

In koha-mwiki-instances-postinstall.sh
--------------------------------------

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

Added calls to runJobs.php.

Clarified section heading comments demarcating code
sections for better readability of code and logs.

In koha-migrate-mwiki-db_and-upgrade-forexport.sh
-------------------------------------------------

Substituted lowercase in filename for compliance with common
conventions.

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

In get-mwiki-mbox-templates.sh
------------------------------

Changed use of underscores to hyphens for file and directory names
for compliance with common conventions.

Added explanatory header.

------------------------------------------

Pending Bug Fixes.

Full View of Saved Files Not Displaying.

I am investigating why the full view of at least some image files in the form of /wiki/File:some_file are not displaying for even the direct restore of the Koha MediaWiki Postgres instance.

PHP LTS on sury.org Only Supports Debian Stable and Old Stable.

Debian PHP maintainer, Ondřej Surý, has been continually supplying PHP LTS updates for PHP 5.6 and PHP 7.4 as well as other versions of PHP with ongoing patches which have been helpful when testing database migration and upgrades using Debian 9.  Sometime in the past two weeks, Debian 9 support was dropped on sury.org .  The sury.org packages for Debian 9 are still available at archive.org and other possible archives and continue to run on my Debian 9 test server.  The sury.org packages for Debian 11 stable and 10 old stable continue to be available.  Debian 9 had been especially good for testing because it was much less likely to have too many changes in database versions or other dependencies which might break an old version of MediaWiki which we have stuck in Postgres which first needs conversion to MySQL. 

As with any project, MediaWiki uses stable dependencies and takes time to adopt new versions of dependencies which may introduce breaking changes.  The current Koha Wiki, using MediaWiki 1.16 and mistakenly using Postgres, was designed for PHP 5.3 but seems to be managing fine with PHP 5.6.  MediaWiki 1.35 LTS uses PHP 7.3 or 7.4.  No version of MediaWiki yet uses PHP 8.  The main issue is the utility of having PHP 5.6 on a system to help verify the correctness of database migration before upgrading MediaWiki.

There are several workarounds.  Please keep in mind that moving the Koha wiki to Debian 10 or 11 at least after upgrading to MediaWiki 1.35 LTS is presumed in any case.

The installations of PHP 5.6 and PHP 7.4 continue to work on my test instance of Debian 9 but cannot be as easily replicated on another instance of Debian 9.  An alternative archive might be used in the place of sury.org .  Nevertheless, my test instance of Debian 9 could be used for both MediaWiki database migration from Postgres to MySQL and MediaWiki upgrade to 1.35 LTS if really necessary. 

The best option is probably the following:  Debian 8 with PHP 5.6 will work for MediaWiki 1.16 database migration from Postgres to MySQL and may even be better than Debian 9 for database migration.  Debian 10 or 11 might be used for the upgrade to MediaWiki 1.35 LTS from a migrated MySQL database dump and associated files from the Koha MediaWiki instance.  Ondřej Surý continues to provide PHP LTS support for the current Debian stable 11 and Debian old stable 10.  The scripts which currently run database migration and upgrade sequentially would need at least functional division to support upgrading on a different server from the database migration.  In the past few days, I have improved section headings for easier log reading and to help more easily identify code sections which would need at least a functional distinction between migration and upgrade tasks if running distinctly on different virtual servers with different Debian versions.

An easier option might still work for both migration and upgrade.  Running PHP 5.6 and 7.4 for everything starting on Debian 10 might work, but the risk increases of the Postgres or MySQL versions in Debian 10 being too far out of sync with MediaWiki 1.16 for database migration to work correctly, especially considering the post-database-migration MediaWiki maintenance scripts required to have working instances of MediaWiki, even if those are merely used to assure working MediaWiki instances before upgrading to MediaWiki 1.35 LTS.

Ondřej Surý provides commercial support with Freexian for PHP LTS releases for Debian 8 and 9 but that is probably not a proper option for Koha.  See https://php.freexian.com/ .

We could always compile PHP versions from source as needed.

pg_restore ftell Warning.

I had not remembered well my previous investigation into the ftell warning when running pg_restore.  The database restore completes and appears to be correct.

PGPASSFILE=$pgpass_file_path pg_restore -h localhost -p 5432 -U postgres -w -d database_name < $pgdump_file_path

"pg_restore: [custom archiver] WARNING: ftell mismatch with expected position --ftell used"

The pg_restore ftell warning may come from an incorrect presumption of data and dump size when the dump file is piped.  The warning is likely to be a false alarm.

The workaround of specifying the dump file for pg_restore with the -f argument gives an error when also specifying the database to which to restore everything.  The database restore fails.

PGPASSFILE=$pgpass_file_path pg_restore -h localhost -p 5432 -U postgres -w -d database_name -f $pgdump_file_path

"pg_restore: options -d/--dbname and -f/--file cannot be used together"

Not specifying the restore database has complicated restores in long past testing, partly because we initially made the mistake of not following the database convention of the database user having the same name as the database, and even changed the database user to be distinct from the database name.  That can lead to tricky issues in restore where the database has two users with imperfectly overlapping permissions which need to be specified with excess care.

There may not be a completely satisfactory workaround for the warning which may merely be a false alarm.  Some test might be devised to verify that the issue is a false alarm.
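For what it is worth, in pg_restore the -f/--file option names an output file, which is why it conflicts with -d.  Passing the dump as a positional argument avoids both the pipe, which appears to trigger the ftell presumption, and the -f conflict:

PGPASSFILE=$pgpass_file_path pg_restore -h localhost -p 5432 \
    -U postgres -w -d database_name "$pgdump_file_path"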

Maintenance Job Control.

I added some support for running MediaWiki maintenance/runJobs.php to the post install shell script for MediaWiki test instances, but I need to add control for maintenance/runJobs.php as an additional, separately recurring process.
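A hedged sketch of one recurring arrangement via cron; the MediaWiki path is a placeholder, and --maxjobs bounds each run so overlapping invocations stay short.

# /etc/cron.d/mediawiki-runjobs
*/10 * * * * www-data /usr/bin/php /var/www/mediawiki/maintenance/runJobs.php --maxjobs 100 --quiet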
Comment 30 Victor Grousset/tuxayo 2022-07-10 01:14:45 UTC
(In reply to Thomas Dukleth from comment #27)
> The intent of adding an automatic exclusion for obsolete pages from the
> simple search box with -"[[Category:Obsolete]]" is for the simplest searches
> to exclude old obsolete pages so as not to confuse users or be found
> inappropriately in the most basic search result set.  Old obsolete content
> pages can still be found using the advanced search including found
> explicitly by searching for "[[Category:Obsolete]]" without the exclusion
> operator prefix and thus acts as an archive for such content.
> 
> Old obsolete pages are noted with a prominent notice via a template if they
> are encountered.  Such pages should be updated if they can be, but are
> otherwise available to consult most importantly for valuable information
> they often contain which is not yet present in current pages.

Ok it's just a bonus feature set up on the same occasion as the upgrade, not related to it.
Comment 31 Thomas Dukleth 2022-08-24 21:32:24 UTC
Using Debian 8 for database migration from Postgres to MySQL avoids the warnings and errors from pg_restore, which is better than having some potential unidentified problem.  The MySQL wiki dump is copied from Debian 8 to Debian 10 or 11.  Debian 10 has been tested for the wiki upgrade using the MySQL wiki dump from Debian 8, and the upgrade should also work with Debian 11.

A test run of the Canasta MediaWiki Docker image, for fairly full use of MediaWiki, is pending.  If the test run with Canasta works, we should upgrade production on the next available day.

Code and configuration files to test migrating the MediaWiki database and upgrading the wiki are currently available at the same location, git://test01.agogme.com/koha-migrate-mwiki-db-and-upgrade-test.git .  Files excluded as inappropriate for version control are currently available at the same location, https://test01.agogme.com/koha-migrate-mwiki-db-and-upgrade-test.tgz .

Anyone interested in testing the code would need to separately obtain a full dump of the Koha Wiki to avoid publicly compromising Koha Wiki usernames and passwords which are not contained in the published archive.  The issue of resetting wgSecretKey in LocalSettings.php could be avoided by obtaining a copy of the Koha Wiki LocalSettings.php .  I have prepared a special nonpublic tar archive for Koha developers interested in testing.

Please note as before that newly reinstalled MediaWiki instances take time over multiple days for indexing etc. to catch up, and indexing requires actual use to weight the result set properly.  The test instances may frequently go down or become slow when they are regenerated for code testing, or when memory-intensive processes unrelated to Koha on the VPS consume excess RAM or CPU.  Non-Koha use of the VPS starting too many processes still occasionally leads the OOM killer to stop mysqld; I should add a monitoring and restart service for the cases when I forget to check after such excess RAM usage.
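
A minimal sketch of such a restart safety net, assuming the MySQL service unit is named mysql.service (the unit name is an assumption): a systemd drop-in asking systemd to restart mysqld whenever it stops unexpectedly, including after an OOM kill.

mkdir -p /etc/systemd/system/mysql.service.d
cat > /etc/systemd/system/mysql.service.d/restart.conf <<'EOF'
[Service]
Restart=on-failure
RestartSec=10
EOF
systemctl daemon-reload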

Reinstalled test instance of the Koha Wiki using Postgres on Debian 8 with Nginx at https://koha-mw-pg-test02.agogme.com/ .

Koha test wiki migrated to MySQL on Debian 8 with Nginx at https://koha-mw-my-test02.agogme.com/ .

Koha MySQL test wiki upgraded to MediaWiki 1.35 LTS on Debian 10 with Nginx at https://koha-mw-my-test00-upgr.agogme.com/ .

We will likely continue to use Apache in production until or unless we have a resolution for the wiki full image pages working in Nginx, which currently returns a 404 error for such pages.  For any deficiency of Nginx, try Apache by adding port 9443 to any page, such as https://koha-mw-my-test00-upgr.agogme.com:9443/wiki/Some_Page .  MediaWiki configuration will force the port back to the configured HTTPS port 443 and does not support running the same wiki instance simultaneously on different webservers on different ports.
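
For what it is worth, one commonly suggested Nginx pattern is to hand every /wiki/ URL, including /wiki/File:Some_File pages, to MediaWiki's index.php rather than letting try_files look for a matching file on disk; a speculative sketch, untested here, assuming MediaWiki lives at the web root and PHP-FPM handles index.php elsewhere in the server block:

location /wiki/ {
    # Titles such as File:Some_File contain a colon and never match a real
    # file on disk, so send everything to MediaWiki, which recovers the
    # page title from REQUEST_URI.
    rewrite ^/wiki/ /index.php last;
}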

Please note that some adjustments to the upgraded wiki, such as the superseded language change template at the bottom of the home page, may be outside the ordinary scope of an automated upgrade process and require manual correction.  The job queue may be full for an extended period after reinstalling or upgrading a test wiki instance because reindexing is necessary.  The ordering of search result sets improves with actual use: initially after reindexing it may be partly ordered by page creation, rather than by page modification or other potentially more useful relevancy ranking which develops over time as the wiki instance is used.  The Postgres-based search to which we may be accustomed does automatic word stemming and indexes page titles.  MySQL-based MediaWiki supports many options, including Apache Lucene indexing via Elasticsearch or OpenSearch.

Dynamic archiving to exclude obsolete pages using -"[[Category:Obsolete]]" requires all pages to have at least one category, which is generally necessary in any case.  Even where there is insufficient time to select an appropriate category, assigning a placeholder category such as [[Category:Empty]] would be helpful; the placeholder can later be replaced with an appropriate choice.  Pages still lacking any category can be found as sketched below.
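
A minimal sketch of a check for such pages, assuming the migrated MySQL database is named koha_wiki (the database name and credentials are assumptions); it joins MediaWiki's standard page and categorylinks tables to list main-namespace pages with no category at all.  The built-in Special:UncategorizedPages report shows the same information in the web interface.

mysql -u root -p koha_wiki -e "
  SELECT p.page_title
  FROM page AS p
  LEFT JOIN categorylinks AS c ON c.cl_from = p.page_id
  WHERE c.cl_from IS NULL
    AND p.page_namespace = 0;"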

From the current changelog.

In /etc examples
----------------

Moved example files in the /etc directory to os-variants directory for
Debian version specific use.

In README.txt
-------------

Added apt instruction for packages libconfig-simple-perl rsync gnupg
postgresql-contrib libdbd-pg-perl python python-psycopg2 python-mysqldb
python3 .

Added additional package installation for several php and other packages
which had been configured manually in advance of previous testing, where
they had been selected from packages fulfilling requirements for the
Debian distribution mediawiki package.

Correction for Debian 10 and later: substitute libnode-dev for nodejs-dev .

Added os-variants.

Added tested operating-system-version divided use: database migration
from Postgres to MySQL is done on Debian 8 to avoid warnings and errors
from using pg_restore to restore the Postgres production database
starting point.  The wiki upgrade is then done on Debian 10, which has
been tested using the MySQL database dumped from Debian 8.  The upgrade
could also be done on Debian 11, although that is untested.

Clarified language and instruction variously.

In koha_migrate_mwiki_db_and_upgrade.ini
---------------------------------------

Added human readable postupgrade MySQL dump path.

In koha-migrate-mwiki-db-and-upgrade-startup.sh
-----------------------------------------------

Added support for additional necessary automatic package installation
for libconfig-simple-perl .

Added option for using database dump from old operating system version.

Added upgrade option now used for automated upgrade.

In koha-migrate-mwiki-db-and-upgrade.pl
---------------------------------------

Changed use of underscore to hyphen for the automatically created Apache
Cassandra apt repository list file.

Added support for additional necessary automatic package installation
for rsync gnupg libdbd-pg-perl python-psycopg2 python-mysqldb.

Corrected misspelling with extra dash in automated installation of
packages php5.6-mysql php5.6-pgsql php7.4-mysql php7.4-pgsql .

Added support for additional necessary automatic package installation
for several php and other packages which had been configured manually
in advance of previous testing, where they had been selected from
packages fulfilling requirements for the Debian distribution mediawiki
package.

Correction for Debian 10 and later: substitute libnode-dev for nodejs-dev .

Corrected some instances missing the apt-get -y option, to ensure package
installation works non-interactively.

Corrected automated Apache configuration for Debian 8 using php-fpm.

Commented out call to
koha-mediawiki-mysql-alter-col-for-migration-from-postgres.sql .

Added os-variants directory for installation use.

Added option for using database dump from old operating system version.

Added upgrade option now used for automated upgrade.

Added additional postupgrade MySQL dumps.

In koha-mediawiki-pgsqltomysql.py
---------------------------------

Corrected a file-exists error raised when the log file directory to be
created already exists.

Activated additional file truncation code to avoid the need to increase
any column sizes, as the Perl script call to
koha-mediawiki-mysql-alter-col-for-migration-from-postgres.sql is
now commented out.

In koha-mwiki-instances-postinstall.sh
--------------------------------------

Added prospective --server argument to update.php calls to avoid
possible "PHP Notice: Undefined index: SERVER_NAME".

In LocalSettings.php
--------------------

Added !isset condition for REMOTE_ADDR to update.php reporting
"PHP Notice: Undefined index: REMOTE_ADDR".

Moved LocalSettings.php files to os-variants directory.

In koha-migrate-mwiki-db_and-upgrade-forexport.sh
-------------------------------------------------

Added path variables for files in os-variants directory.

------------------------------------------

Pending Bug Fixes.

Full View of Saved Files Not Displaying.

Full views of image files, at pages of the form /wiki/File:some_file , are not displaying.  The suggested Nginx fix requires more attention.  The easy workaround is to revert to Apache.

Maintenance Job Control.

I added some support for running MediaWiki maintenance/runJobs.php to the post-install shell script for MediaWiki test instances, but I still need to add control for maintenance/runJobs.php as an additional, separate recurring process, which is actually fairly trivial; one way to do it is sketched below.  Using the Canasta Docker image for MediaWiki provides some convenient default configuration for some things such as job control.
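
A minimal sketch of that recurring process as a cron entry, assuming an /etc/cron.d file, a www-data web server user, and a MediaWiki root of /var/www/mediawiki (all three are assumptions):

# /etc/cron.d/mediawiki-runjobs: drain the MediaWiki job queue in batches.
*/15 * * * * www-data /usr/bin/php /var/www/mediawiki/maintenance/runJobs.php --maxjobs 100 >> /var/log/mediawiki-runjobs.log 2>&1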
Comment 32 Owen Leonard 2022-10-25 12:21:02 UTC
> https://lists.katipo.co.nz/pipermail/koha/2022-October/058585.html

Is it a known issue that searching fails? I see this message: "An error has occurred while searching: We could not complete your search due to a temporary problem. Please try again later."
Comment 33 Caroline Cyr La Rose 2022-10-25 12:59:45 UTC
(In reply to Owen Leonard from comment #32)
> > https://lists.katipo.co.nz/pipermail/koha/2022-October/058585.html
> 
> Is it a known issue that searching fails? I see this message: "An error has
> occurred while searching: We could not complete your search due to a
> temporary problem. Please try again later."

I also had this problem.
- Entered "Meeting" in the search box at the top of the main page
- Pressed "Enter"
- Got a message "An error has occurred while searching: We could not complete your search due to a temporary problem. Please try again later."
Comment 34 David Nind 2022-10-28 22:37:45 UTC
It looks like there are several attempts to create accounts - see the user creation log https://wiki.koha-community.org/wiki/Special:Log/newusers

I don't have access to see whether they are real accounts that have been created or not.

These may have already been deleted as "No credentials found for this user. Check that the name is spelled correctly." is the message when checking accounts on the https://wiki.koha-community.org/wiki/Special:UserCredentials page.

Nothing has shown up in the moderation queue https://wiki.koha-community.org/wiki/Special:ConfirmAccounts - this is as expected, as until the email is sorted anyone requesting an account gets:
"Error sending mail:
Unknown error in PHP's mail() function."
Comment 35 Katrin Fischer 2022-10-30 11:17:00 UTC
When I tested an earlier version of the updated wiki, I fell in love with the category editor. I see no way to edit/fix the categories on the pages now.
Comment 36 Katrin Fischer 2022-10-30 11:20:58 UTC
(In reply to Katrin Fischer from comment #35)
> When I tested an earlier version of the updated wiki, I fell in love with
> the category editor. I see no way to edit/fix the categories on the pages
> now.

OK, maybe just the optics changed a little? You need to edit the page in order to get to it. Will experiment more.
Comment 37 David Nind 2023-05-09 22:10:33 UTC
Hi Thomas.

Could you have a look at the account request settings?

Lately, very few new account requests are coming through for approval.

In the Recent changes page, there are lots of accounts being created (see the recent changes page - User creation log entries). Most are red linked, but these have not been coming through the Confirm account requests page (https://wiki.koha-community.org/wiki/Special:ConfirmAccounts).

Also, there are some spam pages that need deleting - see 4 May 2023 on the recent changes page.

See also, the discussion on the development mailing list:
https://lists.koha-community.org/pipermail/koha-devel/2023-May/048145.html

Thanks!

David
Comment 38 Thomas Dukleth 2023-05-11 17:10:50 UTC
Wiki account creation bypassing the ConfirmAccount extension was possible, at least while email from the container was working, due to a bug which makes ConfirmAccount incompatible with the current version of MediaWiki.  Yesterday, I applied the workaround of adding the following to LocalSettings.php.

$wgGroupPermissions['*']['createaccount'] = false;

Email service for the wiki was broken because of complications authenticating to the SMTP server from the Docker container, in addition to previous testing configuration remaining in LocalSettings.php.  This meant that very few of the spam accounts created were actually functional.  If the accounts had been functional, we would have found the problem shortly after the upgraded wiki went live.
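
For reference, once authenticating to the SMTP server from the container works, MediaWiki's built-in SMTP support is configured through $wgSMTP in LocalSettings.php; a minimal sketch, in which the host, port, and credentials are all placeholder assumptions:

$wgSMTP = [
    'host'     => 'tls://smtp.example.org', // assumed relay reachable from the container
    'IDHost'   => 'koha-community.org',     // domain used in generated Message-IDs
    'port'     => 587,
    'auth'     => true,
    'username' => 'wiki@example.org',
    'password' => 'not-a-real-password',
];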

Given the similarity of the spam messages and their timing, there may have been only one or two spammers or spambots, even with hundreds of suspicious non-working accounts created.

There were about 20 spam accounts, most of which had just created some spam content on the wiki user page for the account, and some of which had created a spam wiki page.  Five accounts created before May did not attract much notice; about 15 from 3 and 4 May made the problem obvious.  All spam content has been deleted and the accounts blocked.  Spam accounts were included in recently created users with contributions, https://wiki.koha-community.org/wiki/Special:ListUsers?username=&group=&editsOnly=1&creationSort=1&desc=1&wpsubmit=&wpFormIdentifier=mw-listusers-form&limit=50 .

Thanks to Katrin Fischer and especially David Nind for blocking a few hundred accounts which had been created automatically until the workaround for the ConfirmAccount bug was applied; almost all of them had likely never functioned, but they could have been activated.  I paused after the first hundred or so such accounts.  Suspected spam accounts were included in all recently created users, https://wiki.koha-community.org/wiki/Special:ListUsers?username=&group=&creationSort=1&desc=1&wpsubmit=&wpFormIdentifier=mw-listusers-form&limit=50 .  We used a manual process, one account at a time, to block suspicious accounts.  Legitimate accounts with contributions could be recognised, but it is possible that we inadvertently blocked a legitimate user account which had not yet been used to create content.  David Nind proposed writing a message to the mailing list so that anyone inadvertently affected could raise attention to their account being improperly blocked.

The Wikimedia Foundation uses the CheckUser extension to help manage spam account blocking, but it is not working properly inside the Koha Docker container, where all users appear to have logged in from the same local IP address instead of their external IP addresses.  Other extensions which had helped in combating wiki spam no longer function, or do not scale better than the manual process which we used.  Direct database manipulation to block accounts would be possible but would need extra careful checking, and the problem was small enough to manage manually via the web user interface.  Using Docker is nice, but there are some Docker-specific bugs.
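
A speculative direction rather than a verified fix, since it depends on how requests reach the container: MediaWiki only honours X-Forwarded-For headers from trusted proxy addresses, so declaring the reverse proxy in front of the container as trusted in LocalSettings.php may let CheckUser see real client addresses (the Docker bridge gateway address below is an assumption):

// Trust X-Forwarded-For from the proxy in front of the container.
$wgUsePrivateIPs = true;           // accept XFF chains that start at private IPs
$wgCdnServers = [ '172.17.0.1' ];  // assumed Docker bridge gateway address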
Comment 39 David Nind 2023-05-14 19:40:49 UTC
Message sent to the general mailing list: https://lists.katipo.co.nz/pipermail/koha/2023-May/059454.html
Comment 40 David Nind 2023-05-14 19:41:23 UTC
Since the Wiki upgrade is completed, I've closed this bug.