If we are adding new records, there is no need to check the holds queue: the records can't have holds yet.
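For context, the mechanism discussed below is an option hash passed to C4::Biblio, so a caller that knows it is creating a brand-new biblio can ask it not to enqueue a real-time holds queue update. Here is a minimal sketch of the idea, assuming the skip_holds_queue option that the patches on this bug introduce for AddBiblio; the record built here is just an illustration, not part of the test plan.

use Modern::Perl;
use MARC::Record;
use MARC::Field;
use C4::Biblio qw( AddBiblio );

# Build a throwaway record for illustration only.
my $record = MARC::Record->new();
$record->append_fields(
    MARC::Field->new( '245', '0', '0', a => 'A brand-new title' ) );

# With skip_holds_queue set, AddBiblio should not enqueue an
# update_holds_queue_for_biblios background job: a record that was
# just created cannot have holds, so there is nothing to recalculate.
my ( $biblionumber, $biblioitemnumber ) =
    AddBiblio( $record, '', { skip_holds_queue => 1 } );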
Created attachment 176186 [details] [review]
Bug 37564: Prevent bulkmarcimport from generating holds queue jobs

We shouldn't need to update the holds queue for such command line imports.
Created attachment 176188 [details] [review]
Bug 37564: [23.11.x] Prevent bulkmarcimport from generating holds queue jobs

We shouldn't need to update the holds queue for such command line imports.
Here is the test plan I was trying to follow:

- RealTimeHoldsQueue = Enable
- Delete any background jobs: DELETE FROM background_jobs;
- Check the number of holds queue jobs in the database: SELECT COUNT(*) FROM background_jobs WHERE type = 'update_holds_queue_for_biblios';
- Export a record with items as XML from Koha
- Delete the exported record from Koha
- Import the record, using bulkmarcimport.pl
- Check the jobs in the database, there should be one per item imported
- Delete the imported record
- Delete all background jobs
- Apply the patch
- Import the record again
- Check the number of holds queue jobs in the database again

But when I do the import I end up with this in the database:

MariaDB [koha_kohadev]> SELECT * FROM biblio_metadata WHERE biblionumber = 441\G
*************************** 1. row ***************************
              id: 439
    biblionumber: 441
          format: marcxml
          schema: MARC21
        metadata: <?xml version="1.0" encoding="UTF-8"?>
<record xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.loc.gov/MARC21/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd" xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00080 a2200049 4500</leader>
  <controlfield tag="005">20250109090642.0</controlfield>
  <datafield tag="999" ind1=" " ind2=" ">
    <subfield code="c">441</subfield>
    <subfield code="d">441</subfield>
  </datafield>
</record>
       timestamp: 2025-01-09 09:06:42
record_source_id: NULL
1 row in set (0.000 sec)

...and I don't have the time to investigate that right now.
I have confirmed the problem with bulkmarcimport.pl and reported it as a separate bug: Bug 38962
My problem with running bulkmarcimport.pl was of the PEBKAC type... The test plan should be OK.
I had a go at testing the patch for main (thanks Magnus for the test plan!). However, it still seems to be updating the holds queue.

Testing notes (using KTD):
1. Enable the real-time holds queue: RealTimeHoldsQueue = Enable
2. Delete any background jobs: DELETE FROM background_jobs;
3. Check the number of holds queue jobs in the database: SELECT COUNT(*) FROM background_jobs WHERE type = 'update_holds_queue_for_biblios';
4. Export a record with items as XML from Koha.
5. Delete the exported record from Koha.
6. Import the record, using bulkmarcimport.pl: misc/migration_tools/bulkmarcimport.pl -b -v -m=MARCXML --file my_record.marcxml
7. Check the jobs in the database, there should be one per item imported.
8. Delete the imported record.
9. Delete all background jobs (see step 2).
10. Apply the patch.
11. Import the record again (see step 6).
12. Check the number of holds queue jobs in the database again - should be 0.

Here is the database entry after the patch is applied:

            id: 3
        status: finished
      progress: 1
          size: 1
borrowernumber: NULL
          type: update_holds_queue_for_biblios
         queue: default
          data: {"messages":[{"type":"success","code":"holds_queue_updated","biblio_id":439}],"report":{"total_biblios":1,"total_success":1},"biblio_ids":[439]}
       context: {"cardnumber":null,"number":null,"interface":"commandline","firstname":"CLI","desk_name":null,"desk_id":null,"surname":"CLI","register_name":null,"branch":null,"register_id":null,"branchname":null,"shibboleth":null,"emailaddress":null,"id":null,"flags":null}
   enqueued_on: 2025-01-24 18:51:49
    started_on: 2025-01-24 18:51:50
      ended_on: 2025-01-24 18:51:50

Other notes:
1. I reindexed after deleting the record, probably didn't need to: koha-rebuild-zebra -d -f -v kohadev
2. I used a different record after applying the patch, as I had an error when trying to delete the imported record.
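For anyone else testing, here is a quick way to repeat the count check from step 12 without opening a MariaDB prompt. This is only a sketch and assumes it is run where Koha's libraries are loadable (e.g. inside koha-shell kohadev in KTD).

use Modern::Perl;
use Koha::BackgroundJobs;

# Count only the holds queue jobs; after the patch a bulkmarcimport
# run should leave this at 0.
my $count = Koha::BackgroundJobs->search(
    { type => 'update_holds_queue_for_biblios' } )->count;
say "update_holds_queue_for_biblios jobs: $count";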
I see the same thing as David: there are still background jobs with type = update_holds_queue_for_biblios after applying the patch.
Created attachment 177199 [details] [review]
Bug 37564: (follow-up) Skip holds queue for adding biblios

Martin's patch added skip_holds_queue for ModBiblio; this patch adds it for AddBiblio as well.

Testing notes (using KTD):
1. Enable the real-time holds queue: RealTimeHoldsQueue = Enable
2. Delete any background jobs: DELETE FROM background_jobs;
3. Check the number of holds queue jobs in the database: SELECT COUNT(*) FROM background_jobs WHERE type = 'update_holds_queue_for_biblios';
4. Export a record with items as XML from Koha.
5. Delete the exported record from Koha.
6. Import the record, using bulkmarcimport.pl: misc/migration_tools/bulkmarcimport.pl -b -v -m=MARCXML --file my_record.marcxml
7. Check the jobs in the database, there should be one per item imported.
8. Delete the imported record.
9. Delete all background jobs (see step 2).
10. Apply the patch.
11. Import the record again (see step 6).
12. Check the number of holds queue jobs in the database again - should be 0.
13. Import the biblio again, using the matching option to overlay the previous import.
14. Check the number of holds queue jobs in the database again - should be 0.
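To make the scope of the two patches concrete, here is a sketch of the intent, not the actual diff: the option hashref passed from bulkmarcimport.pl gains a skip_holds_queue key on both code paths. The variable names and the skip_record_index key are placeholders/assumptions, not necessarily what the script really uses.

use C4::Biblio qw( AddBiblio ModBiblio );

my ( $record, $matched_biblionumber, $framework, $skip_indexing );    # placeholders

my $options = {
    skip_record_index => $skip_indexing,    # assumed existing option
    skip_holds_queue  => 1,                 # added by the patches on this bug
};

if ($matched_biblionumber) {
    # Overlaying an existing record: covered by Martin's original patch (ModBiblio).
    ModBiblio( $record, $matched_biblionumber, $framework, $options );
}
else {
    # Creating a new record: covered by this follow-up (AddBiblio).
    my ( $biblionumber, $biblioitemnumber ) =
        AddBiblio( $record, $framework, $options );
}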
I'm getting a syntax error after applying the patches:

misc/migration_tools/bulkmarcimport.pl -b -v -m=MARCXML --file bib-11.marcxml
syntax error at misc/migration_tools/bulkmarcimport.pl line 143, near "$skip_indexing skip_holds_queue"
BEGIN not safe after errors--compilation aborted at misc/migration_tools/bulkmarcimport.pl line 152.
Created attachment 177210 [details] [review]
Bug 37564: (follow-up) Fix missing comma

This should fix a syntax error in misc/migration_tools/bulkmarcimport.pl

To verify:
$ perl -c misc/migration_tools/bulkmarcimport.pl
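For anyone wondering what went wrong: a missing comma between two key/value pairs in a hashref produces exactly the kind of "near ..." error reported above, and perl -c catches it at compile time. A contrived sketch, with assumed key names rather than the script's actual code:

my $skip_indexing = 0;    # placeholder

# Broken: no comma after the first pair, so Perl sees
# "$skip_indexing skip_holds_queue" as one run-on expression.
#   my $options = {
#       skip_record_index => $skip_indexing
#       skip_holds_queue  => 1,
#   };

# Fixed: a comma after every pair keeps the hashref valid.
my $options = {
    skip_record_index => $skip_indexing,
    skip_holds_queue  => 1,
};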
Sorry, this is still not working for me. I import a record:

$ perl /usr/share/koha/bin/migration_tools/bulkmarcimport.pl -b -v -m MARCXML --file lanark.marcxml
Characteristic MARC flavour: MARC21
Use of uninitialized value in concatenation (.) or string at /usr/share/perl5/MARC/File/XML.pm line 399, <GEN3> chunk 3.
.
 1 MARC records done in 0.28852391242981 seconds

And then there are holds queue jobs:

MariaDB [koha_kohadev]> select enqueued_on, type, queue, status from background_jobs;
+---------------------+--------------------------------+---------------+----------+
| enqueued_on         | type                           | queue         | status   |
+---------------------+--------------------------------+---------------+----------+
| 2025-01-29 07:52:33 | update_elastic_index           | elastic_index | finished |
| 2025-01-29 07:52:33 | update_holds_queue_for_biblios | default       | finished |
| 2025-01-29 07:52:33 | update_elastic_index           | elastic_index | finished |
| 2025-01-29 07:52:33 | update_holds_queue_for_biblios | default       | finished |
| 2025-01-29 07:52:33 | update_elastic_index           | elastic_index | finished |
| 2025-01-29 07:52:33 | update_holds_queue_for_biblios | default       | finished |
| 2025-01-29 07:52:33 | update_elastic_index           | elastic_index | finished |
| 2025-01-29 07:52:33 | update_holds_queue_for_biblios | default       | finished |
+---------------------+--------------------------------+---------------+----------+
8 rows in set (0.000 sec)