Bug 30822

Summary: BatchCommit does not deal with indexation correctly
Product: Koha Reporter: Martin Renvoize <martin.renvoize>
Component: Architecture, internals, and plumbing    Assignee: Martin Renvoize <martin.renvoize>
Status: RESOLVED FIXED QA Contact: Marcel de Rooy <m.de.rooy>
Severity: normal    
Priority: P5 - low CC: dcook, fridolin.somers, jonathan.druart, joonas.kylmala, julian.maurice, kyle, lucas, m.de.rooy, martin.renvoize, nick, tomascohen
Version: unspecified   
Hardware: All   
OS: All   
See Also: https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=29440
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=30824
Change sponsored?: --- Patch complexity: Small patch
Documentation contact: Documentation submission:
Text to go in the release notes:
Version(s) released in:
22.11.00
Bug Depends on: 27344    
Bug Blocks: 29440, 27341, 27421, 33019    
Attachments: Bug 30822: Make BatchCommitRecords update the index in one request
Bug 30822: Clarify that BatchCommitItems is a private function
Bug 30822: Make BatchCommitRecords update the index in one request
Bug 30822: Clarify that BatchCommitItems is a private function
Bug 30822: Make BatchCommitRecords update the index in one request
Bug 30822: Clarify that BatchCommitItems is a private function
Bug 30822: Make BatchCommitRecords update the index in one request
Bug 30822: Clarify that BatchCommitItems is a private function

Description Martin Renvoize 2022-05-20 11:31:52 UTC
AddBiblio must implement skip_index and we should update the search engine's index after all the biblio records have been imported.

This also applies to bulkmarcimport.
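For illustration, a minimal sketch of the intended pattern, assuming the skip_record_index option from bug 27344 (this bug's dependency) and the Koha::SearchEngine::Indexer calls quoted later in this thread (comment 10); the variable names are illustrative, not the final patch:

use C4::Biblio qw( AddBiblio );
use Koha::SearchEngine;
use Koha::SearchEngine::Indexer;

# Sketch only: @records and $framework_code are assumed inputs.
# Suppress the per-record index request inside the loop...
my @biblionumbers;
for my $marc_record (@records) {
    my ($biblionumber) = AddBiblio( $marc_record, $framework_code,
        { skip_record_index => 1 } );    # option name per bug 27344 (assumption)
    push @biblionumbers, $biblionumber;
}

# ...then queue a single index request covering the whole batch.
my $indexer = Koha::SearchEngine::Indexer->new(
    { index => $Koha::SearchEngine::BIBLIOS_INDEX } );
$indexer->index_records( \@biblionumbers, "specialUpdate", "biblioserver" );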
Comment 1 Katrin Fischer 2022-05-20 12:57:43 UTC
Is there a reason indexing could not take place during the import and would have to be pushed after the task is completed? Just wondering if this would not slow down things for big imports like you would do with bulkmarcimport.
Comment 2 Martin Renvoize 2022-05-20 13:29:43 UTC
(In reply to Katrin Fischer from comment #1)
> Is there a reason indexing could not take place during the import and would
> have to be pushed after the task is completed? Just wondering if this would
> not slow down things for big imports like you would do with bulkmarcimport.

See discussion on bug 30465 for background on this exact issue...

As far as I can tell there's already a bug in the existing implementation here anyway... as I just mentioned in bug 29440, if `BiblioAddsAuthorities` is enabled we use the search indexes to find authority matches as part of the AddBiblio action. If we call a whole bunch of AddBiblio's in a row in a loop, the index will likely be out of date for at least some of those searches, and as such authority matches won't work as expected... we have a race condition.
Comment 3 Martin Renvoize 2022-05-20 13:30:41 UTC
What we really need is task dependencies... then the Authority linking can be a task that's dependent on the rebuild having been completed.
Comment 4 Martin Renvoize 2022-05-20 16:06:32 UTC
Created attachment 135251 [details] [review]
Bug 30822: Make BatchCommitRecords update the index in one request

When committing staged MARC imports to the catalogue we will often be
importing a batch of records. We don't want to send one index request
per affected biblio; we want to index them all after the records have
been modified, otherwise we end up with multiple tasks per record
(when items are also affected).

Test plan:
1) Use the stage MARC records tool to stage and commit a set of records and
confirm the behaviour remains correct.
2) If using Elastic, check that only one indexing job is queued as a result
of the committed import.
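For step 2, a hedged way to check from a Koha shell; this assumes the Koha::BackgroundJobs API and the 'update_elastic_index' job type, so verify both against your version:

use Koha::BackgroundJobs;

# Count queued Elasticsearch indexing jobs; after one batch commit we
# expect a single job for the whole batch, not one per record.
my $jobs = Koha::BackgroundJobs->search( { type => 'update_elastic_index' } );
printf "Queued indexing jobs: %d\n", $jobs->count;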
Comment 5 Joonas Kylmälä 2022-05-26 11:55:07 UTC
Good improvement! Just a small nit: as BatchCommitItems is a public function, it should either take skip_record_index as a parameter or clearly document that it doesn't do indexing and that it is left to the caller.
Comment 6 Martin Renvoize 2022-05-26 12:01:02 UTC
(In reply to Joonas Kylmälä from comment #5)
> Good improvement! Just a small nit: as BatchCommitItems is a public function,
> it should either take skip_record_index as a parameter or clearly document
> that it doesn't do indexing and that it is left to the caller.

Good catch...

I'm wondering if a third option is to rename the function to be 'private'... it appears to only be called inside this module and isn't exported either. Certainly some POD around it would also be sensible though.
Comment 7 Martin Renvoize 2022-05-26 12:04:57 UTC
Created attachment 135372 [details] [review]
Bug 30822: Clarify that BatchCommitItems is a private function

BatchCommitItems is only being used within this module and isn't
mentioned in EXPORT_OK. This patch simply renames it to
_batchCommitItems to follow the _ convention for private functions, and
adds a little hint to the POD of the function to clarify that the caller
must trigger a re-index.
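For illustration, the POD hint might look roughly like this (a sketch; the signature shown is an assumption about the function's interface, not a quote from the patch):

=head2 _batchCommitItems

  my ( $num_items_added, $num_items_replaced, $num_items_errored ) =
      _batchCommitItems( $import_record_id, $biblionumber, $action );

Private function (not exported). Commits the staged items for a given
import record. Note: this does NOT trigger a search-engine re-index of
the affected biblio record; that is left to the caller.

=cut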
Comment 8 Joonas Kylmälä 2022-06-12 09:40:22 UTC
Created attachment 135963 [details] [review]
Bug 30822: Make BatchCommitRecords update the index in one request

When committing staged MARC imports to the catalogue we will often be
importing a batch of records. We don't want to send one index request
per affected biblio; we want to index them all after the records have
been modified, otherwise we end up with multiple tasks per record
(when items are also affected).

Test plan:
1) Use the stage MARC records tool to stage and commit a set of records and
confirm the behaviour remains correct.
2) If using Elastic, check that only one indexing job is queued as a result
of the committed import.

Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Comment 9 Joonas Kylmälä 2022-06-12 09:40:26 UTC
Created attachment 135964 [details] [review]
Bug 30822: Clarify that BatchCommitItems is a private function

BatchCommitItems is only being used within this module and isn't
mentioned in EXPORT_OK. This patch simply renames it to
_batchCommitItems to follow the _ convention for private functions, and
adds a little hint to the POD of the function to clarify that the caller
must trigger a re-index.

JK: Amended patch to also rename the function in t/db_dependent/ImportBatch.t
    and fix the typo "commiting" => "committing"

Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Comment 10 Jonathan Druart 2022-06-16 10:49:42 UTC
I think you did better than me on https://bugs.koha-community.org/bugzilla3/attachment.cgi?id=136129

But maybe we need that part, for biblio deletions, what do you think?

+    if ( $record_type eq 'biblio' && ( @updated_biblionumbers || @deleted_biblionumbers ) ) {
+        my $indexer = Koha::SearchEngine::Indexer->new({ index => $Koha::SearchEngine::BIBLIOS_INDEX });
+        if ( @deleted_biblionumbers ) {
+            $indexer->index_records( \@deleted_biblionumbers, "recordDelete", "biblioserver" );
+        } else {
+            $indexer->index_records( \@updated_biblionumbers, "specialUpdate", "biblioserver" );
+        }
+    }
Comment 11 David Cook 2022-06-17 05:14:04 UTC
Went to apply it as a dependency of bug 27421 and noticed patches don't apply anymore:

Bug 30822 - BatchCommit does not deal with indexation correctly

135963 - Bug 30822: Make BatchCommitRecords update the index in one request
135964 - Bug 30822: Clarify that BatchCommitItems is a private function

Apply? [(y)es, (n)o, (i)nteractive] y
Applying: Bug 30822: Make BatchCommitRecords update the index in one request
Using index info to reconstruct a base tree...
M       C4/ImportBatch.pm
Falling back to patching base and 3-way merge...
Auto-merging C4/ImportBatch.pm
CONFLICT (content): Merge conflict in C4/ImportBatch.pm
error: Failed to merge in the changes.
Patch failed at 0001 Bug 30822: Make BatchCommitRecords update the index in one request
Comment 12 Jonathan Druart 2022-06-17 06:43:16 UTC
Created attachment 136218 [details] [review]
Bug 30822: Make BatchCommitRecords update the index in one request

When committing staged MARC imports to the catalogue we will often be
importing a batch of records. We don't want to send one index request
per affected biblio; we want to index them all after the records have
been modified, otherwise we end up with multiple tasks per record
(when items are also affected).

Test plan:
1) Use the stage MARC records tool to stage and commit a set of records and
confirm the behaviour remains correct.
2) If using Elastic, check that only one indexing job is queued as a result
of the committed import.

Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Comment 13 Jonathan Druart 2022-06-17 06:43:20 UTC
Created attachment 136219 [details] [review]
Bug 30822: Clarify that BatchCommitItems is a private function

BatchCommitItems is only being used within this module and isn't
mentioned in EXPORT_OK. This patch simply renames it to
_batchCommitItems to follow the _ convention for private functions, and
adds a little hint to the POD of the function to clarify that the caller
must trigger a re-index.

JK: Amended patch to also rename the function in t/db_dependent/ImportBatch.t
    and fix the typo "commiting" => "committing"

Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Comment 14 Marcel de Rooy 2022-06-17 07:18:54 UTC
QA: Looking here
Comment 15 Marcel de Rooy 2022-06-17 07:47:45 UTC
(In reply to Martin Renvoize from comment #6)
> (In reply to Joonas Kylmälä from comment #5)
> > Good improvement! Just a small nit: as BatchCommitItems is a public function,
> > it should either take skip_record_index as a parameter or clearly document
> > that it doesn't do indexing and that it is left to the caller.
> 
> Good catch...
> 
> I'm wondering if a third option is to rename the function to be 'private'...
> it appears to only be called inside this module and isn't exported either.
> Certainly some POD around it would also be sensible though.

Just a thought. Not sure if we should invest time in renaming C4 functions private rather than getting them out of C4 ;)
There will be a bunch more. Wouldn't rename them all.
This is done, no problem.
Comment 16 Marcel de Rooy 2022-06-17 08:25:15 UTC
There is something funny going on here. It might be a config issue, but while the worker is running I don't see any import finish; a new process constantly spins up for the background-job-progress. Testing with Zebra only.
Comment 17 Marcel de Rooy 2022-06-17 09:24:54 UTC
(In reply to Jonathan Druart from comment #10)

> But maybe we need that part, for biblio deletions, what do you think?
> 
> +    if ( $record_type eq 'biblio' && ( @updated_biblionumbers ||
> @deleted_biblionumbers ) ) {
> +        my $indexer = Koha::SearchEngine::Indexer->new({ index =>
> $Koha::SearchEngine::BIBLIOS_INDEX });
> +        if ( @deleted_biblionumbers ) {
> +            $indexer->index_records( \@deleted_biblionumbers,
> "recordDelete", "biblioserver" );
> +        } else {
> +            $indexer->index_records( \@updated_biblionumbers,
> "specialUpdate", "biblioserver" );
> +        }
> +    }

I agree that for consistency we should address that too, but that could be done on a new report. The title of this report is scoped to BatchCommitRecords, and we are talking about BatchRevert here.
Comment 18 Marcel de Rooy 2022-06-17 09:26:31 UTC
(In reply to Marcel de Rooy from comment #16)
> There is something funny going on here. It might be a config issue, but
> while the worker is running I don't see any import finish; a new process
> constantly spins up for the background-job-progress. Testing with Zebra
> only.

The worker is not yet relevant here.
But what needs attention (somewhere) is that if BatchCommit silently fails, the JS on the manage import page keeps polling for the status of something that crashed.
Comment 19 Marcel de Rooy 2022-06-17 09:27:35 UTC
Created attachment 136247 [details] [review]
Bug 30822: Make BatchCommitRecords update the index in one request

When committing staged MARC imports to the catalogue we will often be
importing a batch of records. We don't want to send one index request
per affected biblio; we want to index them all after the records have
been modified, otherwise we end up with multiple tasks per record
(when items are also affected).

Test plan:
1) Use the stage MARC records tool to stage and commit a set of records and
confirm the behaviour remains correct.
2) If using Elastic, check that only one indexing job is queued as a result
of the committed import.

Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 20 Marcel de Rooy 2022-06-17 09:27:39 UTC
Created attachment 136248 [details] [review]
Bug 30822: Clarify that BatchCommitItems is a private function

BatchCommitItems is only being used within this module and isn't
mentioned in EXPORT_OK. This patch simply renames it to
_batchCommitItems to follow the _ convention for private functions, and
adds a little hint to the POD of the function to clarify that the caller
must trigger a re-index.

JK: Amended patch to also rename the function in t/db_dependent/ImportBatch.t
    and fix the typo "commiting" => "committing"

Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 21 Martin Renvoize 2022-06-22 10:00:18 UTC
Thanks guys, sorry I didn't get back to this one.

Yes, I can take care of BatchRevert next unless someone else wants to jump on that one.

As for the other failures, I'm wondering whether that's at least partly addressed by Bug 29325 - commit_file.pl error 'Already in a transaction'?
Comment 22 Tomás Cohen Arazi 2022-06-22 13:01:14 UTC
Pushed to master for 22.11.

Nice work everyone, thanks!
Comment 23 Lucas Gass 2022-07-29 15:10:58 UTC
Does not apply cleanly to 22.05.x, no backport. Please rebase if needed
Comment 24 Nick Clemens 2023-08-18 08:51:10 UTC
*** Bug 26543 has been marked as a duplicate of this bug. ***