Bug 31351

Summary: Worker dies on reindex job when operator last name/first name/branch name contains non-ASCII chars
Product: Koha Reporter: Peter Vashchuk <stalkernoid>
Component: Architecture, internals, and plumbing    Assignee: Peter Vashchuk <stalkernoid>
Status: RESOLVED FIXED QA Contact: Marcel de Rooy <m.de.rooy>
Severity: major    
Priority: P5 - low CC: arthur.suzuki, dcook, fridolin.somers, hagud, jonathan.druart, joonas.kylmala, lucas, m.de.rooy, martin.renvoize, nugged, tomascohen
Version: unspecified   
Hardware: All   
OS: All   
Change sponsored?: --- Patch complexity: Small patch
Documentation contact: Documentation submission:
Text to go in the release notes:
Version(s) released in:
22.11.00, 22.05.06
Bug Depends on: 30360, 30889, 31223    
Bug Blocks: 32370, 30943, 32242    
Attachments: Bug 31351: encode bytes to unicode in JSON for background tasks context
Bug 31351: encode bytes to unicode in JSON for background tasks context
Bug 31351: encode bytes to unicode in JSON for background tasks context
Bug 31351: ALTERNATIVE Koha::BackgroundJob: Let database connection object handle utf8 transcoding
Bug 31351: Koha::BackgroundJob: Let database connection object handle utf8 transcoding
Bug 31351: Koha::BackgroundJob: Let database connection object handle utf8 transcoding
Bug 31351: (QA follow-up) Use $self->json in Background modules
Bug 13351: (QA follow-up) Adjust tests accordingly
Bug 31351: (QA follow-up) Extend the encode/decode test
Bug 31351: (QA follow-up) Adjust tests accordingly
Bug 31351: (QA follow-up) Extend the encode/decode test

Description Peter Vashchuk 2022-08-12 14:29:42 UTC
"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170."

Steps to reproduce:
1. Go to your user settings and change your currently logged-in user's name/surname so that it contains a non-ASCII character, for example "ä" (U+00E4).
2. Log out and log back in with the same user so that the session is refreshed with the updated info.
3. Open any biblio record in the MARC editor and save it, just to trigger the creation of an Elasticsearch background reindex job.
4. Open the background jobs admin page. You will see an "Update Elasticsearch index" job with the status "New". Normally such a simple job completes quickly and would already be "Finished", but it remains "New" and never reaches "Finished". To see why:
5. Stop the Koha worker as root: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, e.g. 'kohadev'), and stop the RabbitMQ service as root: `service rabbitmq-server stop`. Then run misc/background_jobs_worker.pl inside your Koha machine as the koha user. The script cannot connect to RabbitMQ (it warns about this), so it processes the pending tasks one by one itself in "no-server-available" mode, and you can see why the background job remains "New": the "malformed UTF-8 character in JSON string" error (a minimal standalone reproduction of this error follows these steps).
6. Apply the patch.
7. Run the background_jobs_worker.pl script again and check that, after it warns about "Connection refused" to the RabbitMQ server, it processes all "New" tasks again as in step 5, but this time without the "malformed UTF-8 character in JSON string" error. On the background jobs admin page the status of that same job is now "Finished", which shows the bug is fixed. Then:
8. Start the RabbitMQ service as root: `service rabbitmq-server start`, and start the Koha worker as root: `koha-worker --start %YOUR_KOHA_INSTANCE%`.
9. Repeat from step 3: edit a biblio record and save it again to create another background job.
10. Check the background jobs page and see that the new job now has the "Finished" status, so the patch works.
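
For reference, the error quoted above can be reproduced outside Koha: decode_json expects UTF-8 octets, so handing it a Perl character string that contains non-ASCII characters (which is what the userenv-derived context is) fails. A minimal sketch, assuming only the core JSON and Encode modules; the JSON payload is made up for illustration:

  use JSON qw( decode_json );
  use Encode qw( encode_utf8 );

  # A Perl character string, like the logged-in user's context data in memory:
  my $context = qq({"surname":"\x{e4}","register_id":1});

  eval { decode_json($context) };
  print $@;    # malformed UTF-8 character in JSON string, ...

  # Works, because decode_json expects UTF-8 encoded octets:
  my $decoded = decode_json( encode_utf8($context) );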
Comment 1 Peter Vashchuk 2022-08-12 14:34:33 UTC
Created attachment 139058 [details] [review]
Bug 31351: encode bytes to unicode in JSON for background tasks context

"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170."

Steps to reproduce:
1. Go to your user settings and change your currently logged-in user's name/surname so that it contains a non-ASCII character, for example "ä" (U+00E4).
2. Log out and log back in with the same user so that the session is refreshed with the updated info.
3. Open any biblio record in the MARC editor and save it, just to trigger the creation of an Elasticsearch background reindex job.
4. Open the background jobs admin page. You will see an "Update Elasticsearch index" job with the status "New". Normally such a simple job completes quickly and would already be "Finished", but it remains "New" and never reaches "Finished". To see why:
5. Stop the Koha worker as root: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, e.g. 'kohadev'), and stop the RabbitMQ service as root: `service rabbitmq-server stop`. Then run misc/background_jobs_worker.pl inside your Koha machine as the koha user. The script cannot connect to RabbitMQ (it warns about this), so it processes the pending tasks one by one itself in "no-server-available" mode, and you can see why the background job remains "New": the "malformed UTF-8 character in JSON string" error.
6. Apply the patch.
7. Run the background_jobs_worker.pl script again and check that, after it warns about "Connection refused" to the RabbitMQ server, it processes all "New" tasks again as in step 5, but this time without the "malformed UTF-8 character in JSON string" error. On the background jobs admin page the status of that same job is now "Finished", which shows the bug is fixed. Then:
8. Start the RabbitMQ service as root: `service rabbitmq-server start`, and start the Koha worker as root: `koha-worker --start %YOUR_KOHA_INSTANCE%`.
9. Repeat from step 3: edit a biblio record and save it again to create another background job.
10. Check the background jobs page and see that the new job now has the "Finished" status, so the patch works.
Comment 2 David Nind 2022-08-18 22:18:07 UTC
Created attachment 139438 [details] [review]
Bug 31351: encode bytes to unicode in JSON for background tasks context

"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170."

Steps to reproduce:
1. Go to your user settings and change your current logged user's name/surname so it contains non-ASCII character, for example "ä" (U+00E4).
2. Log out and log back in with the same user so session will be refreshed with updated info.
3. Open any biblio record in marc-record edit mode and save it back: just to initiate ElasticSearch background reindex job creation.
4. Open background jobs admin page, you will see there "Update Elasticsearch index" job that is in status "New". Normal behaviour for it is to already get to "Finished" status, as this job is simple and going to be completed quickly, but it's going to remain in "New" status and never get to the "Finished" state. That's why:
5. Stop Koha-worker from the root user: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, for ex. 'kohadev'), and stop RabbitMQ service inside machine from root: `service rabbitmq-server stop`, then if you run misc/background_jobs_worker.pl inside of your koha machine from under koha user, that script won't be able to connect to RabbitMQ (it warns about this) so it will process tasks one-by-one itself in "no-server-available" mode, and thus you will be able to find the problem, why background job task remains "New": you will see the "malformed UTF-8 character in JSON string" error.
6. Apply the patch.
7. Run background_jobs_worker.pl script again, ensure that after it warns about "Connection refused" to RabbitMQ server, it processes all "New" tasks again like in step 5, but the "malformed UTF-8 character in JSON string" error is gone and if you check background jobs admin page, the status of that same job is now "Finished". This means since now the bug solved, so:
8. Start RabbitMQ service inside machine from root: `service rabbitmq-server start`. Start Koha-worker from the root user: `koha-worker --start %YOUR_KOHA_INSTANCE%`.
8. Repeat from step 3: edit biblio and save the record again, to create another background job.
9. Check that the background job page and see that new job now has "Finished" status, so patch works.

Signed-off-by: David Nind <david@davidnind.com>
Comment 3 Fridolin Somers 2022-08-20 07:53:40 UTC
Hi,

  my $context = decode_json encode_utf8 $self->context;

I think we prefer the syntax with parentheses:
  my $context = decode_json(encode_utf8($self->context));
Comment 4 Joonas Kylmälä 2022-08-21 12:28:41 UTC
I'm wondering whether the fix is correct, and whether the other encode_utf8 calls in the same file are the right way to deal with encoding. Shouldn't we already decode the data into Perl internal strings when we read it? In this particular case that would mean making C4::Context->userenv contain Unicode strings.
Comment 5 Peter Vashchuk 2022-09-06 11:36:03 UTC
Created attachment 140222 [details] [review]
Bug 31351: encode bytes to unicode in JSON for background tasks context

"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170."

Steps to reproduce:
1. Go to your user settings and change your current logged user's name/surname so it contains non-ASCII character, for example "ä" (U+00E4).
2. Log out and log back in with the same user so session will be refreshed with updated info.
3. Open any biblio record in marc-record edit mode and save it back: just to initiate ElasticSearch background reindex job creation.
4. Open background jobs admin page, you will see there "Update Elasticsearch index" job that is in status "New". Normal behaviour for it is to already get to "Finished" status, as this job is simple and going to be completed quickly, but it's going to remain in "New" status and never get to the "Finished" state. That's why:
5. Stop Koha-worker from the root user: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, for ex. 'kohadev'), and stop RabbitMQ service inside machine from root: `service rabbitmq-server stop`, then if you run misc/background_jobs_worker.pl inside of your koha machine from under koha user, that script won't be able to connect to RabbitMQ (it warns about this) so it will process tasks one-by-one itself in "no-server-available" mode, and thus you will be able to find the problem, why background job task remains "New": you will see the "malformed UTF-8 character in JSON string" error.
6. Apply the patch.
7. Run background_jobs_worker.pl script again, ensure that after it warns about "Connection refused" to RabbitMQ server, it processes all "New" tasks again like in step 5, but the "malformed UTF-8 character in JSON string" error is gone and if you check background jobs admin page, the status of that same job is now "Finished". This means since now the bug solved, so:
8. Start RabbitMQ service inside machine from root: `service rabbitmq-server start`. Start Koha-worker from the root user: `koha-worker --start %YOUR_KOHA_INSTANCE%`.
8. Repeat from step 3: edit biblio and save the record again, to create another background job.
9. Check that the background job page and see that new job now has "Finished" status, so patch works.

Signed-off-by: David Nind <david@davidnind.com>
Comment 6 Marcel de Rooy 2022-09-06 11:37:51 UTC
I love these encoding problems :)
Comment 7 Marcel de Rooy 2022-09-06 12:26:56 UTC
Peter,
Your fix seems okay to me. But look in this module for decode_json and you will find another one:
    return $self->data ? decode_json( $self->data ) : undef;

Could you have a look?
Comment 8 Joonas Kylmälä 2022-09-06 20:07:41 UTC
Created attachment 140236 [details] [review]
Bug 31351: ALTERNATIVE Koha::BackgroundJob: Let database connection object handle utf8 transcoding

Our database connections have been set up so that they automatically
convert Perl internal strings to UTF-8 encoded strings when sending data
to database tables, and decode the UTF-8 encoded strings from the tables
back into Perl internal strings. As we can see from a call in
Koha/BackgroundJob.pm, we encode the Perl internal string to UTF-8 and
then immediately decode it back into a Perl internal string:

my $data_dump = decode_json encode_utf8 $self->data;

We can skip this unnecessary encode<->decode step (the database object
has already done the decoding for us) by simply calling the
JSON->new->decode() method, which does not perform any byte-level decoding.

Furthermore, the original code was buggy and did not consistently apply
the encode step; in Koha::BackgroundJob::process we can see that

my $context = decode_json($self->context);

is missing the encode step. After this change, encoding before decoding
is no longer necessary, because we use the methods of the JSON module
that do not perform any transcoding.

Note to those concerned about whether the old data in the database is
compatible with this new code:
 Luckily our database connection object seems to be smart enough not to
 UTF-8 encode the already UTF-8 encoded data returned by the old encode_json()
 calls (it probably checks the UTF8 flag of the string, Encode::is_utf8($str)).

To test whether this fixes the originally reported bug of not being able
to schedule background jobs when the Koha user has non-ASCII letters in
their surname:
 1) Change your staff user's surname/lastname to "ääää".
 2) Log out and back in.
 3) Go to a biblio record detail page and click "Select all" in the
 items table.
 4) Click "Delete selected items" and proceed with the deletion.
 5) Notice that the batch item deletion job has the "failed" status.
 6) Apply the patch and repeat; this time the deletion job should finish.
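
A minimal, self-contained sketch of the before/after shape described in the commit message above; the column values here are made up, and the real code in Koha/BackgroundJob.pm reads them via $self->data and $self->context:

  use JSON;

  # Pretend these are the character strings the database connection has
  # already decoded for us when reading the background_jobs row:
  my $data_column    = qq({"record_ids":[1,2,3]});
  my $context_column = qq({"surname":"\x{e4}","register_id":1});

  # Before (per the commit message): decode_json( encode_utf8($string) ),
  # with the encode step sometimes forgotten, which broke on non-ASCII data.
  # After: a plain JSON object, no byte-level transcoding anywhere:
  my $json      = JSON->new;                    # utf8 mode is off by default
  my $data_dump = $json->decode($data_column);
  my $context   = $json->decode($context_column);
  print "register_id is $context->{register_id}\n";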
Comment 9 Joonas Kylmälä 2022-09-06 20:11:22 UTC
(In reply to Joonas Kylmälä from comment #4)
> I'm wondering whether the fix is correct, and whether the other encode_utf8
> calls in the same file are the right way to deal with encoding. Shouldn't we
> already decode the data as Perl's internal string when we read the data?

Did some more research and I think we definitely shouldn't do unnecessary encoding and decoding. I attached an alternative patch that removes the unnecessary transcoding, which also fixes the bug. For more details see the patch description.

Please remove "ALTERNATIVE" from the patch title before pushing if it will be the fix chosen.
Comment 10 Katrin Fischer 2022-09-07 08:59:06 UTC
Created attachment 140262 [details] [review]
Bug 31351: Koha::BackgroundJob: Let database connection object handle utf8 transcoding

(Same commit message and test plan as the alternative patch in comment 8.)

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Comment 11 Marcel de Rooy 2022-09-07 10:11:33 UTC
Will QA this one today.
Comment 12 Marcel de Rooy 2022-09-07 11:32:38 UTC
QAing
Comment 13 Marcel de Rooy 2022-09-07 11:35:34 UTC
For starters, we could have added a test. Maybe I will...
Comment 14 Marcel de Rooy 2022-09-07 11:48:59 UTC
When using the functional interface of the JSON module we should have used to_json and from_json instead of decode_json/encode_json. This would require fewer code changes here, but the module's author actually recommends the OO interface, so I agree with that switch now.
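
For context, a small sketch of the difference between the two functional pairs and the OO interface; this is standard JSON module behaviour, not Koha-specific code:

  use JSON qw( encode_json decode_json to_json from_json );

  my $perl = { surname => "\x{e4}" };     # Perl character string

  # encode_json/decode_json always transcode to/from UTF-8 octets:
  my $octets = encode_json($perl);        # UTF-8 encoded bytes
  my $again  = decode_json($octets);      # expects octets as input

  # to_json/from_json work on character strings and leave byte-level
  # encoding to whoever owns the I/O boundary (here: the DB connection):
  my $chars = to_json($perl);             # character string
  my $back  = from_json($chars);

  # The OO interface without ->utf8 behaves like to_json/from_json:
  my $json = JSON->new;                   # utf8 is off by default
  my $text = $json->encode($perl);
  my $data = $json->decode($text);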
Comment 15 Marcel de Rooy 2022-09-07 12:16:24 UTC
We need to adjust the test too btw
t/db_dependent/Koha/BackgroundJob.t

Contains a rather simple test set:
subtest 'decoded_data() and set_encoded_data() tests' => sub {
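
A rough idea of the kind of round-trip assertion that subtest could gain; the actual follow-up test may look different:

  # Inside the existing subtest; $job is the Koha::BackgroundJob created there.
  my $unicode_data = { surname => "\x{e4}\x{e4}", note => "\x{1f4d6}" };
  $job->set_encoded_data($unicode_data);
  $job->store;
  is_deeply(
      $job->get_from_storage->decoded_data,
      $unicode_data,
      'non-ASCII data survives an encode/store/decode round trip'
  );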
Comment 16 Marcel de Rooy 2022-09-07 12:18:35 UTC
And this module
package t::lib::Koha::BackgroundJob::BatchTest;
Comment 17 Marcel de Rooy 2022-09-07 12:34:32 UTC
Out of scope but possible candidates for inspection too ;)

root@master:/usr/share/koha# git grep decode_json
C4/Search/History.pm:use JSON qw( decode_json encode_json );
C4/Search/History.pm:        if (decode_json( $cookie )) {
C4/Search/History.pm:            @searches = @{decode_json( $cookie )}
C4/Search/History.pm:      eval { decode_json( uri_unescape( $session->param('search_history') ) ) };
C4/UsageStats.pm:use JSON qw( decode_json encode_json );
C4/UsageStats.pm:    my $content = decode_json( $res->decoded_content );
Koha/REST/Plugin/Query.pm:use JSON qw( decode_json );
Koha/REST/Plugin/Query.pm:            $q_params = decode_json($q_params) unless reftype $q_params;
misc/background_jobs_worker.pl:use JSON qw( decode_json );
misc/background_jobs_worker.pl:        my $args = decode_json($body);
misc/background_jobs_worker.pl:            my $args = decode_json($job->data);
misc/devel/get_prepared_letter.pl:use JSON qw( decode_json );
misc/devel/get_prepared_letter.pl:$repeat = $repeat ? decode_json($repeat) : {};
misc/devel/get_prepared_letter.pl:$tables = $tables ? decode_json($tables) : {};
misc/devel/get_prepared_letter.pl:$loops  = $loops  ? decode_json($loops)  : {};
opac/opac-detail.pl:use JSON qw( decode_json );
opac/opac-search.pl:use JSON qw/decode_json encode_json/;
opac/svc/auth/googleopenidconnect:    my $json = decode_json( $response->decoded_content );
opac/svc/auth/googleopenidconnect:    my $json     = decode_json($response);
opac/svc/auth/googleopenidconnect:        my $claims_json = decode_json($claims);
svc/report:use JSON qw( encode_json decode_json );
Comment 18 Marcel de Rooy 2022-09-07 12:51:32 UTC
Koha/BackgroundJobs/BatchUpdateBiblio.t:use JSON qw( encode_json decode_json );
Koha/BackgroundJobs/BatchUpdateBiblio.t:    my $data = decode_json $job->get_from_storage->data;
Comment 19 Marcel de Rooy 2022-09-07 12:52:10 UTC
This is a pity, but we'll manage:

 Koha/BackgroundJob/BatchCancelHold.pm         |  7 +++--- NO TESTS
 Koha/BackgroundJob/BatchDeleteAuthority.pm    |  7 +++--- NO TESTS
 Koha/BackgroundJob/BatchUpdateAuthority.pm    |  7 +++--- NO TESTS
 .../BatchUpdateBiblioHoldsQueue.pm            |  7 +++--- NO REAL TESTS, ONLY MOCKS
 Koha/BackgroundJob/BatchUpdateItem.pm         |  8 +++----  NO TESTS
Comment 20 Marcel de Rooy 2022-09-07 12:53:18 UTC
Still extending the encode/decode subtest a bit now.
Comment 21 Marcel de Rooy 2022-09-07 14:06:50 UTC
And misc/background_jobs_worker.pl itself needed attention too.
Comment 22 Marcel de Rooy 2022-09-07 14:34:15 UTC
Created attachment 140310 [details] [review]
Bug 31351: Koha::BackgroundJob: Let database connection object handle utf8 transcoding

(Same commit message and test plan as the alternative patch in comment 8.)

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 23 Marcel de Rooy 2022-09-07 14:34:20 UTC
Created attachment 140311 [details] [review]
Bug 31351: (QA follow-up) Use $self->json in Background modules

Making the disabling of the utf8 flag explicit instead of depending on
the default of the CPAN module.

Incorporating the change in background_jobs_worker.pl too.

Test plan:
See the next patches, where we look at the unit tests.
Restart koha-worker.

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
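
The accessor itself is defined in the patch; roughly, an explicit-utf8(0) helper like $self->json could look like this (the real method body may differ):

  sub json {
      my ($self) = @_;
      # Explicitly disable byte-level transcoding rather than rely on the default:
      $self->{_json} //= JSON->new->utf8(0);
      return $self->{_json};
  }

  # Call sites in the Background modules then read, for example:
  #   my $data = $self->json->decode( $self->data );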
Comment 24 Marcel de Rooy 2022-09-07 14:34:24 UTC
Created attachment 140312 [details] [review]
Bug 13351: (QA follow-up) Adjust tests accordingly

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t
Run t/db_dependent/Koha/BackgroundJobs.t
Prove t/db_dependent/Koha/BackgroundJobs

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 25 Marcel de Rooy 2022-09-07 14:34:27 UTC
Created attachment 140313 [details] [review]
Bug 31351: (QA follow-up) Extend the encode/decode test

Adding some Unicode data.
And a missing txn_begin/txn_rollback pair.

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 26 Marcel de Rooy 2022-09-07 14:36:22 UTC
Hmm. This was a bit heavier than expected. We could squash when backporting. Happy to assist RMaints if needed. Should not be that hard, since it is new code. Not sure how far back we will get with backports, btw.
Comment 27 Marcel de Rooy 2022-09-07 14:38:18 UTC
Note for Jonathan: Left a TODO in background_jobs_worker.pl where we read data from the message broker.
Comment 28 Marcel de Rooy 2022-09-07 14:41:30 UTC
Oops wrong bug number. Hold on
Comment 29 Marcel de Rooy 2022-09-07 14:41:58 UTC
Created attachment 140314 [details] [review]
Bug 31351: (QA follow-up) Adjust tests accordingly

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t
Run t/db_dependent/Koha/BackgroundJobs.t
Prove t/db_dependent/Koha/BackgroundJobs

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 30 Marcel de Rooy 2022-09-07 14:42:02 UTC
Created attachment 140315 [details] [review]
Bug 31351: (QA follow-up) Extend the encode/decode test

Adding some Unicode data.
And a missing txn_begin/txn_rollback pair.

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 31 Tomás Cohen Arazi 2022-09-07 16:00:06 UTC
I think the changes in the task-specific classes are wrong. They should be using `$self->decoded_data` and `$self->set_encoded_data` instead. Those methods were introduced on bug 30360.
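
For readers who have not seen bug 30360, a sketch of a task class using those accessors instead of raw JSON calls; the process() body here is invented for illustration:

  sub process {
      my ( $self, $args ) = @_;

      my $data = $self->decoded_data;    # JSON decoding handled by the base class
      # ... do the batch work, collecting a report into $data ...
      $data->{report} = { total_records => scalar @{ $args->{record_ids} // [] } };

      $self->set_encoded_data($data);    # JSON encoding handled by the base class
  }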
Comment 32 Tomás Cohen Arazi 2022-09-07 16:09:16 UTC
FTR: the JSON::encode_json and JSON::decode_json methods deal with UTF-8 [1] and I'm not sure why there's a ->utf8(0) being set.

Overall, I think we should go the bug 30943 path. Take UpdateElasticIndex (as tweaked by bug 30360) as a reference.
Comment 33 Marcel de Rooy 2022-09-07 16:43:35 UTC
(In reply to Tomás Cohen Arazi from comment #32)
> FTR: the JSON::encode_json and JSON::decode_json methods deal with UTF-8 [1]
> and I'm not sure why there's a ->utf8(0) being set.
> 
> Overall, I think we should go the bug 30943 path. Take UpdateElasticIndex
> (as tweaked by bug 30360) as a reference.

Please check the earlier comments. We should not do the UTF-8 conversion in the middle of the process, but only when storing.
The utf8(0) is just a confirmation of that default of the JSON OO object, to make it explicit.

There were inconsistencies in the codebase that are addressed here; I would recommend going further. Moving to the helpers is fine, of course.
Comment 34 Tomás Cohen Arazi 2022-09-07 16:49:44 UTC
(In reply to Marcel de Rooy from comment #33)
> (In reply to Tomás Cohen Arazi from comment #32)
> > FTR: the JSON::encode_json and JSON::decode_json methods deal with UTF-8 [1]
> > and I'm not sure why there's a ->utf8(0) being set.
> > 
> > Overall, I think we should go the bug 30943 path. Take UpdateElasticIndex
> > (as tweaked by bug 30360) as a reference.
> 
> Please check the comments earlier. We should not do the UTF8 conversion in
> the middle of the process but when storing only.
> The uf8(0) is just a confirmation of using this default from the JSON OO
> object. To make it explicit.
> 
> There were inconsistencies in the codebase that are addressed here. I would
> recommend going further here. Moving to the helpers is fine of course.

Ok, I'll take a look again. I loved the added tests the most.
Comment 35 Tomás Cohen Arazi 2022-09-07 23:43:49 UTC
Pushed to master for 22.11.

Nice work everyone, thanks!
Comment 36 Marcel de Rooy 2022-09-08 06:19:23 UTC
(In reply to Tomás Cohen Arazi from comment #34)
> Ok, I'll take a look again. I loved the added tests the most.

Thanks, Tomas.
Comment 37 Katrin Fischer 2022-10-14 18:05:32 UTC
Can we please have this backported?
Comment 38 Lucas Gass 2022-10-14 21:40:03 UTC
Backported to 22.05.x for upcoming 22.05.06 release
Comment 39 Arthur Suzuki 2022-10-18 23:14:24 UTC
Depends on bug 29346, which introduces BatchUpdateBiblioHoldsQueue.
Lots of conflicts while trying the rebase.
Please provide a backport patch for 21.11.x if this one is needed.
Won't backport unless asked to.
Arthur
Comment 40 Jonathan Druart 2022-11-09 09:11:16 UTC
*** Bug 32143 has been marked as a duplicate of this bug. ***