"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170." Steps to reproduce: 1. Go to your user settings and change your current logged user's name/surname so it contains non-ASCII characters, for example, "ä" (U+00E4). 2. Log out and log back in with the same user so the session will be refreshed with updated info. 3. Open any biblio record in marc-record edit mode and save it back: just to initiate ElasticSearch background reindex job creation. 4. Open background jobs admin page, and you will see there "Update Elasticsearch index" job that is in the status "New". Normal behaviour for it is to already get to "Finished" status, as this job is simple and going to be completed quickly, but it's going to remain in "New" status and never get to the "Finished" state. That's why: 5. Stop Koha-worker from the root user: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, for ex. 'kohadev'), and stop RabbitMQ service inside the machine from root: `service rabbitmq-server stop`, then if you run misc/background_jobs_worker.pl inside of your koha machine from under koha user, that script won't be able to connect to RabbitMQ (it warns about this) so it will process tasks one-by-one itself in "no-server-available" mode, and thus you will be able to find the problem, why background job task remains "New": you will see the "malformed UTF-8 character in JSON string" error. 6. Apply the patch. 7. Run background_jobs_worker.pl script again, ensure that after it warns about "Connection refused" to RabbitMQ server, it processes all "New" tasks again like in step 5, but the "malformed UTF-8 character in JSON string" error is gone and if you check background jobs admin page, the status of that same job is now "Finished". This means since now the bug is solved, so: 8. Start RabbitMQ service inside the machine from root: `service rabbitmq-server start`. Start Koha-worker from the root user: `koha-worker --start %YOUR_KOHA_INSTANCE%`. 8. Repeat from step 3: edit biblio and save the record again, to create another background job. 9. Check the background job page and see that the new job now has "Finished" status, so the patch works.
Created attachment 139058 [details] [review]
Bug 31351: encode bytes to unicode in JSON for background tasks context

"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170."

Steps to reproduce:
1. Go to your user settings and change your currently logged-in user's name/surname so it contains a non-ASCII character, for example "ä" (U+00E4).
2. Log out and log back in with the same user so the session is refreshed with the updated info.
3. Open any biblio record in the MARC record editor and save it, just to trigger creation of an Elasticsearch background reindex job.
4. Open the background jobs admin page; you will see an "Update Elasticsearch index" job with status "New". Normally it would already have reached "Finished", since this job is simple and completes quickly, but it remains in "New" status and never reaches "Finished". To see why:
5. Stop the Koha worker as root: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, e.g. 'kohadev'), and stop the RabbitMQ service inside the machine as root: `service rabbitmq-server stop`. If you then run misc/background_jobs_worker.pl inside your Koha machine as the koha user, the script won't be able to connect to RabbitMQ (it warns about this) and will process tasks one by one itself in "no-server-available" mode. That lets you see why the background job remains "New": you will get the "malformed UTF-8 character in JSON string" error.
6. Apply the patch.
7. Run the background_jobs_worker.pl script again. After it warns about "Connection refused" to the RabbitMQ server, it processes all "New" tasks as in step 5, but the "malformed UTF-8 character in JSON string" error is gone, and the background jobs admin page now shows that same job as "Finished". The bug is fixed, so:
8. Start the RabbitMQ service inside the machine as root: `service rabbitmq-server start`, and start the Koha worker as root: `koha-worker --start %YOUR_KOHA_INSTANCE%`.
9. Repeat from step 3: edit a biblio and save the record again to create another background job.
10. Check the background jobs page and see that the new job has "Finished" status, so the patch works.
Created attachment 139438 [details] [review]
Bug 31351: encode bytes to unicode in JSON for background tasks context

"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170."

Steps to reproduce:
1. Go to your user settings and change your currently logged-in user's name/surname so it contains a non-ASCII character, for example "ä" (U+00E4).
2. Log out and log back in with the same user so the session is refreshed with the updated info.
3. Open any biblio record in the MARC record editor and save it, just to trigger creation of an Elasticsearch background reindex job.
4. Open the background jobs admin page; you will see an "Update Elasticsearch index" job with status "New". Normally it would already have reached "Finished", since this job is simple and completes quickly, but it remains in "New" status and never reaches "Finished". To see why:
5. Stop the Koha worker as root: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, e.g. 'kohadev'), and stop the RabbitMQ service inside the machine as root: `service rabbitmq-server stop`. If you then run misc/background_jobs_worker.pl inside your Koha machine as the koha user, the script won't be able to connect to RabbitMQ (it warns about this) and will process tasks one by one itself in "no-server-available" mode. That lets you see why the background job remains "New": you will get the "malformed UTF-8 character in JSON string" error.
6. Apply the patch.
7. Run the background_jobs_worker.pl script again. After it warns about "Connection refused" to the RabbitMQ server, it processes all "New" tasks as in step 5, but the "malformed UTF-8 character in JSON string" error is gone, and the background jobs admin page now shows that same job as "Finished". The bug is fixed, so:
8. Start the RabbitMQ service inside the machine as root: `service rabbitmq-server start`, and start the Koha worker as root: `koha-worker --start %YOUR_KOHA_INSTANCE%`.
9. Repeat from step 3: edit a biblio and save the record again to create another background job.
10. Check the background jobs page and see that the new job has "Finished" status, so the patch works.

Signed-off-by: David Nind <david@davidnind.com>
Hi,

my $context = decode_json encode_utf8 $self->context;

I think we prefer the syntax with parentheses:

my $context = decode_json(encode_utf8($self->context));
I'm wondering whether the fix is correct, and whether the other encode_utf8 calls in the same file are the right way to deal with encoding. Shouldn't we already decode the data to Perl's internal string form when we read it? In this particular case that would mean making C4::Context->userenv contain Unicode strings.
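To make the failure mode concrete, here is a minimal sketch; the string literal is an assumption standing in for whatever C4::Context->userenv serializes:

use JSON qw( decode_json );

# What the database hands back when the connection decodes UTF-8
# for us: a Perl character string containing U+00E4.
my $context = qq{{"surname":"\x{e4}"}};

# decode_json() expects UTF-8 octets; a lone 0xE4 byte is not valid
# UTF-8, hence "malformed UTF-8 character in JSON string".
my $href = eval { decode_json($context) };
warn $@ if $@;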
Created attachment 140222 [details] [review]
Bug 31351: encode bytes to unicode in JSON for background tasks context

"malformed UTF-8 character in JSON string, at character offset 38 (before "\x{fffd}"register_id...") at /home/vagrant/kohaclone/Koha/BackgroundJob.pm line 170."

Steps to reproduce:
1. Go to your user settings and change your currently logged-in user's name/surname so it contains a non-ASCII character, for example "ä" (U+00E4).
2. Log out and log back in with the same user so the session is refreshed with the updated info.
3. Open any biblio record in the MARC record editor and save it, just to trigger creation of an Elasticsearch background reindex job.
4. Open the background jobs admin page; you will see an "Update Elasticsearch index" job with status "New". Normally it would already have reached "Finished", since this job is simple and completes quickly, but it remains in "New" status and never reaches "Finished". To see why:
5. Stop the Koha worker as root: `koha-worker --stop %YOUR_KOHA_INSTANCE%` (%YOUR_KOHA_INSTANCE% is your Koha instance name, e.g. 'kohadev'), and stop the RabbitMQ service inside the machine as root: `service rabbitmq-server stop`. If you then run misc/background_jobs_worker.pl inside your Koha machine as the koha user, the script won't be able to connect to RabbitMQ (it warns about this) and will process tasks one by one itself in "no-server-available" mode. That lets you see why the background job remains "New": you will get the "malformed UTF-8 character in JSON string" error.
6. Apply the patch.
7. Run the background_jobs_worker.pl script again. After it warns about "Connection refused" to the RabbitMQ server, it processes all "New" tasks as in step 5, but the "malformed UTF-8 character in JSON string" error is gone, and the background jobs admin page now shows that same job as "Finished". The bug is fixed, so:
8. Start the RabbitMQ service inside the machine as root: `service rabbitmq-server start`, and start the Koha worker as root: `koha-worker --start %YOUR_KOHA_INSTANCE%`.
9. Repeat from step 3: edit a biblio and save the record again to create another background job.
10. Check the background jobs page and see that the new job has "Finished" status, so the patch works.

Signed-off-by: David Nind <david@davidnind.com>
I love these encoding problems :)
Peter, your fix seems okay to me. But search this module for decode_json and you will find another one:

return $self->data ? decode_json( $self->data ) : undef;

Could you have a look?
Created attachment 140236 [details] [review]
Bug 31351: ALTERNATIVE Koha::BackgroundJob: Let database connection object handle utf8 transcoding

Our database connections are set up so that they automatically convert Perl's internal strings to UTF-8 encoded strings when sending data to database tables, and decode the UTF-8 encoded strings coming back from the tables into internal Perl strings. Yet, as we can see from a call in Koha/BackgroundJob.pm, we encode the internal Perl string to UTF-8 only to decode it straight back:

my $data_dump = decode_json encode_utf8 $self->data;

We can skip this unnecessary encode<->decode step (the database connection has already done the decoding for us) by simply calling the JSON->new->decode() method, which doesn't perform any string transcoding.

Furthermore, the original code was buggy and didn't always remember to encode the not-yet-encoded strings: in Koha::BackgroundJob::process,

my $context = decode_json($self->context);

is missing the encode step. After this change, encoding before decoding is no longer necessary, because we use the methods from the JSON module that do not perform any transcoding.

Note to those concerned about whether the old data in the database is compatible with this new code: luckily, our database connection object is smart enough not to UTF-8 encode the already UTF-8 encoded data returned by the old encode_json() calls (it probably checks the string's UTF8 flag, as Encode::is_utf8($str) does).

To test whether this fixes the original bug of not being able to schedule background jobs when the Koha user has non-ASCII letters in their surname:
1) Change your staff user's surname/lastname to "ääää".
2) Log out and back in.
3) Go to a biblio record detail page and click "Select all" in the items table.
4) Click "Delete selected items" and proceed with the deletion.
5) Notice the batch item deletion job has "failed" status.
6) Apply the patch and repeat; this time the deletion job should finish.
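To make the contrast concrete, a minimal sketch of the two approaches side by side; the literal is an assumption standing in for data read back from the background_jobs table:

use JSON ();
use Encode qw( encode_utf8 );

# A character string, as the database handle returns it.
my $stored = qq{{"surname":"\x{e4}"}};

# Earlier patch: round-trip through octets before decoding.
my $via_octets = JSON::decode_json( encode_utf8($stored) );

# This patch: decode the character string directly. JSON->new
# defaults to utf8(0), i.e. it performs no octet transcoding.
my $direct = JSON->new->decode($stored);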
(In reply to Joonas Kylmälä from comment #4)
> I'm wondering whether the fix is correct, and whether the other encode_utf8
> calls in the same file are the right way to deal with encoding. Shouldn't we
> already decode the data to Perl's internal string form when we read it?

Did some more research, and I think we definitely shouldn't do unnecessary encoding and decoding. I attached an alternative patch that removes the unnecessary transcoding, which also fixes the bug. For more details, see the patch description. Please remove "ALTERNATIVE" from the patch title before pushing if it is the fix chosen.
Created attachment 140262 [details] [review]
Bug 31351: Koha::BackgroundJob: Let database connection object handle utf8 transcoding

Our database connections are set up so that they automatically convert Perl's internal strings to UTF-8 encoded strings when sending data to database tables, and decode the UTF-8 encoded strings coming back from the tables into internal Perl strings. Yet, as we can see from a call in Koha/BackgroundJob.pm, we encode the internal Perl string to UTF-8 only to decode it straight back:

my $data_dump = decode_json encode_utf8 $self->data;

We can skip this unnecessary encode<->decode step (the database connection has already done the decoding for us) by simply calling the JSON->new->decode() method, which doesn't perform any string transcoding.

Furthermore, the original code was buggy and didn't always remember to encode the not-yet-encoded strings: in Koha::BackgroundJob::process,

my $context = decode_json($self->context);

is missing the encode step. After this change, encoding before decoding is no longer necessary, because we use the methods from the JSON module that do not perform any transcoding.

Note to those concerned about whether the old data in the database is compatible with this new code: luckily, our database connection object is smart enough not to UTF-8 encode the already UTF-8 encoded data returned by the old encode_json() calls (it probably checks the string's UTF8 flag, as Encode::is_utf8($str) does).

To test whether this fixes the original bug of not being able to schedule background jobs when the Koha user has non-ASCII letters in their surname:
1) Change your staff user's surname/lastname to "ääää".
2) Log out and back in.
3) Go to a biblio record detail page and click "Select all" in the items table.
4) Click "Delete selected items" and proceed with the deletion.
5) Notice the batch item deletion job has "failed" status.
6) Apply the patch and repeat; this time the deletion job should finish.

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Will QA this one today.
QAing
For starters, we could have added a test. Maybe I will.
When using the functional interface of the JSON module, we should have used to_json and from_json instead of encode_json/decode_json. That would have required fewer code changes here, but the module's author actually recommends the OO interface, so I agree with that switch now.
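For reference, a minimal sketch of those functional counterparts; from_json()/to_json() work on character strings, unlike encode_json()/decode_json(), which work on UTF-8 octets:

use JSON qw( to_json from_json );

# No octet transcoding in either direction: these behave like
# JSON->new->decode and JSON->new->encode.
my $href = from_json(qq{{"surname":"\x{e4}"}});
my $str  = to_json($href);    # a character string, not UTF-8 octets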
We need to adjust the test too, btw. t/db_dependent/Koha/BackgroundJob.t contains a rather simple test set:

subtest 'decoded_data() and set_encoded_data() tests' => sub {
And this module:

package t::lib::Koha::BackgroundJob::BatchTest;
Out of scope, but possible candidates for inspection too ;)

root@master:/usr/share/koha# git grep decode_json
C4/Search/History.pm:use JSON qw( decode_json encode_json );
C4/Search/History.pm: if (decode_json( $cookie )) {
C4/Search/History.pm: @searches = @{decode_json( $cookie )}
C4/Search/History.pm: eval { decode_json( uri_unescape( $session->param('search_history') ) ) };
C4/UsageStats.pm:use JSON qw( decode_json encode_json );
C4/UsageStats.pm: my $content = decode_json( $res->decoded_content );
Koha/REST/Plugin/Query.pm:use JSON qw( decode_json );
Koha/REST/Plugin/Query.pm: $q_params = decode_json($q_params) unless reftype $q_params;
misc/background_jobs_worker.pl:use JSON qw( decode_json );
misc/background_jobs_worker.pl: my $args = decode_json($body);
misc/background_jobs_worker.pl: my $args = decode_json($job->data);
misc/devel/get_prepared_letter.pl:use JSON qw( decode_json );
misc/devel/get_prepared_letter.pl:$repeat = $repeat ? decode_json($repeat) : {};
misc/devel/get_prepared_letter.pl:$tables = $tables ? decode_json($tables) : {};
misc/devel/get_prepared_letter.pl:$loops = $loops ? decode_json($loops) : {};
opac/opac-detail.pl:use JSON qw( decode_json );
opac/opac-search.pl:use JSON qw/decode_json encode_json/;
opac/svc/auth/googleopenidconnect: my $json = decode_json( $response->decoded_content );
opac/svc/auth/googleopenidconnect: my $json = decode_json($response);
opac/svc/auth/googleopenidconnect: my $claims_json = decode_json($claims);
svc/report:use JSON qw( encode_json decode_json );
Koha/BackgroundJobs/BatchUpdateBiblio.t:use JSON qw( encode_json decode_json );
Koha/BackgroundJobs/BatchUpdateBiblio.t: my $data = decode_json $job->get_from_storage->data;
This is a pity, but we'll manage:

Koha/BackgroundJob/BatchCancelHold.pm       | 7 +++---   NO TESTS
Koha/BackgroundJob/BatchDeleteAuthority.pm  | 7 +++---   NO TESTS
Koha/BackgroundJob/BatchUpdateAuthority.pm  | 7 +++---   NO TESTS
.../BatchUpdateBiblioHoldsQueue.pm          | 7 +++---   NO REAL TESTS, ONLY MOCKS
Koha/BackgroundJob/BatchUpdateItem.pm       | 8 +++---   NO TESTS
Still extending the encode/decode subtest a bit now.
And misc/background_jobs_worker.pl itself needed attention.
Created attachment 140310 [details] [review]
Bug 31351: Koha::BackgroundJob: Let database connection object handle utf8 transcoding

Our database connections are set up so that they automatically convert Perl's internal strings to UTF-8 encoded strings when sending data to database tables, and decode the UTF-8 encoded strings coming back from the tables into internal Perl strings. Yet, as we can see from a call in Koha/BackgroundJob.pm, we encode the internal Perl string to UTF-8 only to decode it straight back:

my $data_dump = decode_json encode_utf8 $self->data;

We can skip this unnecessary encode<->decode step (the database connection has already done the decoding for us) by simply calling the JSON->new->decode() method, which doesn't perform any string transcoding.

Furthermore, the original code was buggy and didn't always remember to encode the not-yet-encoded strings: in Koha::BackgroundJob::process,

my $context = decode_json($self->context);

is missing the encode step. After this change, encoding before decoding is no longer necessary, because we use the methods from the JSON module that do not perform any transcoding.

Note to those concerned about whether the old data in the database is compatible with this new code: luckily, our database connection object is smart enough not to UTF-8 encode the already UTF-8 encoded data returned by the old encode_json() calls (it probably checks the string's UTF8 flag, as Encode::is_utf8($str) does).

To test whether this fixes the original bug of not being able to schedule background jobs when the Koha user has non-ASCII letters in their surname:
1) Change your staff user's surname/lastname to "ääää".
2) Log out and back in.
3) Go to a biblio record detail page and click "Select all" in the items table.
4) Click "Delete selected items" and proceed with the deletion.
5) Notice the batch item deletion job has "failed" status.
6) Apply the patch and repeat; this time the deletion job should finish.

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Created attachment 140311 [details] [review]
Bug 31351: (QA follow-up) Use $self->json in Background modules

Making the disabled utf8 flag explicit instead of depending on the CPAN module's default. Incorporating the change in background_jobs_worker too.

Test plan:
See the next patches when we look at unit tests.
Restart koha-worker.

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
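A rough sketch of what such an accessor could look like; the names and internals here are assumptions, not the actual Koha implementation:

use JSON ();

# Hypothetical helper: one shared JSON object per background job,
# with utf8 explicitly disabled so we exchange character strings,
# never octets (the database layer owns the octet boundary).
sub json {
    my ($self) = @_;
    $self->{_json} //= JSON->new->utf8(0);
    return $self->{_json};
}

Callers would then read, e.g., my $context = $self->json->decode( $self->context );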
Created attachment 140312 [details] [review]
Bug 13351: (QA follow-up) Adjust tests accordingly

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t
Run t/db_dependent/Koha/BackgroundJobs.t
Prove t/db_dependent/Koha/BackgroundJobs

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Created attachment 140313 [details] [review]
Bug 31351: (QA follow-up) Extend the encode/decode test

Adding some Unicode stuff, and a missing txn pair.

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
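A minimal sketch of the kind of round trip this covers, assuming an existing Koha::BackgroundJob object in $job (e.g. built with t::lib::TestBuilder) and the decoded_data()/set_encoded_data() API discussed above:

use Modern::Perl;
use utf8;
use Test::More;

# Store Unicode data and make sure it comes back intact.
my $data = { surname => 'ääää', title => "\x{1f4da} books" };
$job->set_encoded_data($data);
$job->store;
is_deeply( $job->get_from_storage->decoded_data, $data,
    'Unicode data survives the storage round trip' );

done_testing;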
Hmm. This was a bit heavier than expected. We could squash when backporting. Happy to assist the RMaints if needed; it should not be that hard, since it's new code. Not sure how far back the backports will go, btw.
Note for Jonathan: Left a TODO in background_jobs_worker.pl where we read data from the message broker.
Oops, wrong bug number. Hold on.
Created attachment 140314 [details] [review]
Bug 31351: (QA follow-up) Adjust tests accordingly

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t
Run t/db_dependent/Koha/BackgroundJobs.t
Prove t/db_dependent/Koha/BackgroundJobs

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Created attachment 140315 [details] [review]
Bug 31351: (QA follow-up) Extend the encode/decode test

Adding some Unicode stuff, and a missing txn pair.

Test plan:
Run t/db_dependent/Koha/BackgroundJob.t

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
I think the changes in the task-specific classes are wrong. They should be using `$self->decoded_data` and `$self->set_encoded_data` instead. Those methods were introduced in bug 30360.
FTR: the JSON::encode_json and JSON::decode_json functions deal with UTF-8 [1], and I'm not sure why there's a ->utf8(0) being set.

Overall, I think we should go down the bug 30943 path. Take UpdateElasticIndex (as tweaked by bug 30360) as a reference.
(In reply to Tomás Cohen Arazi from comment #32)
> FTR: the JSON::encode_json and JSON::decode_json functions deal with
> UTF-8 [1], and I'm not sure why there's a ->utf8(0) being set.
>
> Overall, I think we should go down the bug 30943 path. Take
> UpdateElasticIndex (as tweaked by bug 30360) as a reference.

Please check the earlier comments. We should not do the UTF-8 conversion in the middle of the process, only when storing. The utf8(0) is just a confirmation that we use this default of the JSON OO object, to make it explicit.

There were inconsistencies in the codebase that are addressed here. I would recommend going even further here. Moving to the helpers is fine, of course.
(In reply to Marcel de Rooy from comment #33)
> (In reply to Tomás Cohen Arazi from comment #32)
> > FTR: the JSON::encode_json and JSON::decode_json functions deal with
> > UTF-8 [1], and I'm not sure why there's a ->utf8(0) being set.
> >
> > Overall, I think we should go down the bug 30943 path. Take
> > UpdateElasticIndex (as tweaked by bug 30360) as a reference.
>
> Please check the earlier comments. We should not do the UTF-8 conversion
> in the middle of the process, only when storing. The utf8(0) is just a
> confirmation that we use this default of the JSON OO object, to make it
> explicit.
>
> There were inconsistencies in the codebase that are addressed here. I
> would recommend going even further here. Moving to the helpers is fine,
> of course.

Ok, I'll take a look again. I loved the added tests the most.
Pushed to master for 22.11. Nice work everyone, thanks!
(In reply to Tomás Cohen Arazi from comment #34)
> Ok, I'll take a look again. I loved the added tests the most.

Thanks, Tomas.
Can we please have this backported?
Backported to 22.05.x for upcoming 22.05.06 release
Depends on bug 29346, which introduced BatchUpdateBiblioHoldsQueue. Lots of conflicts while trying the rebase. Please provide a backport patch for 21.11.x if this one is needed; I won't backport unless asked to. Arthur
*** Bug 32143 has been marked as a duplicate of this bug. ***