Bugzilla – Attachment 145243 Details for Bug 27859: MARC export for search results
Description: Bug 27859: marc search result export
Filename:    Bug-27859-marc-search-result-export.patch
MIME Type:   text/plain
Creator:     David Gustafsson
Created:     2023-01-12 18:44:08 UTC
Size:        48.54 KB
Flags:       patch, obsolete
>From ebadfa4ca1ea5253c69358d2f54f46329c28303e Mon Sep 17 00:00:00 2001 >From: David Gustafsson <david.gustafsson@ub.gu.se> >Date: Fri, 25 Sep 2020 16:07:41 +0200 >Subject: [PATCH] Bug 27859: marc search result export > >Enable export of staff interface search results in different marc formats. >This feature is only supported when using Elasticsearch. > >To test: >1) Apply patch >2) Run installer/data/mysql/updatedatabase.pl >3) Make sure the syspref EnableSearchResultMARCExport is enabled >4) Make sure the current user has the tools -> export_catalog permission >5) Perform a search >6) Export the serach result by choosing a format under the > "Export results" drop down >7) Verify that link(s) with exported data appear when export is > completed >8) Revoke the permission in 3) and ensure exporting is no longer > possible >9) Run tests in t/db_dependent/Koha/SearchEngine/Elasticsearch.t > >Sponsored-by: Gothenburg University Library >--- > Koha/BackgroundJob.pm | 1 + > Koha/BackgroundJob/SearchResultMARCExport.pm | 186 ++++++++++ > Koha/SearchEngine/Elasticsearch.pm | 335 ++++++++++++++++-- > Koha/SearchEngine/Elasticsearch/Search.pm | 31 +- > catalogue/search.pl | 46 +++ > ...able_search_result_marc_export_sysprefs.pl | 16 + > installer/data/mysql/mandatory/sysprefs.sql | 5 +- > .../prog/en/modules/admin/background_jobs.tt | 4 + > .../en/modules/admin/preferences/admin.pref | 4 +- > .../modules/admin/preferences/searching.pref | 34 ++ > .../prog/en/modules/catalogue/results.tt | 51 +++ > .../Koha/SearchEngine/Elasticsearch.t | 117 +++++- > 12 files changed, 756 insertions(+), 74 deletions(-) > create mode 100644 Koha/BackgroundJob/SearchResultMARCExport.pm > create mode 100755 installer/data/mysql/atomicupdate/bug_27859-add_enable_search_result_marc_export_sysprefs.pl > >diff --git a/Koha/BackgroundJob.pm b/Koha/BackgroundJob.pm >index 727bc3b2f7..5bfa7433b5 100644 >--- a/Koha/BackgroundJob.pm >+++ b/Koha/BackgroundJob.pm >@@ -427,6 +427,7 @@ sub core_types_to_classes { > stage_marc_for_import => 'Koha::BackgroundJob::StageMARCForImport', > marc_import_commit_batch => 'Koha::BackgroundJob::MARCImportCommitBatch', > marc_import_revert_batch => 'Koha::BackgroundJob::MARCImportRevertBatch', >+ search_result_marc_export => 'Koha::BackgroundJob::SearchResultMARCExport', > }; > } > >diff --git a/Koha/BackgroundJob/SearchResultMARCExport.pm b/Koha/BackgroundJob/SearchResultMARCExport.pm >new file mode 100644 >index 0000000000..b35c390d9c >--- /dev/null >+++ b/Koha/BackgroundJob/SearchResultMARCExport.pm >@@ -0,0 +1,186 @@ >+package Koha::BackgroundJob::SearchResultMARCExport; >+ >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it >+# under the terms of the GNU General Public License as published by >+# the Free Software Foundation; either version 3 of the License, or >+# (at your option) any later version. >+# >+# Koha is distributed in the hope that it will be useful, but >+# WITHOUT ANY WARRANTY; without even the implied warranty of >+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the >+# GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License >+# along with Koha; if not, see <http://www.gnu.org/licenses>. 
>+ >+use Modern::Perl; >+use Try::Tiny; >+use Koha::SearchEngine::Search; >+ >+use File::Spec; >+use File::Path qw(mkpath); >+use Koha::Email; >+use Koha::UploadedFiles; >+use POSIX qw(strftime); >+use Digest::MD5 qw(md5_hex); >+use Carp qw(croak); >+ >+use base 'Koha::BackgroundJob'; >+ >+=head1 NAME >+ >+Koha::BackgroundJob::SearchResultMARCExport - Export MARC records from search result >+ >+This is a subclass of Koha::BackgroundJob. >+ >+=head1 API >+ >+=head2 Class methods >+ >+=head3 job_type >+ >+Define the job type of this job: stage_marc_for_import >+ >+=cut >+ >+sub job_type { >+ return 'search_result_marc_export'; >+} >+ >+=head3 process >+ >+Stage the MARC records for import. >+ >+=cut >+ >+sub process { >+ my ( $self, $args ) = @_; >+ >+ $self->start; >+ >+ my $data = $self->decoded_data; >+ my $borrowernumber = $data->{borrowernumber}; >+ my $patron = Koha::Patrons->find( $borrowernumber ); >+ my $elasticsearch_query = $args->{elasticsearch_query}; >+ my $preferred_format = $args->{preferred_format}; >+ my $searcher = Koha::SearchEngine::Search->new({ >+ index => $Koha::SearchEngine::BIBLIOS_INDEX >+ }); >+ my $elasticsearch = $searcher->get_elasticsearch(); >+ >+ my $results = eval { >+ $elasticsearch->search( >+ index => $searcher->index_name, >+ scroll => '1m', #TODO: Syspref for scroll time limit? >+ size => 1000, #TODO: Syspref for batch size? >+ body => $elasticsearch_query >+ ); >+ }; >+ my @errors; >+ push @errors, $@ if $@; >+ >+ my @docs; >+ my $encoded_results; >+ my %export_links; >+ my $query_string = $elasticsearch_query->{query}->{query_string}->{query}; >+ >+ if (!@errors) { >+ my $scroll_id = $results->{_scroll_id}; >+ while (@{$results->{hits}->{hits}}) { >+ push @docs, @{$results->{hits}->{hits}}; >+ $self->progress( $self->progress + scalar @{$results->{hits}->{hits}} )->store; >+ $results = $elasticsearch->scroll( >+ scroll => '1m', >+ scroll_id => $scroll_id >+ ); >+ } >+ >+ if ($preferred_format eq 'ISO2709' || $preferred_format eq 'MARCXML') { >+ $encoded_results = $searcher->search_documents_encode(\@docs, $preferred_format); >+ } >+ else { >+ $encoded_results->{$preferred_format->{name}} = >+ $searcher->search_documents_custom_format_encode(\@docs, $preferred_format); >+ } >+ >+ my %format_extensions = ( >+ 'ISO2709' => '.mrc', >+ 'MARCXML' => '.xml', >+ ); >+ >+ my $upload_dir = Koha::UploadedFile->permanent_directory; >+ >+ while (my ($format, $data) = each %{$encoded_results}) { >+ my $hash = md5_hex($data); >+ my $category = "search_marc_export"; >+ my $time = strftime "%Y%m%d_%H%M", localtime time; >+ my $ext = exists $format_extensions{$format} ? $format_extensions{$format} : '.txt'; >+ my $filename = $category . '_' . $time . 
$ext; >+ my $file_dir = File::Spec->catfile($upload_dir, $category); >+ if ( !-d $file_dir) { >+ unless(mkpath $file_dir) { >+ push @errors, "Failed to create $file_dir"; >+ next; >+ } >+ } >+ my $filepath = File::Spec->catfile($file_dir, "${hash}_${filename}"); >+ >+ my $fh = IO::File->new($filepath, "w"); >+ >+ if ($fh) { >+ $fh->binmode; >+ print $fh $data; >+ $fh->close; >+ >+ my $size = -s $filepath; >+ my $file = Koha::UploadedFile->new({ >+ hashvalue => $hash, >+ filename => $filename, >+ dir => $category, >+ filesize => $size, >+ owner => $borrowernumber, >+ uploadcategorycode => 'search_marc_export', >+ public => 0, >+ permanent => 1, >+ })->store; >+ my $id = $file->_result()->get_column('id'); >+ $export_links{$format} = "/cgi-bin/koha/tools/upload.pl?op=download&id=$id"; >+ } >+ else { >+ push @errors, "Failed to write \"$filepath\""; >+ } >+ } >+ } >+ my $report = { >+ export_links => \%export_links, >+ total => scalar @docs, >+ errors => \@errors, >+ query_string => $query_string, >+ }; >+ $data->{report} = $report; >+ if (@errors) { >+ $self->set({ progress => 0, status => 'failed' })->store; >+ } >+ else { >+ $self->finish($data); >+ } >+} >+ >+=head3 enqueue >+ >+Enqueue the new job >+ >+=cut >+ >+sub enqueue { >+ my ( $self, $args) = @_; >+ $self->SUPER::enqueue({ >+ job_size => $args->{size}, >+ job_args => $args, >+ job_queue => 'long_tasks', >+ }); >+} >+ >+1; >diff --git a/Koha/SearchEngine/Elasticsearch.pm b/Koha/SearchEngine/Elasticsearch.pm >index 1fe6fb1c6f..163ac54018 100644 >--- a/Koha/SearchEngine/Elasticsearch.pm >+++ b/Koha/SearchEngine/Elasticsearch.pm >@@ -42,8 +42,8 @@ use YAML::XS; > > use List::Util qw( sum0 ); > use MARC::File::XML; >-use MIME::Base64 qw( encode_base64 ); >-use Encode qw( encode ); >+use MIME::Base64 qw(encode_base64 decode_base64); >+use Encode qw(encode decode); > use Business::ISBN; > use Scalar::Util qw( looks_like_number ); > >@@ -542,7 +542,6 @@ sub marc_records_to_documents { > my $control_fields_rules = $rules->{control_fields}; > my $data_fields_rules = $rules->{data_fields}; > my $marcflavour = lc C4::Context->preference('marcflavour'); >- my $use_array = C4::Context->preference('ElasticsearchMARCFormat') eq 'ARRAY'; > > my @record_documents; > >@@ -742,35 +741,21 @@ sub marc_records_to_documents { > } > } > } >+ my $preferred_format = C4::Context->preference('ElasticsearchMARCFormat'); > >- # TODO: Perhaps should check if $records_document non empty, but really should never be the case >- $record->encoding('UTF-8'); >- if ($use_array) { >- $record_document->{'marc_data_array'} = $self->_marc_to_array($record); >- $record_document->{'marc_format'} = 'ARRAY'; >+ my ($encoded_record, $format) = $self->search_document_marc_record_encode( >+ $record, >+ $preferred_format, >+ $marcflavour >+ ); >+ >+ if ($preferred_format eq 'ARRAY') { >+ $record_document->{'marc_data_array'} = $encoded_record; > } else { >- my @warnings; >- { >- # Temporarily intercept all warn signals (MARC::Record carps when record length > 99999) >- local $SIG{__WARN__} = sub { >- push @warnings, $_[0]; >- }; >- $record_document->{'marc_data'} = encode_base64(encode('UTF-8', $record->as_usmarc())); >- } >- if (@warnings) { >- # Suppress warnings if record length exceeded >- unless (substr($record->leader(), 0, 5) eq '99999') { >- foreach my $warning (@warnings) { >- carp $warning; >- } >- } >- $record_document->{'marc_data'} = $record->as_xml_record($marcflavour); >- $record_document->{'marc_format'} = 'MARCXML'; >- } >- else { >- 
$record_document->{'marc_format'} = 'base64ISO2709'; >- } >+ $record_document->{'marc_data'} = $encoded_record; > } >+ $record_document->{'marc_format'} = $format; >+ > > # Check if there is at least one available item > if ($self->index eq $BIBLIOS_INDEX) { >@@ -783,7 +768,6 @@ sub marc_records_to_documents { > onloan => undef, > itemlost => 0, > })->count; >- > $record_document->{available} = $avail_items ? \1 : \0; > } > } >@@ -793,6 +777,287 @@ sub marc_records_to_documents { > return \@record_documents; > } > >+=head2 search_document_marc_record_encode($record, $format, $marcflavour) >+ my ($encoded_record, $format) = search_document_marc_record_encode($record, $format, $marcflavour) >+ >+Encode a MARC::Record to the preferred marc document record format. If record >+exceeds ISO2709 maximum size record size and C<$format> is set to >+'base64ISO2709' format will fallback to 'MARCXML' instead. >+ >+=over 4 >+ >+=item C<$record> >+ >+A MARC::Record object >+ >+=item C<$marcflavour> >+ >+The marcflavour to use >+ >+=back >+ >+=cut >+ >+sub search_document_marc_record_encode { >+ my ($self, $record, $format, $marcflavour) = @_; >+ >+ $record->encoding('UTF-8'); >+ >+ if ($format eq 'ARRAY') { >+ return ($self->_marc_to_array($record), $format); >+ } >+ elsif ($format eq 'base64ISO2709' || $format eq 'ISO2709') { >+ my @warnings; >+ my $marc_data; >+ { >+ # Temporarily intercept all warn signals (MARC::Record carps when record length > 99999) >+ local $SIG{__WARN__} = sub { >+ push @warnings, $_[0]; >+ }; >+ $marc_data = $record->as_usmarc(); >+ } >+ if (@warnings) { >+ # Suppress warnings if record length exceeded >+ unless (substr($record->leader(), 0, 5) eq '99999') { >+ foreach my $warning (@warnings) { >+ carp $warning; >+ } >+ } >+ return (MARC::File::XML::record($record, $marcflavour), 'MARCXML'); >+ } >+ else { >+ if ($format eq 'base64ISO2709') { >+ $marc_data = encode_base64(encode('UTF-8', $marc_data)); >+ } >+ return ($marc_data, $format); >+ } >+ } >+ elsif ($format eq 'MARCXML') { >+ return (MARC::File::XML::record($record, $marcflavour), $format); >+ } >+ else { >+ # This should be unlikely to happen >+ croak "Invalid marc record serialization format: $format"; >+ } >+} >+ >+=head2 search_document_marc_record_decode >+ my $marc_record = $self->search_document_marc_record_decode(@result); >+ >+Extract marc data from Elasticsearch result and decode to MARC::Record object >+ >+=cut >+ >+sub search_document_marc_record_decode { >+ # Result is passed in as array, will get flattened >+ # and first element will be $result >+ my ($self, $result) = @_; >+ if ($result->{marc_format} eq 'base64ISO2709') { >+ return MARC::Record->new_from_usmarc(decode_base64($result->{marc_data})); >+ } >+ elsif ($result->{marc_format} eq 'MARCXML') { >+ return MARC::Record->new_from_xml($result->{marc_data}, 'UTF-8', uc C4::Context->preference('marcflavour')); >+ } >+ elsif ($result->{marc_format} eq 'ARRAY') { >+ return $self->_array_to_marc($result->{marc_data_array}); >+ } >+ else { >+ Koha::Exceptions::Elasticsearch->throw("Missing marc_format field in Elasticsearch result"); >+ } >+} >+ >+=head2 search_documents_encode($docs, $preferred_format) >+ >+ $records_data = $self->search_documents_encode($docs, $preferred_format) >+ >+Return marc encoded records from ElasticSearch search result documents. The return value >+C<$marc_records> is a hashref with encoded records keyed by MARC format. 
>+ >+=over 4 >+ >+=item C<$docs> >+ >+An arrayref of Elasticsearch search documents >+ >+=item C<$preferred_format> >+ >+The preferred marc format: 'MARCXML' or 'ISO2709'. Records exceeding maximum >+length supported by ISO2709 will be exported as 'MARCXML' even if C<$preferred_format> >+is set to 'ISO2709'. >+ >+=back >+ >+=cut >+ >+sub search_documents_encode { >+ >+ my ($self, $docs, $preferred_format) = @_; >+ >+ my %encoded_records = ( >+ 'ISO2709' => [], >+ 'MARCXML' => [] >+ ); >+ >+ unless (exists $encoded_records{$preferred_format}) { >+ croak "Invalid preferred format: $preferred_format"; >+ } >+ >+ for my $es_record (@{$docs}) { >+ # Special optimized cases >+ my $marc_data; >+ my $resulting_format = $preferred_format; >+ if ($preferred_format eq 'MARCXML' && $es_record->{_source}{marc_format} eq 'MARCXML') { >+ $marc_data = $es_record->{_source}{marc_data}; >+ } >+ elsif ($preferred_format eq 'ISO2709' && $es_record->{_source}->{marc_format} eq 'base64ISO2709') { >+ $marc_data = decode_base64($es_record->{_source}->{marc_data}); >+ } >+ else { >+ my $record = $self->search_document_marc_record_decode($es_record->{'_source'}); >+ my $marcflavour = lc C4::Context->preference('marcflavour'); >+ ($marc_data, $resulting_format) = $self->search_document_marc_record_encode($record, $preferred_format, $marcflavour); >+ } >+ push @{$encoded_records{$resulting_format}}, $marc_data; >+ } >+ if (@{$encoded_records{'ISO2709'}}) { >+ $encoded_records{'ISO2709'} = join("", @{$encoded_records{'ISO2709'}}); >+ } >+ else { >+ delete $encoded_records{'ISO2709'}; >+ } >+ >+ if (@{$encoded_records{'MARCXML'}}) { >+ $encoded_records{'MARCXML'} = encode( >+ 'UTF-8', >+ join( >+ "\n", >+ MARC::File::XML::header(), >+ join("\n", @{$encoded_records{'MARCXML'}}), >+ MARC::File::XML::footer() >+ ) >+ ); >+ } >+ else { >+ delete $encoded_records{'MARCXML'}; >+ } >+ >+ return \%encoded_records; >+} >+ >+=head2 search_result_export_custom_formats() >+ >+ $custom_formats = $self->search_result_export_custom_formats() >+ >+Return user defined custom search result export formats. >+ >+=cut >+ >+sub search_result_export_custom_formats { >+ my $export_custom_formats_pref = C4::Context->yaml_preference('SearchResultMARCExportCustomFormats') || []; >+ my $custom_export_formats = {}; >+ >+ if (ref $export_custom_formats_pref eq 'ARRAY') { >+ for (my $i = 0; $i < @{$export_custom_formats_pref}; ++$i) { >+ # TODO: Perhaps validate on save or trow error here instead of just >+ # ignoring invalid formats >+ my $format = $export_custom_formats_pref->[$i]; >+ if ( >+ ref $format->{fields} eq 'ARRAY' && >+ @{$format->{fields}} && >+ $format->{name} >+ ) { >+ $format->{multiple} = 'ignore' unless exists $format->{multiple}; >+ $custom_export_formats->{"custom_$i"} = $format; >+ } >+ } >+ } >+ return $custom_export_formats; >+} >+ >+=head2 search_documents_custom_format_encode($docs, $custom_format) >+ >+ $records_data = $self->search_documents_custom_format_encode($docs, $custom_format) >+ >+Return encoded records from ElasticSearch search result documents using a >+custom format defined in the "SearchResultMARCExportCustomFormats" syspref. >+Returns the encoded records. >+ >+=over 4 >+ >+=item C<$docs> >+ >+An arrayref of Elasticsearch search documents >+ >+=item C<$format> >+ >+A hashref with the custom format definition. 
>+ >+=back >+ >+=cut >+ >+sub search_documents_custom_format_encode { >+ my ($self, $docs, $format) = @_; >+ >+ my $result; >+ >+ my $doc_get_fields = sub { >+ my ($doc, $fields) = @_; >+ my @row; >+ foreach my $field (@{$fields}) { >+ my $values = $doc->{_source}->{$field}; >+ push @row, ref $values eq 'ARRAY' ? $values : ['']; >+ } >+ return \@row; >+ }; >+ >+ my @rows = map { $doc_get_fields->($_, $format->{fields}) } @{$docs}; >+ >+ if($format->{multiple} eq 'ignore') { >+ for (my $i = 0; $i < @rows; ++$i) { >+ $rows[$i] = [map { $_->[0] } @{$rows[$i]}]; >+ } >+ } >+ elsif($format->{multiple} eq 'newline') { >+ if (@{$format->{fields}} == 1) { >+ @rows = map { [join("\n", @{$_->[0]})] } @rows; >+ } >+ else { >+ croak "'newline' is only valid for single field export formats"; >+ } >+ } >+ elsif($format->{multiple} eq 'join') { >+ for (my $i = 0; $i < @rows; ++$i) { >+ # Escape separator >+ for (my $j = 0; $j < @{$rows[$i]}; ++$j) { >+ for (my $k = 0; $k < @{$rows[$i][$j]}; ++$k) { >+ $rows[$i][$j][$k] =~ s/\|/\\|/g; >+ } >+ } >+ # Separate multiple values with "|" >+ $rows[$i] = [map { join("|", @{$_}) } @{$rows[$i]}]; >+ } >+ } >+ else { >+ croak "Invalid 'multiple' option: " . $format->{multiple}; >+ } >+ if (@{$format->{fields}} == 1) { >+ @rows = grep { $_ ne '' } map { $_->[0] } @rows; >+ } >+ else { >+ # Encode CSV >+ for (my $i = 0; $i < @rows; ++$i) { >+ # Escape quotes >+ for (my $j = 0; $j < @{$rows[$i]}; ++$j) { >+ $rows[$i][$j] =~ s/"/""/g; >+ } >+ $rows[$i] = join(',', map { "\"$_\"" } @{$rows[$i]}); >+ } >+ } >+ >+ return encode('UTF-8', join("\n", @rows)); >+} >+ > =head2 _marc_to_array($record) > > my @fields = _marc_to_array($record) >@@ -864,18 +1129,18 @@ sub _array_to_marc { > $record->leader($data->{leader}); > for my $field (@{$data->{fields}}) { > my $tag = (keys %{$field})[0]; >- $field = $field->{$tag}; >+ my $field_data = $field->{$tag}; > my $marc_field; >- if (ref($field) eq 'HASH') { >+ if (ref($field_data) eq 'HASH') { > my @subfields; >- foreach my $subfield (@{$field->{subfields}}) { >+ foreach my $subfield (@{$field_data->{subfields}}) { > my $code = (keys %{$subfield})[0]; > push @subfields, $code; > push @subfields, $subfield->{$code}; > } >- $marc_field = MARC::Field->new($tag, $field->{ind1}, $field->{ind2}, @subfields); >+ $marc_field = MARC::Field->new($tag, $field_data->{ind1}, $field_data->{ind2}, @subfields); > } else { >- $marc_field = MARC::Field->new($tag, $field) >+ $marc_field = MARC::Field->new($tag, $field_data) > } > $record->append_fields($marc_field); > } >diff --git a/Koha/SearchEngine/Elasticsearch/Search.pm b/Koha/SearchEngine/Elasticsearch/Search.pm >index e8e4412b56..20ae712947 100644 >--- a/Koha/SearchEngine/Elasticsearch/Search.pm >+++ b/Koha/SearchEngine/Elasticsearch/Search.pm >@@ -173,7 +173,7 @@ sub search_compat { > my $index = $offset; > my $hits = $results->{'hits'}; > foreach my $es_record (@{$hits->{'hits'}}) { >- $records[$index++] = $self->decode_record_from_result($es_record->{'_source'}); >+ $records[$index++] = $self->search_document_marc_record_decode($es_record->{'_source'}); > } > > # consumers of this expect a name-spaced result, we provide the default >@@ -234,7 +234,7 @@ sub search_auth_compat { > # it's not reproduced here yet. > my $authtype = $rs->single; > my $auth_tag_to_report = $authtype ? 
$authtype->auth_tag_to_report : ""; >- my $marc = $self->decode_record_from_result($record); >+ my $marc = $self->search_document_marc_record_decode($record); > my $mainentry = $marc->field($auth_tag_to_report); > my $reported_tag; > if ($mainentry) { >@@ -354,7 +354,7 @@ sub simple_search_compat { > my @records; > my $hits = $results->{'hits'}; > foreach my $es_record (@{$hits->{'hits'}}) { >- push @records, $self->decode_record_from_result($es_record->{'_source'}); >+ push @records, $self->search_document_marc_record_decode($es_record->{'_source'}); > } > return (undef, \@records, $hits->{'total'}); > } >@@ -374,31 +374,6 @@ sub extract_biblionumber { > return Koha::SearchEngine::Search::extract_biblionumber( $searchresultrecord ); > } > >-=head2 decode_record_from_result >- my $marc_record = $self->decode_record_from_result(@result); >- >-Extracts marc data from Elasticsearch result and decodes to MARC::Record object >- >-=cut >- >-sub decode_record_from_result { >- # Result is passed in as array, will get flattened >- # and first element will be $result >- my ( $self, $result ) = @_; >- if ($result->{marc_format} eq 'base64ISO2709') { >- return MARC::Record->new_from_usmarc(decode_base64($result->{marc_data})); >- } >- elsif ($result->{marc_format} eq 'MARCXML') { >- return MARC::Record->new_from_xml($result->{marc_data}, 'UTF-8', uc C4::Context->preference('marcflavour')); >- } >- elsif ($result->{marc_format} eq 'ARRAY') { >- return $self->_array_to_marc($result->{marc_data_array}); >- } >- else { >- Koha::Exceptions::Elasticsearch->throw("Missing marc_format field in Elasticsearch result"); >- } >-} >- > =head2 max_result_window > > Returns the maximum number of results that can be fetched >diff --git a/catalogue/search.pl b/catalogue/search.pl >index bf225911ef..f387ac91b6 100755 >--- a/catalogue/search.pl >+++ b/catalogue/search.pl >@@ -149,6 +149,7 @@ use C4::Koha qw( getitemtypeimagelocation GetAuthorisedValues ); > use URI::Escape; > use POSIX qw(ceil floor); > use C4::Search qw( searchResults enabled_staff_search_views z3950_search_args new_record_from_zebra ); >+use Koha::BackgroundJob::SearchResultMARCExport; > > use Koha::ItemTypes; > use Koha::Library::Groups; >@@ -161,6 +162,7 @@ use Koha::SearchFilters; > > use URI::Escape; > use JSON qw( decode_json encode_json ); >+use Carp qw(croak); > > my $DisplayMultiPlaceHold = C4::Context->preference("DisplayMultiPlaceHold"); > # create a new CGI object >@@ -725,6 +727,50 @@ for (my $i=0;$i<@servers;$i++) { > } #/end of the for loop > #$template->param(FEDERATED_RESULTS => \@results_array); > >+my $patron = Koha::Patrons->find( $borrowernumber ); >+my $export_enabled = >+ C4::Context->preference('EnableSearchResultMARCExport') && >+ C4::Context->preference('SearchEngine') eq 'Elasticsearch' && >+ $patron && $patron->has_permission({ tools => 'export_catalog' }); >+ >+$template->param(export_enabled => $export_enabled) if $template_name eq 'catalogue/results.tt'; >+ >+if ($export_enabled) { >+ >+ >+ my $export = $cgi->param('export'); >+ my $preferred_format = $cgi->param('export_format'); >+ my $custom_export_formats = $searcher->search_result_export_custom_formats; >+ >+ $template->param(custom_export_formats => $custom_export_formats); >+ >+ # TODO: Need to handle $hits = 0? 
>+ my $hits = $results_hashref->{biblioserver}->{'hits'} // 0; >+ >+ if ($export && $preferred_format && $hits) { >+ unless ( >+ $preferred_format eq 'ISO2709' || >+ $preferred_format eq 'MARCXML' >+ ) { >+ if (!exists $custom_export_formats->{$preferred_format}) { >+ croak "Invalid export format: $preferred_format"; >+ } >+ else { >+ $preferred_format = $custom_export_formats->{$preferred_format}; >+ } >+ } >+ my $size_limit = C4::Context->preference('SearchResultMARCExportLimit') || 0; >+ my %export_query = $size_limit ? (%{$query}, (size => $size_limit)) : %{$query}; >+ my $size = $size_limit && $hits > $size_limit ? $size_limit : $hits; >+ my $export_job_id = Koha::BackgroundJob::SearchResultMARCExport->new->enqueue({ >+ size => $size, >+ preferred_format => $preferred_format, >+ elasticsearch_query => \%export_query >+ }); >+ $template->param(export_job_id => $export_job_id); >+ } >+} >+ > my $gotonumber = $cgi->param('gotoNumber'); > if ( $gotonumber && ( $gotonumber eq 'last' || $gotonumber eq 'first' ) ) { > $template->{'VARS'}->{'gotoNumber'} = $gotonumber; >diff --git a/installer/data/mysql/atomicupdate/bug_27859-add_enable_search_result_marc_export_sysprefs.pl b/installer/data/mysql/atomicupdate/bug_27859-add_enable_search_result_marc_export_sysprefs.pl >new file mode 100755 >index 0000000000..06a56c2f74 >--- /dev/null >+++ b/installer/data/mysql/atomicupdate/bug_27859-add_enable_search_result_marc_export_sysprefs.pl >@@ -0,0 +1,16 @@ >+use Modern::Perl; >+ >+return { >+ bug_number => "27859", >+ description => "Add system preferences", >+ up => sub { >+ my ($args) = @_; >+ my ($dbh, $out) = @$args{qw(dbh out)}; >+ $dbh->do(q{ INSERT IGNORE INTO systempreferences (variable, value, options, explanation, type) VALUES ('EnableSearchResultMARCExport', 1, NULL, 'Enable search result MARC export', 'YesNo') }); >+ $dbh->do(q{ INSERT IGNORE INTO systempreferences (variable, value, options, explanation, type) VALUES ('SearchResultMARCExportCustomFormats', '', NULL, 'Search result MARC export custom formats', 'textarea') }); >+ $dbh->do(q{ INSERT IGNORE INTO systempreferences (variable, value, options, explanation, type) VALUES ('SearchResultMARCExportLimit', NULL, NULL, 'Search result MARC export limit', 'integer') }); >+ $dbh->do(q{ UPDATE systempreferences SET options = 'base64ISO2709|ARRAY' WHERE variable = 'ElasticsearchMARCFormat' }); >+ $dbh->do(q{ UPDATE systempreferences SET value = 'base64ISO2709' WHERE variable = 'ElasticsearchMARCFormat' AND value = 'ISO2709' }); >+ say $out "System preferences added"; >+ }, >+} >diff --git a/installer/data/mysql/mandatory/sysprefs.sql b/installer/data/mysql/mandatory/sysprefs.sql >index beebd27a6f..02359cba9a 100644 >--- a/installer/data/mysql/mandatory/sysprefs.sql >+++ b/installer/data/mysql/mandatory/sysprefs.sql >@@ -204,7 +204,7 @@ INSERT INTO systempreferences ( `variable`, `value`, `options`, `explanation`, ` > ('EdifactLSQ', 'location', 'location|ccode', 'Map EDI sequence code (GIR+LSQ) to Koha Item field', 'Choice'), > ('ElasticsearchIndexStatus_authorities', '0', 'Authorities index status', NULL, NULL), > ('ElasticsearchIndexStatus_biblios', '0', 'Biblios index status', NULL, NULL), >-('ElasticsearchMARCFormat', 'ISO2709', 'ISO2709|ARRAY', 'Elasticsearch MARC format. ISO2709 format is recommended as it is faster and takes less space, whereas array is searchable.', 'Choice'), >+('ElasticsearchMARCFormat', 'base64ISO2709', 'base64ISO2709|ARRAY', 'Elasticsearch MARC format. 
base64ISO2709 format is recommended as it is faster and takes less space, whereas array is searchable.', 'Choice'), > ('ElasticsearchCrossFields', '1', '', 'Enable "cross_fields" option for searches using Elastic search.', 'YesNo'), > ('EmailAddressForPatronRegistrations', '', '', ' If you choose EmailAddressForPatronRegistrations you have to enter a valid email address: ', 'free'), > ('EmailAddressForSuggestions','','',' If you choose EmailAddressForSuggestions you have to enter a valid email address: ','free'), >@@ -220,6 +220,7 @@ INSERT INTO systempreferences ( `variable`, `value`, `options`, `explanation`, ` > ('EnableSearchHistory','0','','Enable or disable search history','YesNo'), > ('EnableItemGroups','0','','Enable the item groups feature','YesNo'), > ('EnableItemGroupHolds','0','','Enable item groups holds feature','YesNo'), >+('EnableSearchResultMARCExport', '1', '', 'Enable search result MARC export', 'YesNo'), > ('EnhancedMessagingPreferences','1','','If ON, allows patrons to select to receive additional messages about items due or nearly due.','YesNo'), > ('EnhancedMessagingPreferencesOPAC', '1', NULL, 'If ON, show patrons messaging setting on the OPAC.', 'YesNo'), > ('ERMModule', '0', NULL, 'Enable the e-resource management module', 'YesNo'), >@@ -639,6 +640,8 @@ INSERT INTO systempreferences ( `variable`, `value`, `options`, `explanation`, ` > ('SearchEngine','Zebra','Elasticsearch|Zebra','Search Engine','Choice'), > ('SearchLimitLibrary', 'homebranch', 'homebranch|holdingbranch|both', "When limiting search results with a library or library group, use the item's home library, or holding library, or both.", 'Choice'), > ('SearchMyLibraryFirst','0',NULL,'If ON, OPAC searches return results limited by the user\'s library by default if they are logged in','YesNo'), >+('SearchResultMARCExportCustomFormats', '', NULL, 'Search result MARC export custom formats', 'textarea'), >+('SearchResultMARCExportLimit', NULL, NULL, 'Search result MARC export limit', 'integer'), > ('SearchWithISBNVariations','0',NULL,'If enabled, search on all variations of the ISBN','YesNo'), > ('SearchWithISSNVariations','0',NULL,'If enabled, search on all variations of the ISSN','YesNo'), > ('SelfCheckAllowByIPRanges','',NULL,'(Leave blank if not used. 
Use ranges or simple ip addresses separated by spaces, like <code>192.168.1.1 192.168.0.0/24</code>.)','Short'), >diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/background_jobs.tt b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/background_jobs.tt >index 2a18857b1b..e9297a05f8 100644 >--- a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/background_jobs.tt >+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/background_jobs.tt >@@ -233,6 +233,10 @@ > '_id': 'marc_import_revert_batch', > '_str': _("Revert import MARC records") > }, >+ { >+ '_id': 'search_result_marc_export', >+ '_str': _("Search result MARC export") >+ }, > ]; > > function get_job_type (job_type) { >diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref >index 593c8c3417..b7ed1b9770 100644 >--- a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref >+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref >@@ -472,9 +472,9 @@ Administration: > - > - "Elasticsearch MARC format: " > - pref: ElasticsearchMARCFormat >- default: "ISO2709" >+ default: "base64ISO2709" > choices: >- "ISO2709": "ISO2709 (exchange format)" >+ "base64ISO2709": "ISO2709 (exchange format)" > "ARRAY": "Searchable array" > - <br>ISO2709 format is recommended as it is faster and takes less space, whereas array format makes the full MARC record searchable. > - <br><strong>NOTE:</strong> Making the full record searchable may have a negative effect on relevance ranking of search results. >diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/searching.pref b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/searching.pref >index 572c3c8417..60f1b20bda 100644 >--- a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/searching.pref >+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/searching.pref >@@ -138,6 +138,40 @@ Searching: > 1: use > 0: "don't use" > - 'the operator "phr" in the callnumber and standard number staff interface searches.' >+ - >+ - pref: EnableSearchResultMARCExport >+ type: boolean >+ default: yes >+ choices: >+ yes: Enable >+ no: Disable >+ - "MARC export of search results. The export will be sent to to the logged in user's email address. Records exceeding the ISO2709 record size will be send at separate MARC XML attachment regardless of chosen export format (ElasticSearch only)." >+ - >+ - pref: SearchResultMARCExportCustomFormats >+ type: textarea >+ syntax: text/x-yaml >+ class: code >+ - "Define custom export formats as a YAML list of associative arrays." 
>+ - "A format have the required properties \"<strong>name</strong>\", \"<strong>fields</strong>\" and an optional \"<strong>multiple</strong>\".<br />" >+ - "<p><strong>name</strong>: the human readable name of the format exposed in the staff interface.</p>" >+ - "<p><strong>fields</strong>: a list of Elasticsearch fields to be included in the export.</p>" >+ - "If <strong>fields</strong> contain a only single field the export result will contain one value per row, for multiple fields a CSV-file will be produced.<br />" >+ - "<p><strong>multiple</strong>: <i>ignore</i>|<i>join</i>|<i>newline</i><br />The behavior when handling fields with multiple values.</p>" >+ - "<p><i>ignore</i> is the default option, only the first value will be included, the rest ignored.</p>" >+ - "<p><i>join</i>, multiple values will be contatenated with tab as a separator.</p>" >+ - "<p><i>newline</i>, a newline will be inserted for each value. This option does not allow \"<strong>fields</strong>\" to contain multiple fields.</p>" >+ - "Example:</br>" >+ - "- name: Biblionumbers<br />" >+ - " fields: [local-number]<br />" >+ - " multiple: ignore<br />" >+ - "- name: Title and author<br />" >+ - " fields: [title, author]<br />" >+ - " multiple: join<br />" >+ - >+ - "Limit exported MARC records from search results to a maximum of" >+ - pref: SearchResultMARCExportLimit >+ class: integer >+ - "records." > Results display: > - > - pref: numSearchResultsDropdown >diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/catalogue/results.tt b/koha-tmpl/intranet-tmpl/prog/en/modules/catalogue/results.tt >index e9fe20d585..f55fadd473 100644 >--- a/koha-tmpl/intranet-tmpl/prog/en/modules/catalogue/results.tt >+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/catalogue/results.tt >@@ -329,6 +329,21 @@ > </div> <!-- /.btn-group --> > [% END %] > >+ [% IF export_enabled %] >+ <div class="btn-group"> >+ <button type="button" class="btn btn-default btn-xs dropdown-toggle" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> >+ Export all results<span class="caret"></span> >+ </button> >+ <ul class="dropdown-menu"> >+ <li><a href="/cgi-bin/koha/catalogue/search.pl?count=[% results_per_page | uri %]&export=1&export_format=ISO2709[% PROCESS sort_search_query %]">MARC (UTF-8)</a></li> >+ <li><a href="/cgi-bin/koha/catalogue/search.pl?count=[% results_per_page | uri %]&export=1&export_format=MARCXML[% PROCESS sort_search_query %]">MARC XML</a></li> >+ [% FOREACH id IN custom_export_formats.keys %] >+ <li><a href="/cgi-bin/koha/catalogue/search.pl?count=[% results_per_page | uri %]&export=1&export_format=[% id | uri %][% PROCESS sort_search_query %]">[% custom_export_formats.$id.name | html %]</a></li> >+ [% END %] >+ </ul> >+ </div> <!-- /.btn-group --> >+ [% END %] >+ > </div> <!-- /#selection_ops --> > </div> <!-- /#searchheader --> > >@@ -362,6 +377,15 @@ > <div class="dialog alert"><p><strong>Error:</strong> [% query_error | html %]</p></div> > [% END %] > >+ [% IF export_job_id %] >+ <div class="dialog message"> >+ <p>Exporting records, the export will be processed as soon as possible.</p> >+ [% INCLUDE "job_progress.inc" job_id=export_job_id %] >+ <p><a class="job_details" href="/cgi-bin/koha/admin/background_jobs.pl?op=view&id=[% export_job_id | uri %]" title="View detail of the enqueued job">View detail of the enqueued job</a> >+ <div id="job_callback"></div> >+ </div> >+ [% END %] >+ > <!-- Search Results Table --> > [% IF ( total ) %] > [% IF ( scan ) %] >@@ -781,7 +805,34 @@ > [% Asset.css("css/humanmsg.css") | 
$raw %] > [% Asset.js("lib/jquery/plugins/humanmsg.js") | $raw %] > [% INCLUDE 'select2.inc' %] >+ [% INCLUDE 'str/job_progess.inc' %] >+ [% Asset.js("js/job_progess.js") | $raw %] > <script> >+ [% IF export_job_id %] >+ updateProgress([% export_job_id | html %], function() { >+ $.getJSON('/api/v1/jobs/[% export_job_id | html %]', function(job) { >+ if (job.data.report.errors.length) { >+ humanMsg.displayMsg( >+ _("Export failed with the following errors: ") + "<br>" + job.data.report.errors.join('<br>'), >+ { className: 'humanError' } >+ ); >+ } >+ else { >+ let export_links = Object.entries(job.data.report.export_links); >+ let export_links_html = export_links.map(([format, href]) => >+ `<p>${format}: <a href=${href}>${href}</a></p>` >+ ).join(''); >+ if (export_links.length > 1) { >+ export_links_html = >+ `<p>${_("Some records exceeded the maximum size supported by ISO2709 and was exported as MARCXML instead")}</p>${export_links_html}`; >+ } >+ $(`<p>${_("Export finished successfully:")}</p>${export_links_html}`) >+ .appendTo("#job_callback"); >+ } >+ }); >+ }); >+ [% END %] >+ > var PREF_AmazonCoverImages = parseInt( "[% Koha.Preference('AmazonCoverImages') | html %]", 10); > var q_array = new Array(); // will hold search terms, if present > var PREF_LocalCoverImages = parseInt( "[% Koha.Preference('LocalCoverImages') | html %]", 10); >diff --git a/t/db_dependent/Koha/SearchEngine/Elasticsearch.t b/t/db_dependent/Koha/SearchEngine/Elasticsearch.t >index 6356d9c20a..3764010ff5 100755 >--- a/t/db_dependent/Koha/SearchEngine/Elasticsearch.t >+++ b/t/db_dependent/Koha/SearchEngine/Elasticsearch.t >@@ -184,10 +184,10 @@ subtest 'get_elasticsearch_mappings() tests' => sub { > > subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents () tests' => sub { > >- plan tests => 63; >+ plan tests => 74; > > t::lib::Mocks::mock_preference('marcflavour', 'MARC21'); >- t::lib::Mocks::mock_preference('ElasticsearchMARCFormat', 'ISO2709'); >+ t::lib::Mocks::mock_preference('ElasticsearchMARCFormat', 'base64ISO2709'); > > my @mappings = ( > { >@@ -380,6 +380,16 @@ subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents () tests' > marc_type => 'marc21', > marc_field => '650(avxyz)', > }, >+ { >+ name => 'local-number', >+ type => 'string', >+ facet => 0, >+ suggestible => 0, >+ searchable => 1, >+ sort => 1, >+ marc_type => 'marc21', >+ marc_field => '999c', >+ }, > ); > > my $se = Test::MockModule->new('Koha::SearchEngine::Elasticsearch'); >@@ -427,6 +437,7 @@ subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents () tests' > MARC::Field->new('952', '', '', 0 => 0, g => '127.20', o => $callno2, l => 2), > MARC::Field->new('952', '', '', 0 => 1, g => '0.00', o => $long_callno, l => 1), > ); >+ > my $marc_record_2 = MARC::Record->new(); > $marc_record_2->leader(' cam 22 a 4500'); > $marc_record_2->append_fields( >@@ -533,7 +544,7 @@ subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents () tests' > ok(defined $docs->[0]->{marc_format}, 'First document marc_format field should be set'); > is($docs->[0]->{marc_format}, 'base64ISO2709', 'First document marc_format should be set correctly'); > >- my $decoded_marc_record = $see->decode_record_from_result($docs->[0]); >+ my $decoded_marc_record = $see->search_document_marc_record_decode($docs->[0]); > > ok($decoded_marc_record->isa('MARC::Record'), "base64ISO2709 record successfully decoded from result"); > is($decoded_marc_record->as_usmarc(), $marc_record_1->as_usmarc(), "Decoded base64ISO2709 record 
has same data as original record"); >@@ -640,8 +651,9 @@ subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents () tests' > MARC::Field->new('100', '', '', a => 'Author 1'), > MARC::Field->new('110', '', '', a => 'Corp Author'), > MARC::Field->new('210', '', '', a => 'Title 1'), >- MARC::Field->new('245', '', '', a => 'Title:', b => 'large record'), >- MARC::Field->new('999', '', '', c => '1234567'), >+ # "|" is for testing escaping for multiple values with custom format >+ MARC::Field->new('245', '', '', a => 'Title:', b => 'large | record'), >+ MARC::Field->new('999', '', '', c => '1234569'), > ); > > my $item_field = MARC::Field->new('952', '', '', o => '123456789123456789123456789', p => '123456789', z => 'test'); >@@ -654,11 +666,100 @@ subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents () tests' > > is($docs->[0]->{marc_format}, 'MARCXML', 'For record exceeding max record size marc_format should be set correctly'); > >- $decoded_marc_record = $see->decode_record_from_result($docs->[0]); >+ $decoded_marc_record = $see->search_document_marc_record_decode($docs->[0]); > > ok($decoded_marc_record->isa('MARC::Record'), "MARCXML record successfully decoded from result"); > is($decoded_marc_record->as_xml_record(), $large_marc_record->as_xml_record(), "Decoded MARCXML record has same data as original record"); > >+ # Search export functionality >+ # Koha::SearchEngine::Elasticsearch::search_documents_encode() >+ my @source_docs = ($marc_record_1, $marc_record_2, $large_marc_record); >+ my @es_response_docs; >+ my $records_data; >+ >+ for my $es_marc_format ('MARCXML', 'ARRAY', 'base64ISO2709') { >+ >+ t::lib::Mocks::mock_preference('ElasticsearchMARCFormat', $es_marc_format); >+ >+ $docs = $see->marc_records_to_documents(\@source_docs); >+ >+ # Emulate Elasticsearch response docs structure >+ @es_response_docs = map { { _source => $_ } } @{$docs}; >+ >+ $records_data = $see->search_documents_encode(\@es_response_docs, 'ISO2709'); >+ >+ # $large_marc_record should not have been encoded as ISO2709 >+ # since exceeds maximum size, see above >+ my @tmp = ($marc_record_1, $marc_record_2); >+ is( >+ $records_data->{ISO2709}, >+ join('', map { $_->as_usmarc() } @tmp), >+ "ISO2709 encoded records from ElasticSearch result are identical with source records using index format \"$es_marc_format\"" >+ ); >+ >+ my $expected_marc_xml = join("\n", >+ MARC::File::XML::header(), >+ MARC::File::XML::record($large_marc_record, 'MARC21'), >+ MARC::File::XML::footer() >+ ); >+ >+ is( >+ $records_data->{MARCXML}, >+ $expected_marc_xml, >+ "Record from search result encoded as MARCXML since exceeding ISO2709 maximum size is identical with source record using index format \"$es_marc_format\"" >+ ); >+ >+ $records_data = $see->search_documents_encode(\@es_response_docs, 'MARCXML'); >+ >+ $expected_marc_xml = join("\n", >+ MARC::File::XML::header(), >+ join("\n", map { MARC::File::XML::record($_, 'MARC21') } @source_docs), >+ MARC::File::XML::footer() >+ ); >+ >+ is( >+ $records_data->{MARCXML}, >+ $expected_marc_xml, >+ "MARCXML encoded records from ElasticSearch result are identical with source records using index format \"$es_marc_format\"" >+ ); >+ } >+ >+ my $custom_formats = <<'END'; >+- name: Biblionumbers >+ fields: [local-number] >+ multiple: ignore >+- name: Title and author >+ fields: [title, author] >+ multiple: join >+END >+ t::lib::Mocks::mock_preference('SearchResultMARCExportCustomFormats', $custom_formats); >+ $custom_formats = 
C4::Context->yaml_preference('SearchResultMARCExportCustomFormats'); >+ >+ # Biblionumbers custom format >+ $records_data = $see->search_documents_custom_format_encode(\@es_response_docs, $custom_formats->[0]); >+ # UTF-8 encode? >+ is( >+ $records_data, >+ "1234567\n1234568\n1234569", >+ "Records where correctly encoded for the custom format \"Biblionumbers\"" >+ ); >+ >+ # Title and author custom format >+ $records_data = $see->search_documents_custom_format_encode(\@es_response_docs, $custom_formats->[1]); >+ >+ my $encoded_data = join( >+ "\n", >+ "\"Title:|first record|Title: first record\",\"Author 1|Corp Author\"", >+ "\"\",\"Author 2\"", >+ "\"Title:|large \\| record|Title: large \\| record\",\"Author 1|Corp Author\"" >+ ); >+ >+ is( >+ $records_data, >+ $encoded_data, >+ "Records where correctly encoded for the custom format \"Title and author\"" >+ ); >+ > push @mappings, { > name => 'title', > type => 'string', >@@ -791,7 +892,7 @@ subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents_array () t > > is($docs->[0]->{marc_format}, 'ARRAY', 'First document marc_format should be set correctly'); > >- my $decoded_marc_record = $see->decode_record_from_result($docs->[0]); >+ my $decoded_marc_record = $see->search_document_marc_record_decode($docs->[0]); > > ok($decoded_marc_record->isa('MARC::Record'), "ARRAY record successfully decoded from result"); > is($decoded_marc_record->as_usmarc(), $marc_record_1->as_usmarc(), "Decoded ARRAY record has same data as original record"); >@@ -802,7 +903,7 @@ subtest 'Koha::SearchEngine::Elasticsearch::marc_records_to_documents () authori > plan tests => 5; > > t::lib::Mocks::mock_preference('marcflavour', 'MARC21'); >- t::lib::Mocks::mock_preference('ElasticsearchMARCFormat', 'ISO2709'); >+ t::lib::Mocks::mock_preference('ElasticsearchMARCFormat', 'base64ISO2709'); > > my $builder = t::lib::TestBuilder->new; > my $auth_type = $builder->build_object({ class => 'Koha::Authority::Types', value =>{ >-- >2.35.1
Attachments on bug 27859: 117718 | 117721 | 124892 | 124893 | 128632 | 131442 | 131443 | 132993 | 132997 | 133076 | 145243 | 145244 | 145263 | 145300 | 147927 | 147935 | 147936 | 162342 | 162343 | 163761 | 163762