Bugzilla – Attachment 31128 Details for Bug 12478 – Elasticsearch support for Koha
Description: Bug 12478 - an amalgamation of all the Elasticsearch code so far
Filename:    Bug-12478---an-almagamation-of-all-the-Elasticsear.patch
MIME Type:   text/plain
Creator:     Robin Sheat
Created:     2014-08-25 03:59:19 UTC
Size:        104.37 KB
Flags:       patch, obsolete
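Before the diff itself, a minimal usage sketch of the new modules, pieced together from their SYNOPSIS sections below. The 'biblios' index name comes from the Indexer's own SYNOPSIS; the biblionumber is an illustrative assumption, and the sketch presumes the <elasticsearch> block in koha-conf.xml documented in Koha::ElasticSearch. It is not part of the patch.

    # Index a biblio, then search for it (assembled from the SYNOPSIS
    # sections of the modules this patch adds; illustrative only).
    use Modern::Perl;
    use Koha::Biblio;
    use Koha::ElasticSearch::Indexer;
    use Koha::ElasticSearch::Search;
    use Koha::SearchEngine::Elasticsearch::QueryBuilder;

    my $biblionumber = 1;    # assumed example biblionumber
    my $record = Koha::Biblio->get_marc_biblio( $biblionumber, item_data => 1 );

    my $indexer = Koha::ElasticSearch::Indexer->new( { index => 'biblios' } );
    $indexer->update_index( [$biblionumber], [$record] );

    my $builder  = Koha::SearchEngine::Elasticsearch::QueryBuilder->new();
    my $searcher = Koha::ElasticSearch::Search->new( { index => 'biblios' } );
    my $results  = $searcher->search( $builder->build_query('perl') );
    print "There were " . $results->total . " results.\n";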
From 7dbbdf5b0edfbaaf78b6532fa58e7867250eab12 Mon Sep 17 00:00:00 2001
From: Robin Sheat <robin@catalyst.net.nz>
Date: Mon, 9 Dec 2013 09:38:08 +1300
Subject: [PATCH] Bug 12478 - an amalgamation of all the Elasticsearch code so
 far

---
 C4/Biblio.pm                                       |   8 +
 C4/Search.pm                                       |  10 +-
 Koha/Biblio.pm                                     | 105 +++++
 Koha/Biblio/Iterator.pm                            | 126 ++++++
 Koha/Database.pm                                   |  67 +--
 Koha/ElasticSearch.pm                              | 323 +++++++++++++
 Koha/ElasticSearch/Indexer.pm                      | 155 +++++++
 Koha/ElasticSearch/Search.pm                       | 230 ++++++++++
 Koha/ItemType.pm                                   |  70 +++
 Koha/ItemTypes.pm                                  | 113 +++++
 Koha/Schema/Result/ElasticsearchMapping.pm         | 105 +++++
 Koha/SearchEngine/Elasticsearch/QueryBuilder.pm    | 498 +++++++++++++++++++++
 Koha/SearchEngine/QueryBuilderRole.pm              |   3 +
 Koha/SearchEngine/Zebra.pm                         |   3 +-
 Koha/SearchEngine/Zebra/QueryBuilder.pm            |   7 +
 Koha/SearchEngine/Zebra/Search.pm                  |  42 +-
 installer/data/mysql/elasticsearch_mapping.sql     | 148 ++++++
 installer/data/mysql/kohastructure.sql             |  15 +
 .../prog/en/modules/admin/preferences/admin.pref   |   1 +
 .../opac-tmpl/prog/en/modules/search/results.tt    |  26 +-
 misc/search_tools/rebuild_elastic_search.pl        | 148 ++++++
 myfix.txt                                          |   3 +
 opac/elasticsearch.pl                              | 102 +++++
 opac/opac-search.pl                                |  41 +-
 t/Koha/ItemType.pm                                 |  46 ++
 t/Koha_ElasticSearch.t                             |  23 +
 t/Koha_ElasticSearch_Indexer.t                     |  51 +++
 t/Koha_ElasticSearch_Search.t                      |  38 ++
 t/db_dependent/Koha/ItemTypes.pm                   |  65 +++
 29 files changed, 2484 insertions(+), 88 deletions(-)
 create mode 100644 Koha/Biblio.pm
 create mode 100644 Koha/Biblio/Iterator.pm
 create mode 100644 Koha/ElasticSearch.pm
 create mode 100644 Koha/ElasticSearch/Indexer.pm
 create mode 100644 Koha/ElasticSearch/Search.pm
 create mode 100644 Koha/ItemType.pm
 create mode 100644 Koha/ItemTypes.pm
 create mode 100644 Koha/Schema/Result/ElasticsearchMapping.pm
 create mode 100644 Koha/SearchEngine/Elasticsearch/QueryBuilder.pm
 create mode 100644 installer/data/mysql/elasticsearch_mapping.sql
 create mode 100755 misc/search_tools/rebuild_elastic_search.pl
 create mode 100644 myfix.txt
 create mode 100755 opac/elasticsearch.pl
 create mode 100755 t/Koha/ItemType.pm
 create mode 100644 t/Koha_ElasticSearch.t
 create mode 100644 t/Koha_ElasticSearch_Indexer.t
 create mode 100644 t/Koha_ElasticSearch_Search.t
 create mode 100755 t/db_dependent/Koha/ItemTypes.pm

diff --git a/C4/Biblio.pm b/C4/Biblio.pm
index eb2ee2f..f3b7d98 100644
--- a/C4/Biblio.pm
+++ b/C4/Biblio.pm
@@ -3400,6 +3400,14 @@ sub ModBiblioMarc {
     $sth = $dbh->prepare("UPDATE biblioitems SET marc=?,marcxml=? WHERE biblionumber=?");
     $sth->execute( $record->as_usmarc(), $record->as_xml_record($encoding), $biblionumber );
     $sth->finish;
+    if ( C4::Context->preference('SearchEngine') eq 'ElasticSearch' ) {
+# shift to its own sub, so it can do it realtime or queue
+        can_load( modules => { 'Koha::ElasticSearch::Indexer' => undef } );
+        # need to get this from syspref probably biblio/authority for index
+        my $indexer = Koha::ElasticSearch::Indexer->new();
+        my $records = [$record];
+        $indexer->update_index([$biblionumber], $records);
+    }
     ModZebra( $biblionumber, "specialUpdate", "biblioserver" );
     return $biblionumber;
 }
diff --git a/C4/Search.pm b/C4/Search.pm
index c704c63..31caf98 100644
--- a/C4/Search.pm
+++ b/C4/Search.pm
@@ -2392,9 +2392,9 @@ sub _ZOOM_event_loop {
     }
 }
 
-=head2 new_record_from_zebra
+=head2 new_record_from_searchengine
 
-Given raw data from a Zebra result set, return a MARC::Record object
+Given raw data from a search engine result set, return a MARC::Record object
 
 This helper function is needed to take into account all the involved
 system preferences and configuration variables to properly create the
@@ -2403,6 +2403,8 @@ MARC::Record object.
 If we are using GRS-1, then the raw data we get from Zebra should be USMARC
 data. If we are using DOM, then it has to be MARCXML.
 
+If we are using Elasticsearch, it'll already be a MARC::Record.
+
 =cut
 
 sub new_record_from_zebra {
@@ -2413,6 +2415,10 @@ sub new_record_from_zebra {
     my $index_mode = ( $server eq 'biblioserver' )
         ? C4::Context->config('zebra_bib_index_mode') // 'grs1'
         : C4::Context->config('zebra_auth_index_mode') // 'dom';
+    my $search_engine = C4::Context->preference("SearchEngine");
+    if ($search_engine eq 'Elasticsearch') {
+        return $raw_data;
+    }
 
     my $marc_record = eval {
         if ( $index_mode eq 'dom' ) {
diff --git a/Koha/Biblio.pm b/Koha/Biblio.pm
new file mode 100644
index 0000000..4c7b592
--- /dev/null
+++ b/Koha/Biblio.pm
@@ -0,0 +1,105 @@
+package Koha::Biblio;
+
+# This contains functions to do with managing biblio records.
+
+# Copyright 2014 Catalyst IT
+#
+# This file is part of Koha.
+#
+# Koha is free software; you can redistribute it and/or modify it under the
+# terms of the GNU General Public License as published by the Free Software
+# Foundation; either version 3 of the License, or (at your option) any later
+# version.
+#
+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with Koha; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+=head1 NAME
+
+Koha::Biblio - contains fundamental biblio-related functions
+
+=head1 DESCRIPTION
+
+This contains functions for normal operations on biblio records.
+
+Note: really, C4::Biblio does the main functions, but the Koha namespace is
+the new thing that should be used.
+
+=cut
+
+use C4::Biblio; # EmbedItemsInMarcBiblio
+use Koha::Biblio::Iterator;
+use Koha::Database;
+use Modern::Perl;
+
+use base qw(Class::Accessor);
+
+__PACKAGE__->mk_accessors(qw());
+
+=head1 FUNCTIONS
+
+=head2 get_all_biblios_iterator
+
+    my $it = get_all_biblios_iterator();
+
+This will provide an iterator object that will, one by one, provide the
+MARC::Record of each biblio. This will include the item data.
+
+The iterator is a Koha::Biblio::Iterator object.
+
+=cut
+
+sub get_all_biblios_iterator {
+    my $database = Koha::Database->new();
+    my $schema   = $database->schema();
+    my $rs =
+      $schema->resultset('Biblioitem')->search( { marc => { '!=', undef } },
+        { columns => [qw/ biblionumber marc /] } );
+    return Koha::Biblio::Iterator->new($rs, items => 1);
+}
+
+=head2 get_marc_biblio
+
+    my $marc = Koha::Biblio->get_marc_biblio($bibnum, %options);
+
+This fetches the MARC::Record for the given biblio number. Nothing is returned
+if the biblionumber couldn't be found (or it somehow has no MARC data.)
+
+Options are:
+
+=over 4
+
+=item item_data
+
+If set to true, item data is embedded in the record. Default is to not do this.
+
+=back
+
+=cut
+
+sub get_marc_biblio {
+    my ( $class, $bibnum, %options ) = @_;
+
+    my $database = Koha::Database->new();
+    my $schema   = $database->schema();
+    my $rs =
+      $schema->resultset('Biblioitem')
+      ->search( { marc => { '!=', undef }, biblionumber => $bibnum },
+        { columns => [qw/ marc /] } );
+
+    my $row = $rs->next();
+    return unless $row;
+    my $marc = MARC::Record->new_from_usmarc($row->marc);
+
+    # TODO implement this in this module
+    C4::Biblio::EmbedItemsInMarcBiblio($marc, $bibnum) if $options{item_data};
+
+    return $marc;
+}
+
+1;
diff --git a/Koha/Biblio/Iterator.pm b/Koha/Biblio/Iterator.pm
new file mode 100644
index 0000000..fc150f4
--- /dev/null
+++ b/Koha/Biblio/Iterator.pm
@@ -0,0 +1,126 @@
+package Koha::Biblio::Iterator;
+
+# This contains an iterator over biblio records
+
+# Copyright 2014 Catalyst IT
+#
+# This file is part of Koha.
+#
+# Koha is free software; you can redistribute it and/or modify it under the
+# terms of the GNU General Public License as published by the Free Software
+# Foundation; either version 3 of the License, or (at your option) any later
+# version.
+#
+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with Koha; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+=head1 NAME
+
+Koha::Biblio::Iterator - iterates over biblios provided by a DBIx::Class::ResultSet
+
+=head1 DESCRIPTION
+
+This provides an iterator that gives the MARC::Record of each biblio that's
+returned by a L<DBIx::Class::ResultSet> that provides a C<biblionumber>, and
+C<marc> or C<marcxml> column from the biblioitems table.
+
+=head1 SYNOPSIS
+
+    use Koha::Biblio::Iterator;
+    my $rs = $schema->resultset('biblioitems');
+    my $iterator = Koha::Biblio::Iterator->new($rs);
+    while (my $record = $iterator->next()) {
+        # do something with $record
+    }
+
+=head1 METHODS
+
+=cut
+
+use C4::Biblio;    # :( - for EmbedItemsInMarcBiblio
+
+use Carp;
+use MARC::Record;
+use MARC::File::XML;
+use Modern::Perl;
+
+=head2 new
+
+    my $it = new($rs, option => $value, ...);
+
+Takes a ResultSet to iterate over, and gives you an iterator on it. Optional
+options may be specified.
+
+=head3 Options
+
+=over 4
+
+=item items
+
+Set to true to include item data in the resulting MARC record.
+
+=back
+
+=cut
+
+sub new {
+    my ( $class, $rs, %options ) = @_;
+
+    bless {
+        rs => $rs,
+        %options,
+    }, $class;
+}
+
+=head2 next()
+
+In a scalar context, provides the next MARC::Record from the ResultSet, or
+C<undef> if there are no more.
+
+In a list context it will provide ($biblionumber, $record).
+
+=cut
+
+sub next {
+    my ($self) = @_;
+
+    my $marc;
+    my $row = $self->{rs}->next();
+    return if !$row;
+    if ( $row->marc ) {
+        $marc = MARC::Record->new_from_usmarc( $row->marc );
+    }
+    elsif ( $row->marcxml ) {
+        $marc = MARC::Record->new_from_xml( $row->marcxml );
+    }
+    else {
+        confess "No marc or marcxml column returned in the request.";
+    }
+
+    my $bibnum;
+    if ( $self->{items} ) {
+        $bibnum = $row->get_column('biblionumber');
+        confess "No biblionumber column returned in the request."
+          if ( !defined($bibnum) );
+
+        # TODO this should really be in Koha::Biblio or something similar.
+        C4::Biblio::EmbedItemsInMarcBiblio( $marc, $bibnum );
+    }
+
+    if (wantarray) {
+        $bibnum //= $row->get_column('biblionumber');
+        confess "No biblionumber column returned in the request."
+          if ( !defined($bibnum) );
+        return ( $bibnum, $marc );
+    }
+    else {
+        return $marc;
+    }
+}
+
+1;
diff --git a/Koha/Database.pm b/Koha/Database.pm
index 92e83ec..ed0b6e8 100644
--- a/Koha/Database.pm
+++ b/Koha/Database.pm
@@ -40,6 +40,8 @@ use base qw(Class::Accessor);
 
 __PACKAGE__->mk_accessors(qw( ));
 
+our $schema;    # the schema is a singleton
+
 # _new_schema
 # Internal helper function (not a method!). This creates a new
 # database connection from the data given in the current context, and
@@ -64,22 +66,21 @@ creates one, and connects to the database.
 
 This database handle is cached for future use: if you call
 C<$database-E<gt>schema> twice, you will get the same handle both
-times. If you need a second database handle, use C<&new_schema> and
-possibly C<&set_schema>.
+times. If you need a second database handle, use C<&new_schema>.
 
 =cut
 
 sub schema {
     my $self = shift;
     my $sth;
-    if ( defined( $self->{"schema"} ) && $self->{"schema"}->storage->connected() ) {
-        return $self->{"schema"};
+    if ( defined( $schema ) && $schema->storage->connected() ) {
+        return $schema;
     }
 
     # No database handle or it died . Create one.
-    $self->{"schema"} = &_new_schema();
+    $schema = &_new_schema();
 
-    return $self->{"schema"};
+    return $schema;
 }
 
 =head2 new_schema
@@ -102,60 +103,6 @@ sub new_schema {
     return &_new_schema();
 }
 
-=head2 set_schema
-
-    $my_schema = $database->new_schema;
-    $database->set_schema($my_schema);
-    ...
-    $database->restore_schema;
-
-C<&set_schema> and C<&restore_schema> work in a manner analogous to
-C<&set_context> and C<&restore_context>.
-
-C<&set_schema> saves the current database handle on a stack, then sets
-the current database handle to C<$my_schema>.
-
-C<$my_schema> is assumed to be a good database handle.
-
-=cut
-
-sub set_schema {
-    my $self = shift;
-    my $new_schema = shift;
-
-    # Save the current database handle on the handle stack.
-    # We assume that $new_schema is all good: if the caller wants to
-    # screw himself by passing an invalid handle, that's fine by
-    # us.
-    push @{ $self->{"schema_stack"} }, $self->{"schema"};
-    $self->{"schema"} = $new_schema;
-}
-
-=head2 restore_schema
-
-    $database->restore_schema;
-
-Restores the database handle saved by an earlier call to
-C<$database-E<gt>set_schema>.
-
-=cut
-
-sub restore_schema {
-    my $self = shift;
-
-    if ( $#{ $self->{"schema_stack"} } < 0 ) {
-
-        # Stack underflow
-        die "SCHEMA stack underflow";
-    }
-
-    # Pop the old database handle and set it.
-    $self->{"schema"} = pop @{ $self->{"schema_stack"} };
-
-    # FIXME - If it is determined that restore_context should
-    # return something, then this function should, too.
-}
-
 =head2 EXPORT
 
 None by default.
diff --git a/Koha/ElasticSearch.pm b/Koha/ElasticSearch.pm
new file mode 100644
index 0000000..f2f287f
--- /dev/null
+++ b/Koha/ElasticSearch.pm
@@ -0,0 +1,323 @@
+package Koha::ElasticSearch;
+
+# Copyright 2013 Catalyst IT
+#
+# This file is part of Koha.
+#
+# Koha is free software; you can redistribute it and/or modify it under the
+# terms of the GNU General Public License as published by the Free Software
+# Foundation; either version 3 of the License, or (at your option) any later
+# version.
+#
+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with Koha; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+use base qw(Class::Accessor);
+
+use C4::Context;
+use Carp;
+use Elasticsearch;
+use Koha::Database;
+use Modern::Perl;
+
+use Data::Dumper;    # TODO remove
+
+__PACKAGE__->mk_ro_accessors(qw( index ));
+
+=head1 NAME
+
+Koha::ElasticSearch - Base module for things using elasticsearch
+
+=head1 ACCESSORS
+
+=over 4
+
+=item index
+
+The name of the index to use, generally 'biblios' or 'authorities'.
+
+=back
+
+=head1 FUNCTIONS
+
+=cut
+
+sub new {
+    my $class = shift @_;
+    my $self = $class->SUPER::new(@_);
+    # Check for a valid index
+    croak('No index name provided') unless $self->index;
+    return $self;
+}
+
+=head2 get_elasticsearch_params
+
+    my $params = $self->get_elasticsearch_params();
+
+This provides a hashref that contains the parameters for connecting to the
+Elasticsearch servers, in the form:
+
+    {
+        'servers' => ['127.0.0.1:9200', 'anotherserver:9200'],
+        'index_name' => 'koha_instance',
+    }
+
+This is configured by the following in the C<config> block in koha-conf.xml:
+
+    <elasticsearch>
+        <server>127.0.0.1:9200</server>
+        <server>anotherserver:9200</server>
+        <index_name>koha_instance</index_name>
+    </elasticsearch>
+
+=cut
+
+sub get_elasticsearch_params {
+    my ($self) = @_;
+
+    # Copy the hash so that we're not modifying the original
+    my $es = { %{ C4::Context->config('elasticsearch') } };
+    die "No 'elasticsearch' block is defined in koha-conf.xml.\n" if ( !$es );
+
+    # Helpfully, the multiple server lines end up in an array for us anyway
+    # if there are multiple ones, but not if there's only one.
+    my $server = $es->{server};
+    delete $es->{server};
+    if ( ref($server) eq 'ARRAY' ) {
+
+        # store it called 'servers'
+        $es->{servers} = $server;
+    }
+    elsif ($server) {
+        $es->{servers} = [$server];
+    }
+    else {
+        die "No elasticsearch servers were specified in koha-conf.xml.\n";
+    }
+    die "No elasticsearch index_name was specified in koha-conf.xml.\n"
+      if ( !$es->{index_name} );
+    # Append the name of this particular index to our namespace
+    $es->{index_name} .= '_' . $self->index;
+    return $es;
+}
+
+=head2 get_elasticsearch_settings
+
+    my $settings = $self->get_elasticsearch_settings();
+
+This provides the settings provided to elasticsearch when an index is created.
+These can do things like define tokenisation methods.
+
+A hashref containing the settings is returned.
+
+=cut
+
+sub get_elasticsearch_settings {
+    my ($self) = @_;
+
+    # Ultimately this should come from a file or something, and not be
+    # hardcoded.
+    my $settings = {
+        index => {
+            analysis => {
+                analyzer => {
+                    analyser_phrase => {
+                        tokenizer => 'keyword',
+                        filter    => 'lowercase',
+                    },
+                    analyser_standard => {
+                        tokenizer => 'standard',
+                        filter    => 'lowercase',
+                    }
+                }
+            }
+        }
+    };
+    return $settings;
+}
+
+=head2 get_elasticsearch_mappings
+
+    my $mappings = $self->get_elasticsearch_mappings();
+
+This provides the mappings that get passed to elasticsearch when an index is
+created.
+
+=cut
+
+sub get_elasticsearch_mappings {
+    my ($self) = @_;
+
+    my $mappings = {
+        data => {
+            properties => {
+                record => {
+                    store          => "yes",
+                    include_in_all => "false",
+                    type           => "string",
+                },
+            }
+        }
+    };
+    $self->_foreach_mapping(
+        sub {
+            my ( undef, $name, $type, $facet ) = @_;
+
+            # TODO if this gets any sort of complexity to it, it should
+            # be broken out into its own function.
+
+            # TODO be aware of date formats, but this requires pre-parsing
+            # as ES will simply reject anything with an invalid date.
+            my $es_type =
+                $type eq 'boolean'
+              ? 'boolean'
+              : 'string';
+            $mappings->{data}{properties}{$name} = {
+                search_analyzer => "analyser_standard",
+                index_analyzer  => "analyser_standard",
+                type            => $es_type,
+                fields          => {
+                    phrase => {
+                        search_analyzer => "analyser_phrase",
+                        index_analyzer  => "analyser_phrase",
+                        type            => "string"
+                    },
+                },
+            };
+            $mappings->{data}{properties}{$name}{null_value} = 0
+              if $type eq 'boolean';
+            if ($facet) {
+                $mappings->{data}{properties}{ $name . '__facet' } = {
+                    type  => "string",
+                    index => "not_analyzed",
+                };
+            }
+        }
+    );
+    return $mappings;
+}
+
+# Provides the rules for data conversion.
+sub get_fixer_rules {
+    my ($self) = @_;
+
+    my $marcflavour = lc C4::Context->preference('marcflavour');
+    my @rules;
+    $self->_foreach_mapping(
+        sub {
+            my ( undef, $name, $type, $facet, $marcs ) = @_;
+            my $field = $marcs->{$marcflavour};
+            return unless defined $marcs->{$marcflavour};
+            my $options = '';
+
+            # There's a bug when using 'split' with something that
+            # selects a range
+            # The split makes everything into nested arrays, but that's not
+            # really a big deal, ES doesn't mind.
+            $options = '-split => 1' unless $field =~ m|_/| || $type eq 'sum';
+            push @rules, "marc_map('$field','${name}', $options)";
+            if ($facet) {
+                push @rules, "marc_map('$field','${name}__facet', $options)";
+            }
+            if ( $type eq 'boolean' ) {
+
+                # boolean gets special handling, basically if it doesn't exist,
+                # it's added and set to false. Otherwise we can't query it.
+                push @rules,
+                  "unless exists('$name') add_field('$name', 0) end";
+            }
+            if ( $type eq 'sum' ) {
+                push @rules, "sum('$name')";
+            }
+        }
+    );
+
+    return \@rules;
+}
+
+=head2 _foreach_mapping
+
+    $self->_foreach_mapping(
+        sub {
+            my ( $id, $name, $type, $facet, $marcs ) = @_;
+            my $marc = $marcs->{marc21};
+        }
+    );
+
+This allows you to apply a function to each entry in the elasticsearch mappings
+table, in order to build the mappings for whatever is needed.
+
+In the provided function, the fields are:
+
+=over 4
+
+=item C<$id>
+
+An ID number, corresponding to the entry in the database.
+
+=item C<$name>
+
+The field name for elasticsearch (corresponds to the 'mapping' column in the
+database).
+
+=item C<$type>
+
+The type for this value, e.g. 'string'.
+
+=item C<$facet>
+
+True if this value should be facetised. This only really makes sense if the
+field is understood by the facet processing code anyway.
+
+=item C<$marcs>
+
+A hashref containing the MARC field specifiers for each MARC type. It's quite
+possible for this to be undefined if there is otherwise an entry in a
+different MARC form.
+
+=back
+
+=cut
+
+sub _foreach_mapping {
+    my ( $self, $sub ) = @_;
+
+    # TODO use a caching framework here
+    my $database = Koha::Database->new();
+    my $schema   = $database->schema();
+    my $rs = $schema->resultset('ElasticsearchMapping')->search();
+    for my $row ( $rs->all ) {
+        $sub->(
+            $row->id,
+            $row->mapping,
+            $row->type,
+            $row->facet,
+            {
+                marc21  => $row->marc21,
+                unimarc => $row->unimarc,
+                normarc => $row->normarc
+            }
+        );
+    }
+}
+
+1;
+
+__END__
+
+=head1 AUTHOR
+
+=over 4
+
+=item Chris Cormack C<< <chrisc@catalyst.net.nz> >>
+
+=item Robin Sheat C<< <robin@catalyst.net.nz> >>
+
+=back
+
+=cut
diff --git a/Koha/ElasticSearch/Indexer.pm b/Koha/ElasticSearch/Indexer.pm
new file mode 100644
index 0000000..b7d6097
--- /dev/null
+++ b/Koha/ElasticSearch/Indexer.pm
@@ -0,0 +1,155 @@
+package Koha::ElasticSearch::Indexer;
+
+# Copyright 2013 Catalyst IT
+#
+# This file is part of Koha.
+#
+# Koha is free software; you can redistribute it and/or modify it under the
+# terms of the GNU General Public License as published by the Free Software
+# Foundation; either version 3 of the License, or (at your option) any later
+# version.
+#
+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with Koha; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+use Carp;
+use Modern::Perl;
+use base qw(Koha::ElasticSearch);
+use Data::Dumper;
+
+# For now just marc, but we can do anything here really
+use Catmandu::Fix;
+use Catmandu::Importer::MARC;
+use Catmandu::Store::ElasticSearch;
+use MARC::Field;
+
+Koha::ElasticSearch::Indexer->mk_accessors(qw( store ));
+
+=head1 NAME
+
+Koha::ElasticSearch::Indexer - handles adding new records to the index
+
+=head1 SYNOPSIS
+
+    my $indexer = Koha::ElasticSearch::Indexer->new({ index => 'biblios' });
+    $indexer->delete_index();
+    $indexer->update_index(\@biblionumbers, \@records);
+
+=head1 FUNCTIONS
+
+=cut
+
+=head2 $indexer->update_index($biblionums, $records);
+
+C<$biblionums> is an arrayref containing the biblionumbers for the records.
+
+C<$records> is an arrayref containing the L<MARC::Record>s themselves.
+
+The values in the arrays must match up, and the 999$c value in the MARC record
+will be rewritten using the values in C<$biblionums> to ensure they are correct.
+If C<$biblionums> is C<undef>, this won't happen, but you should be sure that
+999$c is correct on your own then.
+
+Note that this will modify the original record if C<$biblionums> is supplied.
+If that's a problem, clone them first.
+
+=cut
+
+sub update_index {
+    my ($self, $biblionums, $records) = @_;
+
+    if ($biblionums) {
+        $self->_sanitise_records($biblionums, $records);
+    }
+
+    my $from = $self->_convert_marc_to_json($records);
+    if ( !$self->store ) {
+        my $params = $self->get_elasticsearch_params();
+        $self->store(
+            Catmandu::Store::ElasticSearch->new(
+                %$params,
+                index_settings => $self->get_elasticsearch_settings(),
+                index_mappings => $self->get_elasticsearch_mappings(),
+                #trace_calls => 1,
+            )
+        );
+    }
+    $self->store->bag->add_many($from);
+    $self->store->bag->commit;
+    return 1;
+}
+
+=head2 $indexer->delete_index();
+
+Deletes the index from the elasticsearch server. Calling C<update_index>
+after this will recreate it again.
+
+=cut
+
+sub delete_index {
+    my ($self) = @_;
+
+    if (!$self->store) {
+        # If this index doesn't exist, this will create it. Then it'll be
+        # deleted. That's not the end of the world however.
+        my $params = $self->get_elasticsearch_params();
+        $self->store(
+            Catmandu::Store::ElasticSearch->new(
+                %$params,
+                index_settings => $self->get_elasticsearch_settings(),
+                index_mappings => $self->get_elasticsearch_mappings(),
+                #trace_calls => 1,
+            )
+        );
+    }
+    $self->store->drop();
+    $self->store(undef);
+}
+
+sub _sanitise_records {
+    my ($self, $biblionums, $records) = @_;
+
+    confess "Unequal number of values in \$biblionums and \$records." if (@$biblionums != @$records);
+
+    my $c = @$biblionums;
+    for (my $i=0; $i<$c; $i++) {
+        my $bibnum = $biblionums->[$i];
+        my $rec    = $records->[$i];
+        # I've seen things you people wouldn't believe. Attack ships on fire
+        # off the shoulder of Orion. I watched C-beams glitter in the dark near
+        # the Tannhauser gate. MARC records where 999$c doesn't match the
+        # biblionumber column. All those moments will be lost in time... like
+        # tears in rain...
+        $rec->delete_fields($rec->field('999'));
+        $rec->append_fields(MARC::Field->new('999','','','c' => $bibnum, 'd' => $bibnum));
+    }
+}
+
+sub _convert_marc_to_json {
+    my $self    = shift;
+    my $records = shift;
+    my $importer =
+      Catmandu::Importer::MARC->new( records => $records, id => '999c' );
+    my $fixer = Catmandu::Fix->new( fixes => $self->get_fixer_rules() );
+    $importer = $fixer->fix($importer);
+    return $importer;
+}
+
+1;
+
+__END__
+
+=head1 AUTHOR
+
+=over 4
+
+=item Chris Cormack C<< <chrisc@catalyst.net.nz> >>
+
+=item Robin Sheat C<< <robin@catalyst.net.nz> >>
+
+=back
+
+=cut
diff --git a/Koha/ElasticSearch/Search.pm b/Koha/ElasticSearch/Search.pm
new file mode 100644
index 0000000..3b1ed7e
--- /dev/null
+++ b/Koha/ElasticSearch/Search.pm
@@ -0,0 +1,230 @@
+package Koha::ElasticSearch::Search;
+
+# Copyright 2014 Catalyst IT
+#
+# This file is part of Koha.
+#
+# Koha is free software; you can redistribute it and/or modify it under the
+# terms of the GNU General Public License as published by the Free Software
+# Foundation; either version 3 of the License, or (at your option) any later
+# version.
+#
+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with Koha; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+=head1 NAME
+
+Koha::ElasticSearch::Search - search functions for Elasticsearch
+
+=head1 SYNOPSIS
+
+    my $searcher = Koha::ElasticSearch::Search->new();
+    my $builder = Koha::SearchEngine::Elasticsearch::QueryBuilder->new();
+    my $query = $builder->build_query('perl');
+    my $results = $searcher->search($query);
+    print "There were " . $results->total . " results.\n";
+    $results->each(sub {
+        push @hits, $_[0];
+    });
+
+=head1 METHODS
+
+=cut
+
+use base qw(Koha::ElasticSearch);
+use Koha::ItemTypes;
+
+use Catmandu::Store::ElasticSearch;
+use MARC::Record;
+
+use Data::Dumper;    #TODO remove
+use Carp qw(cluck);
+
+Koha::ElasticSearch::Search->mk_accessors(qw( store ));
+
+=head2 search
+
+    my $results = $searcher->search($query, $page, $count, %options);
+
+Run a search using the query. It'll return C<$count> results, starting at page
+C<$page> (C<$page> counts from 1; anything less than that, or C<undef>,
+becomes 1.)
+
+C<%options> is a hash containing extra options:
+
+=over 4
+
+=item offset
+
+If provided, this overrides the C<$page> value, and specifies the record as
+an offset (i.e. the number of the record to start with), rather than a page.
+
+=back
+
+=cut
+
+sub search {
+    my ($self, $query, $page, $count, %options) = @_;
+
+    my $params = $self->get_elasticsearch_params();
+    my %paging;
+    $paging{limit} = $count || 20;
+    # ES doesn't want pages, it wants a record to start from.
+    if (exists $options{offset}) {
+        $paging{start} = $options{offset};
+    } else {
+        $page = (!defined($page) || ($page <= 0)) ? 1 : $page - 1;
+        $paging{start} = $page * $paging{limit};
+    }
+    $self->store(
+        Catmandu::Store::ElasticSearch->new(
+            %$params,
+            trace_calls => 0,
+        )
+    );
+    my $results = $self->store->bag->search( %$query, %paging );
+    return $results;
+}
+
+=head2 search_compat
+
+    my ( $error, $results, $facets ) = $search->search_compat(
+        $query,            $simple_query, \@sort_by,       \@servers,
+        $results_per_page, $offset,       $expanded_facet, $branches,
+        $query_type,       $scan
+    )
+
+A search interface somewhat compatible with L<C4::Search->getRecords>. Anything
+that is returned in the query created by build_query_compat will probably
+get ignored here.
+
+=cut
+
+sub search_compat {
+    my (
+        $self,     $query,            $simple_query, $sort_by,
+        $servers,  $results_per_page, $offset,       $expanded_facet,
+        $branches, $query_type,       $scan
+    ) = @_;
+
+    my %options;
+    $options{offset} = $offset;
+    my $results = $self->search($query, undef, $results_per_page, %options);
+
+    # Convert each result into a MARC::Record
+    my (@records, $index);
+    $index = $offset;    # opac-search expects results to be put in the
+                         # right place in the array, according to $offset
+    $results->each(sub {
+        # The results come in an array for some reason
+        my $marc_json = $_[0]->{record};
+        my $marc = $self->json2marc($marc_json);
+        $records[$index++] = $marc;
+    });
+    # consumers of this expect a name-spaced result, we provide the default
+    # configuration.
+    my %result;
+    $result{biblioserver}{hits}    = $results->total;
+    $result{biblioserver}{RECORDS} = \@records;
+    return (undef, \%result, $self->_convert_facets($results->{facets}));
+}
+
+=head2 json2marc
+
+    my $marc = $self->json2marc($marc_json);
+
+Converts the form of marc (based on its JSON, but as a Perl structure) that
+Catmandu stores into a MARC::Record object.
+
+=cut
+
+sub json2marc {
+    my ( $self, $marcjson ) = @_;
+
+    my $marc = MARC::Record->new();
+    $marc->encoding('UTF-8');
+
+    # fields are like:
+    # [ '245', '1', '2', 'a' => 'Title', 'b' => 'Subtitle' ]
+    # conveniently, this is the form that MARC::Field->new() likes
+    foreach my $field (@$marcjson) {
+        next if @$field < 5;    # Shouldn't be possible, but...
+        if ( $field->[0] eq 'LDR' ) {
+            $marc->leader( $field->[4] );
+        }
+        else {
+            my $marc_field = MARC::Field->new(@$field);
+            $marc->append_fields($marc_field);
+        }
+    }
+    return $marc;
+}
+
+=head2 _convert_facets
+
+    my $koha_facets = _convert_facets($es_facets);
+
+Converts elasticsearch facet types to the form that Koha expects.
+It expects the ES facet name to match the Koha type, for example C<itype>,
+C<au>, C<su-to>, etc.
+
+=cut
+
+sub _convert_facets {
+    my ( $self, $es ) = @_;
+
+    return undef if !$es;
+
+    # These should correspond to the ES field names, as opposed to the CCL
+    # things that zebra uses.
+    my %type_to_label = (
+        author   => 'Authors',
+        location => 'Location',
+        itype    => 'ItemTypes',
+        se       => 'Series',
+        subject  => 'Topics',
+        'su-geo' => 'Places',
+    );
+
+    # We also have some special cases, e.g. itypes that need to show the
+    # value rather than the code.
+    my $itypes = Koha::ItemTypes->new();
+    my %special = ( itype => sub { $itypes->get_description_for_code(@_) }, );
+    my @res;
+    while ( my ( $type, $data ) = each %$es ) {
+        next if !exists( $type_to_label{$type} );
+        my $facet = {
+            type_id    => $type . '_id',
+            expand     => $type,
+            expandable => 1,    # TODO figure how that's supposed to work
+            "type_label_$type_to_label{$type}" => 1,
+            type_link_value => $type,
+        };
+        foreach my $term ( @{ $data->{terms} } ) {
+            my $t = $term->{term};
+            my $c = $term->{count};
+            my $label;
+            if ( exists( $special{$type} ) ) {
+                $label = $special{$type}->($t);
+            }
+            else {
+                $label = $t;
+            }
+            push @{ $facet->{facets} }, {
+                facet_count       => $c,
+                facet_link_value  => $t,
+                facet_title_value => $t . " ($c)",
+                facet_label_value => $label,    # TODO either truncate this,
+                    # or make the template do it like it should anyway
+                type_link_value => $type,
+            };
+        }
+        push @res, $facet if exists $facet->{facets};
+    }
+    return \@res;
+}
+
+
+1;
diff --git a/Koha/ItemType.pm b/Koha/ItemType.pm
new file mode 100644
index 0000000..f2bfd6f
--- /dev/null
+++ b/Koha/ItemType.pm
@@ -0,0 +1,70 @@
+package Koha::ItemType;
+
+# This represents a single itemtype
+
+# Copyright 2014 Catalyst IT
+#
+# This file is part of Koha.
+#
+# Koha is free software; you can redistribute it and/or modify it under the
+# terms of the GNU General Public License as published by the Free Software
+# Foundation; either version 3 of the License, or (at your option) any later
+# version.
+#
+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with Koha; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+=head1 NAME
+
+Koha::ItemType - represents a single itemtype
+
+=head1 DESCRIPTION
+
+This contains the data relating to a single itemtype.
+
+=head1 SYNOPSIS
+
+    use Koha::ItemTypes;
+    my $types = Koha::ItemTypes->new();
+    my $type  = $types->get_itemtype('CODE');
+    print $type->code, $type->description, $type->rentalcharge,
+        $type->imageurl, $type->summary, $type->checkinmsg,
+        $type->checkinmsgtype;
+
+Creating an instance of C<Koha::ItemType> without using L<Koha::ItemTypes>
+can be done simply by passing a hashref containing the values to C<new()>.
+Note when doing this that a value for C<itemtype> will become a value for
+C<code>.
+
+=head1 FUNCTIONS
+
+In addition to the read-only accessors mentioned above, the following functions
+exist.
+
+=cut
+
+use Modern::Perl;
+
+use base qw(Class::Accessor);
+
+# TODO can we make these auto-generate from the input hash so it doesn't
+# have to be updated when the database is?
+__PACKAGE__->mk_ro_accessors(
+    qw(code description rentalcharge imageurl
+        summary checkinmsg checkinmsgtype)
+);
+
+sub new {
+    my $class = shift @_;
+
+    my %data = ( %{ $_[0] }, code => $_[0]->{itemtype} );
+    my $self = $class->SUPER::new( \%data );
+    return $self;
+}
+
+1;
diff --git a/Koha/ItemTypes.pm b/Koha/ItemTypes.pm
new file mode 100644
index 0000000..9cf0b37
--- /dev/null
+++ b/Koha/ItemTypes.pm
@@ -0,0 +1,113 @@
+package Koha::ItemTypes;
+
+# This contains the item types that the system knows about.
+
+# Copyright 2014 Catalyst IT
+#
+# This file is part of Koha.
+#
+# Koha is free software; you can redistribute it and/or modify it under the
+# terms of the GNU General Public License as published by the Free Software
+# Foundation; either version 3 of the License, or (at your option) any later
+# version.
+#
+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with Koha; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+=head1 NAME
+
+Koha::ItemTypes - handles the item types that Koha knows about
+
+=head1 DESCRIPTION
+
+This contains functions to access the item types.
+
+Note that any changes that happen to the database while this object is live
+may not be reflected, so best don't hold onto it for a long time.
+
+=cut
+
+use Koha::Database;
+use Koha::ItemType;
+use Modern::Perl;
+
+use Data::Dumper;    # TODO remove
+use base qw(Class::Accessor);
+
+__PACKAGE__->mk_accessors(qw());
+
+=head1 FUNCTIONS
+
+=head2 new
+
+    my $itypes = Koha::ItemTypes->new();
+
+Creates a new instance of the object.
+
+=cut
+
+# Handled by Class::Accessor
+
+=head2 get_itemtype
+
+    my @itype = $itypes->get_itemtype('CODE1', 'CODE2');
+
+This returns a L<Koha::ItemType> object for each of the provided codes. For
+any that don't exist, an C<undef> is returned.
+
+=cut
+
+sub get_itemtype {
+    my ($self, @codes) = @_;
+
+    my $schema = Koha::Database->new()->schema();
+    my @res;
+
+    foreach my $c (@codes) {
+        if (exists $self->{cached}{$c}) {
+            push @res, $self->{cached}{$c};
+            next;
+        }
+        my $rs = $schema->resultset('Itemtype')->search( { itemtype => $c } );
+        my $r = $rs->next;
+        if (!$r) {
+            push @res, undef;
+            next;
+        }
+        my %data = $r->get_inflated_columns;
+        my $it = Koha::ItemType->new(\%data);
+        push @res, $it;
+        $self->{cached}{$c} = $it;
+    }
+    if (wantarray) {
+        return @res;
+    } else {
+        return @res ? $res[0] : undef;
+    }
+}
+
+=head2 get_description_for_code
+
+    my $desc = $itypes->get_description_for_code($code);
+
+This returns the description for an itemtype code. As a special case, if
+there is no itemtype for this code, it'll return what it was given.
+
+It is mostly a convenience function, as an alternative to using
+L<get_itemtype>.
+
+=cut
+
+sub get_description_for_code {
+    my ($self, $code) = @_;
+
+    my $itype = $self->get_itemtype($code);
+    return $code if !$itype;
+    return $itype->description;
+}
+
+1;
diff --git a/Koha/Schema/Result/ElasticsearchMapping.pm b/Koha/Schema/Result/ElasticsearchMapping.pm
new file mode 100644
index 0000000..f37009a
--- /dev/null
+++ b/Koha/Schema/Result/ElasticsearchMapping.pm
@@ -0,0 +1,105 @@
+use utf8;
+package Koha::Schema::Result::ElasticsearchMapping;
+
+# Created by DBIx::Class::Schema::Loader
+# DO NOT MODIFY THE FIRST PART OF THIS FILE
+
+=head1 NAME
+
+Koha::Schema::Result::ElasticsearchMapping
+
+=cut
+
+use strict;
+use warnings;
+
+use base 'DBIx::Class::Core';
+
+=head1 TABLE: C<elasticsearch_mapping>
+
+=cut
+
+__PACKAGE__->table("elasticsearch_mapping");
+
+=head1 ACCESSORS
+
+=head2 id
+
+  data_type: 'integer'
+  is_auto_increment: 1
+  is_nullable: 0
+
+=head2 mapping
+
+  data_type: 'varchar'
+  is_nullable: 1
+  size: 255
+
+=head2 type
+
+  data_type: 'varchar'
+  is_nullable: 1
+  size: 255
+
+=head2 facet
+
+  data_type: 'tinyint'
+  default_value: 0
+  is_nullable: 1
+
+=head2 marc21
+
+  data_type: 'varchar'
+  is_nullable: 1
+  size: 255
+
+=head2 unimarc
+
+  data_type: 'varchar'
+  is_nullable: 1
+  size: 255
+
+=head2 normarc
+
+  data_type: 'varchar'
+  is_nullable: 1
+  size: 255
+
+=cut
+
+__PACKAGE__->add_columns(
+  "id",
+  { data_type => "integer", is_auto_increment => 1, is_nullable => 0 },
+  "mapping",
+  { data_type => "varchar", is_nullable => 1, size => 255 },
+  "type",
+  { data_type => "varchar", is_nullable => 1, size => 255 },
+  "facet",
+  { data_type => "tinyint", default_value => 0, is_nullable => 1 },
+  "marc21",
+  { data_type => "varchar", is_nullable => 1, size => 255 },
+  "unimarc",
+  { data_type => "varchar", is_nullable => 1, size => 255 },
+  "normarc",
+  { data_type => "varchar", is_nullable => 1, size => 255 },
+);
+
+=head1 PRIMARY KEY
+
+=over 4
+
+=item * L</id>
+
+=back
+
+=cut
+
+__PACKAGE__->set_primary_key("id");
+
+
+# Created by DBIx::Class::Schema::Loader v0.07040 @ 2014-06-06 16:20:16
+# DO NOT MODIFY THIS OR ANYTHING ABOVE! md5sum:uGRmWU0rshP6awyLMQYJeQ
+
+
+# You can replace this text with custom code or comments, and it will be preserved on regeneration
+1;
diff --git a/Koha/SearchEngine/Elasticsearch/QueryBuilder.pm b/Koha/SearchEngine/Elasticsearch/QueryBuilder.pm
new file mode 100644
index 0000000..184ecef
--- /dev/null
+++ b/Koha/SearchEngine/Elasticsearch/QueryBuilder.pm
@@ -0,0 +1,498 @@
+package Koha::SearchEngine::Elasticsearch::QueryBuilder;
+
+# This file is part of Koha.
+#
+# Copyright 2014 Catalyst IT Ltd.
+#
+# Koha is free software; you can redistribute it and/or modify it
+# under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 3 of the License, or
+# (at your option) any later version.
+#
+# Koha is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with Koha; if not, see <http://www.gnu.org/licenses>.
+
+=head1 NAME
+
+Koha::SearchEngine::Elasticsearch::QueryBuilder - constructs elasticsearch
+query objects from user-supplied queries
+
+=head1 DESCRIPTION
+
+This provides the functions that take a user-supplied search query, and
+provides something that can be given to elasticsearch to get answers.
+
+=head1 SYNOPSIS
+
+    use Koha::SearchEngine::Elasticsearch::QueryBuilder;
+    $builder = Koha::SearchEngine::Elasticsearch::QueryBuilder->new();
+    my $simple_query = $builder->build_query("hello");
+    # This is currently undocumented because the original code is undocumented
+    my $adv_query = $builder->build_advanced_query($indexes, $operands, $operators);
+
+=head1 METHODS
+
+=cut
+
+use base qw(Class::Accessor);
+use C4::Context;
+use List::MoreUtils qw/ each_array /;
+use Modern::Perl;
+use URI::Escape;
+
+use Data::Dumper;    # TODO remove
+
+=head2 build_query
+
+    my $simple_query = $builder->build_query("hello", %options)
+
+This will build a query that can be issued to elasticsearch from the provided
+string input. This expects a lucene style search form (see
+L<http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html#query-string-syntax>
+for details.)
+
+It'll make an attempt to respect the various query options.
+
+Additional options can be provided with the C<%options> hash.
+
+=over 4
+
+=item sort
+
+This should be an arrayref of hashrefs, each containing a C<field> and a
+C<direction> (optional, defaults to C<asc>.) The results will be sorted
+according to these values. Valid values for C<direction> are 'asc' and 'desc'.
+
+=back
+
+=cut
+
+sub build_query {
+    my ( $self, $query, %options ) = @_;
+
+    my $stemming        = C4::Context->preference("QueryStemming")     || 0;
+    my $auto_truncation = C4::Context->preference("QueryAutoTruncate") || 0;
+    my $weight_fields   = C4::Context->preference("QueryWeightFields") || 0;
+    my $fuzzy_enabled   = C4::Context->preference("QueryFuzzy")        || 0;
+
+    $query = '*' unless defined $query;
+
+    my $res;
+    $res->{query} = {
+        query_string => {
+            query     => $query,
+            fuzziness => $fuzzy_enabled ? 'auto' : '0',
+            default_operator => "AND",
+            default_field    => "_all",
+        }
+    };
+
+    if ( $options{sort} ) {
+        foreach my $sort ( @{ $options{sort} } ) {
+            my ( $f, $d ) = @$sort{qw/ field direction /};
+            die "Invalid sort direction, $d"
+              if $d && ( $d ne 'asc' && $d ne 'desc' );
+            $d = 'asc' unless $d;
+
+            # TODO account for fields that don't have a 'phrase' type
+            push @{ $res->{sort} }, { "$f.phrase" => { order => $d } };
+        }
+    }
+
+    # See _convert_facets in Search.pm for how these get turned into
+    # things that Koha can use.
+    $res->{facets} = {
+        author  => { terms => { field => "author__facet" } },
+        subject => { terms => { field => "subject__facet" } },
+        itype   => { terms => { field => "itype__facet" } },
+    };
+    return $res;
+}
+
+=head2 build_browse_query
+
+    my $browse_query = $builder->build_browse_query($field, $query);
+
+This performs a "starts with" style query on a particular field. The field
+to be searched must have been indexed with an appropriate mapping as a
+"phrase" subfield.
+
+=cut
+
+sub build_browse_query {
+    my ( $self, $field, $query ) = @_;
+
+    my $fuzzy_enabled = C4::Context->preference("QueryFuzzy") || 0;
+
+    return { query => '*' } if !defined $query;
+
+    # TODO this should come from Koha::Elasticsearch
+    my %field_whitelist = (
+        title  => 1,
+        author => 1,
+    );
+    $field = 'title' if !exists $field_whitelist{$field};
+
+    my $res = {
+        query => {
+            match_phrase_prefix => {
+                "$field.phrase" => {
+                    query     => $query,
+                    operator  => 'or',
+                    fuzziness => $fuzzy_enabled ? 'auto' : '0',
+                }
+            }
+        },
+        sort => [ { "$field.phrase" => { order => "asc" } } ],
+    };
+}
+
+=head2 build_query_compat
+
+    my (
+        $error,             $query,      $simple_query, $query_cgi,
+        $query_desc,        $limit,      $limit_cgi,    $limit_desc,
+        $stopwords_removed, $query_type
+    )
+      = $builder->build_query_compat( \@operators, \@operands, \@indexes,
+        \@limits, \@sort_by, $scan, $lang );
+
+This handles a search using the same api as L<C4::Search::buildQuery> does.
+
+A very simple query will go in with C<$operands> set to ['query'], and
+C<$sort_by> set to ['pubdate_dsc']. This simple case will return with
+C<$query> set to something that can perform the search, C<$simple_query>
+set to just the search term, C<$query_cgi> set to something that can
+reproduce this search, and C<$query_desc> set to something else.
+
+=cut
+
+sub build_query_compat {
+    my ( $self, $operators, $operands, $indexes, $orig_limits, $sort_by, $scan,
+        $lang )
+      = @_;
+
+#die Dumper ( $self, $operators, $operands, $indexes, $limits, $sort_by, $scan, $lang );
+    my @sort_params  = $self->_convert_sort_fields(@$sort_by);
+    my @index_params = $self->_convert_index_fields(@$indexes);
+    my $limits       = $self->_fix_limit_special_cases($orig_limits);
+
+    # Merge the indexes in with the search terms and the operands so that
+    # each search thing is a handy unit.
+    unshift @$operators, undef;    # The first one can't have an op
+    my @search_params;
+    my $ea = each_array( @$operands, @$operators, @index_params );
+    while ( my ( $oand, $otor, $index ) = $ea->() ) {
+        next if ( !defined($oand) || $oand eq '' );
+        push @search_params, {
+            operand  => $self->_clean_search_term($oand),    # the search terms
+            operator => defined($otor) ? uc $otor : undef,   # AND and so on
+            $index ? %$index : (),
+        };
+    }
+
+    # We build a string query from limits and the queries. An alternative
+    # would be to pass them separately into build_query and let it build
+    # them into a structured ES query itself. Maybe later, though that'd be
+    # more robust.
+    my $query_str = join( ' AND ',
+        join( ' ', $self->_create_query_string(@search_params) ),
+        $self->_join_queries( $self->_convert_index_strings(@$limits) ) );
+
+    # If there's no query on the left, let's remove the junk left behind
+    $query_str =~ s/^ AND //;
+    my %options;
+    $options{sort} = \@sort_params;
+    my $query = $self->build_query( $query_str, %options );
+
+    #die Dumper($query);
+    # We roughly emulate the CGI parameters of the zebra query builder
+    my $query_cgi = 'idx=kw&q=' . uri_escape( $operands->[0] ) if @$operands;
+    my $simple_query = $operands->[0] if @$operands == 1;
+    my $query_desc   = $simple_query;
+    my $limit        = 'and ' . join( ' and ', @$limits );
+    my $limit_cgi =
+      '&limit=' . join( '&limit=', map { uri_escape($_) } @$orig_limits );
+    my $limit_desc = "@$limits";
+
+    return (
+        undef,  $query,     $simple_query, $query_cgi, $query_desc,
+        $limit, $limit_cgi, $limit_desc,   undef,      undef
+    );
+}
+
+=head2 _convert_sort_fields
+
+    my @sort_params = _convert_sort_fields(@sort_by)
+
+Converts the zebra-style sort index information into elasticsearch-style.
+
+C<@sort_by> is the same as presented to L<build_query_compat>, and it returns
+something that can be sent to L<build_query>.
+
+=cut
+
+sub _convert_sort_fields {
+    my ( $self, @sort_by ) = @_;
+
+    # Turn the sorting into something we care about.
+    my %sort_field_convert = (
+        acqdate     => 'acqdate',
+        author      => 'author',
+        call_number => 'callnum',
+        popularity  => 'issues',
+        relevance   => undef,       # default
+        title       => 'title',
+        pubdate     => 'pubdate',
+    );
+    my %sort_order_convert =
+      ( qw( dsc desc ), qw( asc asc ), qw( az asc ), qw( za desc ) );
+
+    # Convert the fields and orders, drop anything we don't know about.
+    grep { $_->{field} } map {
+        my ( $f, $d ) = split /_/;
+        {
+            field     => $sort_field_convert{$f},
+            direction => $sort_order_convert{$d}
+        }
+    } @sort_by;
+}
+
+=head2 _convert_index_fields
+
+    my @index_params = $self->_convert_index_fields(@indexes);
+
+Converts zebra-style search index notation into elasticsearch-style.
+
+C<@indexes> is an array of index names, as presented to L<build_query_compat>,
+and it returns something that can be sent to L<build_query>.
+
+B<TODO>: this will pull from the elasticsearch mappings table to figure out
+types.
+
+=cut
+
+our %index_field_convert = (
+    'kw'       => '_all',
+    'ti'       => 'title',
+    'au'       => 'author',
+    'su'       => 'subject',
+    'nb'       => 'isbn',
+    'se'       => 'title-series',
+    'callnum'  => 'callnum',
+    'mc-itype' => 'itype',
+    'ln'       => 'ln',
+    'branch'   => 'homebranch',
+    'fic'      => 'lf',
+    'mus'      => 'rtype',
+    'aud'      => 'ta',
+);
+
+sub _convert_index_fields {
+    my ( $self, @indexes ) = @_;
+
+    my %index_type_convert =
+      ( __default => undef, phr => 'phrase', rtrn => 'right-truncate' );
+
+    # Convert according to our table, drop anything that doesn't convert
+    grep { $_->{field} } map {
+        my ( $f, $t ) = split /,/;
+        {
+            field => $index_field_convert{$f},
+            type  => $index_type_convert{ $t // '__default' }
+        }
+    } @indexes;
+}
+
+=head2 _convert_index_strings
+
+    my @searches = $self->_convert_index_strings(@searches);
+
+Similar to L<_convert_index_fields>, this takes strings of the form
+B<field:search term> and rewrites the field from zebra-style to
+elasticsearch-style. Anything it doesn't understand is returned verbatim.
+
+=cut
+
+sub _convert_index_strings {
+    my ( $self, @searches ) = @_;
+
+    my @res;
+    foreach my $s (@searches) {
+        next if $s eq '';
+        my ( $field, $term ) = $s =~ /^\s*([\w,-]*?):(.*)/;
+        unless ( defined($field) && defined($term) ) {
+            push @res, $s;
+            next;
+        }
+        my ($conv) = $self->_convert_index_fields($field);
+        unless ( defined($conv) ) {
+            push @res, $s;
+            next;
+        }
+        push @res, $conv->{field} . ":"
+          . $self->_modify_string_by_type( %$conv, operand => $term );
+    }
+    return @res;
+}
+
+=head2 _modify_string_by_type
+
+    my $str = $self->_modify_string_by_type(%index_field);
+
+If you have a search term (operand) and a type (phrase, right-truncated), this
+will convert the string to have the function in lucene search terms, e.g.
+wrapping quotes around it.
+
+=cut
+
+sub _modify_string_by_type {
+    my ( $self, %idx ) = @_;
+
+    my $type = $idx{type} || '';
+    my $str  = $idx{operand};
+    return $str unless $str;    # Empty or undef, we can't use it.
+
+    $str .= '*' if $type eq 'right-truncate';
+    $str = '"' . $str . '"' if $type eq 'phrase';
+    return $str;
+}
+
+=head2 _convert_index_strings_freeform
+
+    my $search = $self->_convert_index_strings_freeform($search);
+
+This is similar to L<_convert_index_strings>, however it'll search out the
+things to change within the string. So it can handle strings such as
+C<(su:foo) AND (su:bar)>, converting the C<su> appropriately.
+
+=cut
+
+sub _convert_index_strings_freeform {
+    my ( $self, $search ) = @_;
+
+    while ( my ( $zeb, $es ) = each %index_field_convert ) {
+        $search =~ s/\b$zeb:/$es:/g;
+    }
+    return $search;
+}
+
+=head2 _join_queries
+
+    my $query_str = $self->_join_queries(@query_parts);
+
+This takes a list of query parts, that might be search terms on their own, or
+booleaned together, or specifying fields, or whatever, wraps them in
+parentheses, and ANDs them all together. Suitable for feeding to the ES
+query string query.
+
+=cut
+
+sub _join_queries {
+    my ( $self, @parts ) = @_;
+
+    @parts = grep { defined($_) && $_ ne '' } @parts;
+    return () unless @parts;
+    return $parts[0] if @parts < 2;
+    join ' AND ', map { "($_)" } @parts;
+}
+
+=head2 _make_phrases
+
+    my @phrased_queries = $self->_make_phrases(@query_parts);
+
+This takes the supplied queries and forces them to be phrases by wrapping
+quotes around them. It understands field prefixes, e.g. 'subject:' and puts
+the quotes outside of them if they're there.
+
+=cut
+
+sub _make_phrases {
+    my ( $self, @parts ) = @_;
+    map { s/^\s*(\w*?:)(.*)$/$1"$2"/r } @parts;
+}
+
+=head2 _create_query_string
+
+    my @query_strings = $self->_create_query_string(@queries);
+
+Given a list of hashrefs, it will turn them into a lucene-style query string.
+The hash should contain field, type (both for the indexes), operator, and
+operand.
+
+=cut
+
+sub _create_query_string {
+    my ( $self, @queries ) = @_;
+
+    map {
+        my $otor  = $_->{operator} ? $_->{operator} . ' ' : '';
+        my $field = $_->{field}    ? $_->{field} . ':'    : '';
+
+        my $oand = $self->_modify_string_by_type(%$_);
+        "$otor($field$oand)";
+    } @queries;
+}
+
+=head2 _clean_search_term
+
+    my $term = $self->_clean_search_term($term);
+
+This cleans a search term by removing any funny characters that may upset
+ES and give us an error. It also calls L<_convert_index_strings_freeform>
+to ensure those parts are correct.
+
+=cut
+
+sub _clean_search_term {
+    my ( $self, $term ) = @_;
+
+    $term = $self->_convert_index_strings_freeform($term);
+    $term =~ s/[{}]/"/g;
+    return $term;
+}
+
+=head2 _fix_limit_special_cases
+
+    my $limits = $self->_fix_limit_special_cases($limits);
+
+This converts any special cases that the limit specifications have into things
+that are more readily processable by the rest of the code.
+
+The argument should be an arrayref, and it'll return an arrayref.
+
+=cut
+
+sub _fix_limit_special_cases {
+    my ( $self, $limits ) = @_;
+
+    my @new_lim;
+    foreach my $l (@$limits) {
+
+        # This is set up by opac-search.pl
+        if ( $l =~ /^yr,st-numeric,ge=/ ) {
+            my ( $start, $end ) =
+              ( $l =~ /^yr,st-numeric,ge=(.*) and yr,st-numeric,le=(.*)$/ );
+            next unless defined($start) && defined($end);
+            push @new_lim, "copydate:[$start TO $end]";
+        }
+        elsif ( $l =~ /^yr,st-numeric=/ ) {
+            my ($date) = ( $l =~ /^yr,st-numeric=(.*)$/ );
+            next unless defined($date);
+            push @new_lim, "copydate:$date";
+        }
+        elsif ( $l =~ /^available$/ ) {
+            push @new_lim, 'onloan:false';
+        }
+        else {
+            push @new_lim, $l;
+        }
+    }
+    return \@new_lim;
+}
+
+1;
diff --git a/Koha/SearchEngine/QueryBuilderRole.pm b/Koha/SearchEngine/QueryBuilderRole.pm
index 3c1c778..953a671 100644
--- a/Koha/SearchEngine/QueryBuilderRole.pm
+++ b/Koha/SearchEngine/QueryBuilderRole.pm
@@ -20,5 +20,8 @@ package Koha::SearchEngine::QueryBuilderRole;
 use Moose::Role;
 
 requires 'build_query';
+# The compat version should accept and return parameters in the same form as
+# C4::Search->buildQuery does.
+requires 'build_query_compat';
 
 1;
diff --git a/Koha/SearchEngine/Zebra.pm b/Koha/SearchEngine/Zebra.pm
index ffe18b8..afa40ae 100644
--- a/Koha/SearchEngine/Zebra.pm
+++ b/Koha/SearchEngine/Zebra.pm
@@ -19,7 +19,8 @@ package Koha::SearchEngine::Zebra;
 
 use Moose;
 
-extends 'Data::SearchEngine::Zebra';
+# Removed because it doesn't exist.
+#extends 'Data::SearchEngine::Zebra';
 
 # the configuration file is retrieved from KOHA_CONF by default, provide it from there²
 has '+conf_file' => (
diff --git a/Koha/SearchEngine/Zebra/QueryBuilder.pm b/Koha/SearchEngine/Zebra/QueryBuilder.pm
index 09a6d93..7fa2d3b 100644
--- a/Koha/SearchEngine/Zebra/QueryBuilder.pm
+++ b/Koha/SearchEngine/Zebra/QueryBuilder.pm
@@ -17,6 +17,7 @@ package Koha::SearchEngine::Zebra::QueryBuilder;
 # You should have received a copy of the GNU General Public License
 # along with Koha; if not, see <http://www.gnu.org/licenses>.
 
+use base qw(Class::Accessor);
 use Modern::Perl;
 use Moose::Role;
 use C4::Search;
@@ -28,4 +29,10 @@ sub build_query {
     C4::Search::buildQuery @_;
 }
 
+sub build_query_compat {
+    # Because this passes directly on to C4::Search, we have no trouble being
+    # compatible.
+    build_query(@_);
+}
+
 1;
diff --git a/Koha/SearchEngine/Zebra/Search.pm b/Koha/SearchEngine/Zebra/Search.pm
index 3b736df..535b428 100644
--- a/Koha/SearchEngine/Zebra/Search.pm
+++ b/Koha/SearchEngine/Zebra/Search.pm
@@ -17,20 +17,26 @@ package Koha::SearchEngine::Zebra::Search;
 # You should have received a copy of the GNU General Public License
 # along with Koha; if not, see <http://www.gnu.org/licenses>.
> 
>-use Moose::Role;
>-with 'Koha::SearchEngine::SearchRole';
>+# I don't think this ever worked right
>+#use Moose::Role;
>+#with 'Koha::SearchEngine::SearchRole';
> 
>-use Data::SearchEngine::Zebra;
>-use Data::SearchEngine::Query;
>-use Koha::SearchEngine::Zebra;
>-use Data::Dump qw(dump);
>+use base qw(Class::Accessor);
>+# Removed because it doesn't exist/doesn't work.
>+#use Data::SearchEngine::Zebra;
>+#use Data::SearchEngine::Query;
>+#use Koha::SearchEngine::Zebra;
>+#use Data::Dump qw(dump);
> 
>-has searchengine => (
>-    is      => 'rw',
>-    isa     => 'Koha::SearchEngine::Zebra',
>-    default => sub { Koha::SearchEngine::Zebra->new },
>-    lazy    => 1
>-);
>+use C4::Search;    # :(
>+
>+# Broken without the Data:: stuff
>+#has searchengine => (
>+#    is      => 'rw',
>+#    isa     => 'Koha::SearchEngine::Zebra',
>+#    default => sub { Koha::SearchEngine::Zebra->new },
>+#    lazy    => 1
>+#);
> 
> sub search {
>     my ($self,$query_string) = @_;
>@@ -53,6 +59,18 @@ sub search {
>     }
> }
> 
>+=head2 search_compat
>+
>+This passes straight through to C4::Search::getRecords.
>+
>+=cut
>+
>+sub search_compat {
>+    shift;    # get rid of $self
>+
>+    return getRecords(@_);
>+}
>+
> sub dosmth {'bou' }
> 
> 1;
>diff --git a/installer/data/mysql/elasticsearch_mapping.sql b/installer/data/mysql/elasticsearch_mapping.sql
>new file mode 100644
>index 0000000..6a0dd0d
>--- /dev/null
>+++ b/installer/data/mysql/elasticsearch_mapping.sql
>@@ -0,0 +1,148 @@
>+DROP TABLE IF EXISTS elasticsearch_mapping;
>+CREATE TABLE `elasticsearch_mapping` (
>+  `id` int(11) NOT NULL AUTO_INCREMENT,
>+  `mapping` varchar(255) DEFAULT NULL,
>+  `type` varchar(255) NOT NULL,
>+  `facet` boolean DEFAULT FALSE,
>+  `marc21` varchar(255) DEFAULT NULL,
>+  `unimarc` varchar(255) DEFAULT NULL,
>+  `normarc` varchar(255) DEFAULT NULL,
>+  PRIMARY KEY (`id`)
>+) ENGINE=InnoDB AUTO_INCREMENT=126 DEFAULT CHARSET=utf8;
>+
>+
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('llength',FALSE,'','leader_/1-5',NULL,'leader_/1-5');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('rtype',FALSE,'','leader_/6',NULL,'leader_/6');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('bib-level',FALSE,'','leader_/7',NULL,'leader_/7');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('control-number',FALSE,'','001',NULL,'001');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('local-number',FALSE,'',NULL,'001',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('date-time-last-modified',FALSE,'','005','099d',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('microform-generation',FALSE,'','007_/11',NULL,'007_/11');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('material-type',FALSE,'','007','200b','007');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ff7-00',FALSE,'','007_/1',NULL,'007_/1');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ff7-01',FALSE,'','007_/2',NULL,'007_/2');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ff7-02',FALSE,'','007_/3',NULL,'007_/3');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ff7-01-02',FALSE,'','007_/1-2',NULL,'007_/1-2');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('date-entered-on-file',FALSE,'','008_/1-5','099c','008_/1-5');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('pubdate',FALSE,'','008_/7-10','100a_/9-12','008_/7-10');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('pl',FALSE,'','008_/15-17','210a','008_/15-17');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ta',FALSE,'','008_/22','100a_/17','008_/22');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ff8-23',FALSE,'','008_/23',NULL,'008_/23');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ff8-29',FALSE,'','008_/29','105a_/8','008_/29');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('lf',FALSE,'','008_/33','105a_/11','008_/33');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('bio',FALSE,'','008_/34','105a_/12','008_/34');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ln',FALSE,'','008_/35-37','101a','008_/35-37');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('ctype',FALSE,'','008_/24-27','105a_/4-7','008_/24-27');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('record-source',FALSE,'','008_/39','995c','008_/39');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('lc-cardnumber',FALSE,'','010','995j','010');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('lc-cardnumber',FALSE,'','011',NULL,NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('identifier-standard',FALSE,'','010',NULL,'010');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('identifier-standard',FALSE,'','011',NULL,NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('bnb-card-number',FALSE,'','015',NULL,'015');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('bgf-number',FALSE,'','015',NULL,'015');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('number-db',FALSE,'','015',NULL,'015');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('number-natl-biblio',FALSE,'','015',NULL,'015');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('identifier-standard',FALSE,'','015',NULL,'015');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('number-legal-deposit',FALSE,'','017',NULL,NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('identifier-standard',FALSE,'','017',NULL,NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('identifier-standard',FALSE,'','018',NULL,NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('identifier-standard',FALSE,'','020a','010az','020a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('isbn',FALSE,'','020a','010az','020a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('identifier-standard',FALSE,'','022a','011ayz','022a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('issn',FALSE,'','022a','011ayz','022a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('author',TRUE,'string','100a','200f','100a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('author',TRUE,'string','110a','200g','110a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('author',TRUE,'string','111a',NULL,'111a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('author',TRUE,'string','700a','700a','700a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('author',FALSE,'string','245c','701','245c');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','245a','200a','245a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','246','200c','246');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','247','200d','247');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','490','200e','490a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','505t','200h',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','711t','200i','711t');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','700t','205','700t');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','710t','304a','710t');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','730','327a','730');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','740','327b','740');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','780','327c','780');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','785','327d','785');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','130','327e','130');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','210','327f','210');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','211','327g',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','212','327h',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','214','327i',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','222','328t','222');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string','240','410t','240');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'411t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'412t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'413t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'421t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'422t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'423t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'424t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'425t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'430t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'431t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'432t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'433t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'434t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'435t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'436t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'437t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'440t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'441t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'442t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'443t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'444t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'445t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'446t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'447t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'448t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'451t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'452t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'453t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'454t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'455t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'456t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'461t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'462t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'463t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'464t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'470t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'481t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'482t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('title',FALSE,'string',NULL,'488t',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','600a','600a','600a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','600t','600','600t');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','610a','601','610a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','610t','602','610t');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','611','604','611');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','630n','605','630n');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','630r','606','630r');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650a','607','650a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650b',NULL,'650b');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650c',NULL,'650c');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650d',NULL,'650d');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650v',NULL,'650v');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650x',NULL,'650x');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650y',NULL,'650y');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','650z',NULL,'650z');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','651','608','651');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('subject',TRUE,'string','653a','610','653');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('local-classification',FALSE,'','952o','995k','952o');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('local-classification',FALSE,'',NULL,'686',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('local-number',FALSE,'','999c','001','999c');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('local-number',FALSE,'',NULL,'0909',NULL);
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('itype',TRUE,'string','942c','200b','942c');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('itype',TRUE,'string','952y','995r','952y');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('acqdate',FALSE,'date','952d','9955','952y');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('place',TRUE,'string','260a','210a','260a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('publisher',TRUE,'string','260b','210c','260b');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('copydate',TRUE,'date','260c',NULL,'260c'); -- No copydate for unimarc? Seems strange.
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('homebranch',TRUE,'string','952a','995b','952a');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('holdingbranch',TRUE,'string','952b','995c','952b');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('onloan',FALSE,'boolean','952q','995n','952q');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('itemnumber',FALSE,'number','9529','9959','9529');
>+INSERT INTO `elasticsearch_mapping` (`mapping`, `facet`, `type`, `marc21`, `unimarc`, `normarc`) VALUES ('issues',FALSE,'sum','952l',NULL,'952l'); -- Apparently not tracked in unimarc
>diff --git a/installer/data/mysql/kohastructure.sql b/installer/data/mysql/kohastructure.sql
>index b46dc6a..0e89b1a 100644
>--- a/installer/data/mysql/kohastructure.sql
>+++ b/installer/data/mysql/kohastructure.sql
>@@ -931,6 +931,22 @@ CREATE TABLE `deleteditems` (
> ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
> 
> --
>+-- Table structure for table `elasticsearch_mapping`
>+--
>+
>+DROP TABLE IF EXISTS `elasticsearch_mapping`;
>+CREATE TABLE `elasticsearch_mapping` (
>+  `id` int(11) NOT NULL AUTO_INCREMENT,
>+  `mapping` varchar(255) DEFAULT NULL,
>+  `type` varchar(255) NOT NULL,
>+  `facet` boolean DEFAULT FALSE,
>+  `marc21` varchar(255) DEFAULT NULL,
>+  `unimarc` varchar(255) DEFAULT NULL,
>+  `normarc` varchar(255) DEFAULT NULL,
>+  PRIMARY KEY (`id`)
>+) ENGINE=InnoDB AUTO_INCREMENT=24 DEFAULT CHARSET=utf8;
>+
>+--
> -- Table structure for table `ethnicity`
> --
> 
>diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref
>index 1215ac1..5d43a34 100644
>--- a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref
>+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/preferences/admin.pref
>@@ -112,4 +112,5 @@ Administration:
>             choices:
>                 Solr: Solr
>                 Zebra: Zebra
>+                Elasticsearch: Elasticsearch
>             - is the search engine used.
>diff --git a/koha-tmpl/opac-tmpl/prog/en/modules/search/results.tt b/koha-tmpl/opac-tmpl/prog/en/modules/search/results.tt
>index 0322ec5..b9a8e04 100644
>--- a/koha-tmpl/opac-tmpl/prog/en/modules/search/results.tt
>+++ b/koha-tmpl/opac-tmpl/prog/en/modules/search/results.tt
>@@ -14,7 +14,7 @@
> 
> <script type="text/javascript">
> $(document).ready(function() {
>-    $('#bookbag_form').find("input").hide();
>+//    $('#bookbag_form').find("input").hide();
>     $('#sort_by').change(function() {
>         $('#bookbag_form').submit();
>     });
>@@ -47,6 +47,22 @@
>     </div>
>     <div class="searchresults">
>         <form action="/cgi-bin/koha/opac-search.pl" method="get" name="bookbag_form" id="bookbag_form">
>+            [%# IF (browse) %]
>+            <label for="browse_field">Browse: </label>
>+            <select name="browse_field" id="browse_field">
>+                <option value="title">Title</option>
>+                <option value="author">Author</option>
>+                <option value="callnumber">Call Number</option>
>+                <option value="subject">Subject</option>
>+                <option value="isbn">ISBN</option>
>+                <option value="issn">ISSN</option>
>+            </select>
>+            <input type="hidden" name="type" value="browse" />
>+            <br />
>+            <label for="search_field">Query:</label>
>+            <input type="text" name="q" style="display:initial;" />
>+            <input type="submit" value="Browse" style="display:initial;" />
>+            [%# END %]
>             <!-- TABLE RESULTS START -->
>             <table>
>                 <thead>
>@@ -76,14 +92,16 @@
>                 </thead>
>                 <!-- Actual Search Results -->
>                 <tbody>
>+                    [% USE Dumper %]
>                     [% FOREACH SEARCH_RESULT IN SEARCH_RESULTS %]
>+                    [% result = SEARCH_RESULT.item('_source') %]
>                     <tr>
>                         <td>
>-                            <input type="checkbox" id="bib[% SEARCH_RESULT.biblionumber %]" name="biblionumber" value="[% SEARCH_RESULT.biblionumber %]" /> <label for="bib[% SEARCH_RESULT.biblionumber %]"></label>
>+                            <input type="checkbox" id="bib[% result.biblionumber %]" name="biblionumber" value="[% result.biblionumber %]" /> <label for="bib[% result.biblionumber %]"></label>
>                         </td>
>                         <td>
>-                            <a class="title" href="/cgi-bin/koha/opac-detail.pl?biblionumber=[% SEARCH_RESULT.biblionumber |url %]" title="View details for this title">[% SEARCH_RESULT.title |html %]</a>
>-                            by <a href="/cgi-bin/koha/opac-search.pl?q=author:[% SEARCH_RESULT.author |url %]" title="Search for works by this author" class="author">[% SEARCH_RESULT.author %]</a>
>+                            <a class="title" href="/cgi-bin/koha/opac-detail.pl?biblionumber=[% result.biblionumber |url %]" title="View details for this title">[% result.title |html %]</a>
>+                            by <a href="/cgi-bin/koha/opac-search.pl?q=author:[% result.author |url %]" title="Search for works by this author" class="author">[% result.author %]</a>
>                         </td>
>                     </tr>
>                     [% END %]
>diff --git a/misc/search_tools/rebuild_elastic_search.pl b/misc/search_tools/rebuild_elastic_search.pl
>new file mode 100755
>index 0000000..055bd1e
>--- /dev/null
>+++ b/misc/search_tools/rebuild_elastic_search.pl
>@@ -0,0 +1,150 @@
>+#!/usr/bin/perl
>+
>+# This inserts records from a Koha database into elastic search
>+
>+# Copyright 2014 Catalyst IT
>+#
>+# This file is part of Koha.
>+#
>+# Koha is free software; you can redistribute it and/or modify it under the
>+# terms of the GNU General Public License as published by the Free Software
>+# Foundation; either version 3 of the License, or (at your option) any later
>+# version.
>+#
>+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
>+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
>+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
>+#
>+# You should have received a copy of the GNU General Public License along
>+# with Koha; if not, write to the Free Software Foundation, Inc.,
>+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
>+
>+=head1 NAME
>+
>+rebuild_elastic_search.pl - inserts records from a Koha database into Elasticsearch
>+
>+=head1 SYNOPSIS
>+
>+B<rebuild_elastic_search.pl>
>+[B<-c|--commit>=C<count>]
>+[B<-v|--verbose>]
>+[B<-h|--help>]
>+[B<--man>]
>+
>+=head1 DESCRIPTION
>+
>+=head1 OPTIONS
>+
>+=over
>+
>+=item B<-c|--commit>=C<count>
>+
>+Specify how many records will be batched up before they're added to Elasticsearch.
>+Higher should be faster, but will cause more RAM usage. Default is 100.
>+
>+=item B<-d|--delete>
>+
>+Delete the index and recreate it before indexing.
>+
>+=item B<-b|--biblionumber>
>+
>+Only index the supplied biblionumber, mostly for testing purposes. May be
>+repeated.
>+
>+=item B<-v|--verbose>
>+
>+By default, this program only emits warnings and errors. This makes it talk
>+more. Add more to make it even more wordy, in particular when debugging.
>+
>+=item B<-h|--help>
>+
>+Help!
>+
>+=item B<--man>
>+
>+Full documentation.
>+
>+=back
>+
>+=cut
>+
>+use autodie;
>+use Getopt::Long;
>+use Koha::Biblio;
>+use Koha::ElasticSearch::Indexer;
>+use MARC::Field;
>+use MARC::Record;
>+use Modern::Perl;
>+use Pod::Usage;
>+
>+use Data::Dumper;    # TODO remove
>+
>+my $verbose = 0;
>+my $commit  = 100;
>+my ($delete, $help, $man);
>+my (@biblionumbers);
>+
>+GetOptions(
>+    'c|commit=i'       => \$commit,
>+    'd|delete'         => \$delete,
>+    'b|biblionumber=i' => \@biblionumbers,
>+    'v|verbose+'       => \$verbose,
>+    'h|help'           => \$help,
>+    'man'              => \$man,
>+);
>+
>+pod2usage(1) if $help;
>+pod2usage( -exitstatus => 0, -verbose => 2 ) if $man;
>+
>+my $next;
>+if (@biblionumbers) {
>+    $next = sub {
>+        my $r = shift @biblionumbers;
>+        return () unless defined $r;
>+        return ($r, Koha::Biblio->get_marc_biblio($r, item_data => 1));
>+    };
>+} else {
>+    my $records = Koha::Biblio->get_all_biblios_iterator();
>+    $next = sub {
>+        $records->next();
>+    }
>+}
>+my $indexer = Koha::ElasticSearch::Indexer->new({index => 'biblios' });
>+if ($delete) {
>+    # We know it's safe to not recreate the indexer because update_index
>+    # hasn't been called yet.
>+    $indexer->delete_index();
>+}
>+
>+my $count        = 0;
>+my $commit_count = $commit;
>+my (@bibnums_buffer, @commit_buffer);
>+while (scalar(my ($bibnum, $rec) = $next->())) {
>+    _log(1, "$bibnum\n");
>+    $count++;
>+
>+    push @bibnums_buffer, $bibnum;
>+    push @commit_buffer,  $rec;
>+    if (!(--$commit_count)) {
>+        _log(2, "Committing...\n");
>+        $indexer->update_index(\@bibnums_buffer, \@commit_buffer);
>+        $commit_count   = $commit;
>+        @bibnums_buffer = ();
>+        @commit_buffer  = ();
>+    }
>+}
>+# There are probably uncommitted records
>+$indexer->update_index(\@bibnums_buffer, \@commit_buffer);
>+_log(1, "$count records indexed.\n");
>+
>+# Output progress information.
>+#
>+#   _log($level, $msg);
>+#
>+# Will output $msg if the verbosity setting is set to $level or more. It does
>+# not add a trailing newline itself.
>+sub _log {
>+    my ($level, $msg) = @_;
>+
>+    print $msg if ($verbose >= $level);
>+}
>diff --git a/myfix.txt b/myfix.txt
>new file mode 100644
>index 0000000..278b3e9
>--- /dev/null
>+++ b/myfix.txt
>@@ -0,0 +1,3 @@
>+marc_map( '245a','title' );
>+marc_map( '100a','author' );
>+marc_map( '999c','biblionumber' );
>diff --git a/opac/elasticsearch.pl b/opac/elasticsearch.pl
>new file mode 100755
>index 0000000..b9fccad
>--- /dev/null
>+++ b/opac/elasticsearch.pl
>@@ -0,0 +1,102 @@
>+#!/usr/bin/perl
>+
>+# Copyright 2013 Catalyst
>+#
>+# This file is part of Koha.
>+#
>+# Koha is free software; you can redistribute it and/or modify it under the
>+# terms of the GNU General Public License as published by the Free Software
>+# Foundation; either version 3 of the License, or (at your option) any later
>+# version.
>+#
>+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
>+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
>+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
>+#
>+# You should have received a copy of the GNU General Public License along
>+# with Koha; if not, write to the Free Software Foundation, Inc.,
>+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
>+
>+use Modern::Perl;
>+
>+use C4::Context;
>+use CGI;
>+use C4::Auth;
>+use C4::Koha;
>+use C4::Output;
>+
>+# TODO this should use the moose thing that auto-picks.
>+use Koha::SearchEngine::Elasticsearch::QueryBuilder;
>+use Koha::ElasticSearch::Search;
>+
>+my $cgi = new CGI;
>+
>+my $template_name;
>+my $template_type = "basic";
>+if ( $cgi->param("idx") or $cgi->param("q") ) {
>+    $template_name = 'search/results.tt';
>+}
>+else {
>+    $template_name = 'search/advsearch.tt';
>+    $template_type = 'advsearch';
>+}
>+
>+# load the template
>+my ( $template, $borrowernumber, $cookie ) = get_template_and_user(
>+    {
>+        template_name   => $template_name,
>+        query           => $cgi,
>+        type            => "opac",
>+        authnotrequired => 1,
>+    }
>+);
>+my %template_params;
>+my $format = $cgi->param("format") || 'html';
>+
>+# load the Type stuff
>+my $itemtypes = GetItemTypes;
>+
>+my $page = $cgi->param("page") || 1;
>+my $count =
>+     $cgi->param('count')
>+  || C4::Context->preference('OPACnumSearchResults')
>+  || 20;
>+my $q = $cgi->param("q");
>+
>+my $searcher = Koha::ElasticSearch::Search->new();
>+my $builder  = Koha::SearchEngine::Elasticsearch::QueryBuilder->new();
>+my $query;
>+if ($cgi->param('type') eq 'browse') {
>+    $query = $builder->build_browse_query($cgi->param('browse_field') || undef, $q );
>+    $template_params{browse} = 1;
>+} else {
>+    $query = $builder->build_query($q);
>+}
>+my $results = $searcher->search( $query, $page, $count );
>+#my $results = $searcher->search( { "match_phrase_prefix" => { "title" => "the" } } );
>+
>+# This is temporary, but will do the job for now.
>+my @hits;
>+$results->each(sub {
>+        push @hits, { _source => $_[0] };
>+    });
>+# Make a list of the page numbers
>+my @pages = map { { page => $_, current => ($_ == ( $page || 1)) } } 1 .. int($results->total / $count);
>+my $max_page = int($results->total / $count);
>+# Pager template params
>+$template->param(
>+    SEARCH_RESULTS => \@hits,
>+    PAGE_NUMBERS   => \@pages,
>+    total          => $results->total,
>+    previous_page  => ( $page > 1 ? $page - 1 : undef ),
>+    next_page      => ( $page < $max_page ? $page + 1 : undef ),
>+    follower_params => [
>+        { var => 'type',  val => $cgi->param('type') },
>+        { var => 'q',     val => $q },
>+        { var => 'count', val => $count },
>+    ],
>+    %template_params,
>+);
>+
>+my $content_type = ( $format eq 'rss' or $format eq 'atom' ) ? $format : 'html';
>+output_with_http_headers $cgi, $cookie, $template->output, $content_type;
>diff --git a/opac/opac-search.pl b/opac/opac-search.pl
>index cac6ab5..286ecc9 100755
>--- a/opac/opac-search.pl
>+++ b/opac/opac-search.pl
>@@ -28,14 +28,35 @@ use Modern::Perl;
> # to perform, etc.
> ## load Koha modules
> use C4::Context;
>+use C4::Search;
> 
>-my $searchengine = C4::Context->preference("SearchEngine");
>-if ( $searchengine =~ /^Solr$/ ) {
>-    warn "We use Solr";
>-    require 'opac/search.pl';
>-    exit;
>-} elsif ( $searchengine =~ /^Zebra$/ ) {
> 
>+use Data::Dumper;    # TODO remove
>+
>+use Koha::SearchEngine::Elasticsearch::QueryBuilder;
>+use Koha::ElasticSearch::Search;
>+use Koha::SearchEngine::Zebra::QueryBuilder;
>+use Koha::SearchEngine::Zebra::Search;
>+
>+my $searchengine = C4::Context->preference("SearchEngine");
>+my ($builder, $searcher);
>+#$searchengine = 'Zebra'; # XXX
>+for ( $searchengine ) {
>+    when ( /^Solr$/ ) {
>+        warn "We use Solr";
>+        require 'opac/search.pl';
>+        exit;
>+    }
>+    when ( /^Zebra$/ ) {
>+        $builder  = Koha::SearchEngine::Zebra::QueryBuilder->new();
>+        $searcher = Koha::SearchEngine::Zebra::Search->new();
>+    }
>+    when (/^Elasticsearch$/) {
>+        # Should use the base QueryBuilder, but I don't have it wired up
>+        # for moose yet.
>+        $builder  = Koha::SearchEngine::Elasticsearch::QueryBuilder->new();
>+#        $builder  = Koha::SearchEngine::Zebra::QueryBuilder->new();
>+        $searcher = Koha::ElasticSearch::Search->new({index => 'biblios'});
>+    }
> }
> 
> use C4::Output;
>@@ -449,7 +470,8 @@ my ($error,$query,$simple_query,$query_cgi,$query_desc,$limit,$limit_cgi,$limit_
> my @results;
> 
> ## I. BUILD THE QUERY
>-( $error,$query,$simple_query,$query_cgi,$query_desc,$limit,$limit_cgi,$limit_desc,$stopwords_removed,$query_type) = buildQuery(\@operators,\@operands,\@indexes,\@limits,\@sort_by, 0, $lang);
>+( $error,$query,$simple_query,$query_cgi,$query_desc,$limit,$limit_cgi,$limit_desc,$stopwords_removed,$query_type) = $builder->build_query_compat(\@operators,\@operands,\@indexes,\@limits,\@sort_by, 0, $lang);
>+#die Dumper( $error,$query,$simple_query,$query_cgi,$query_desc,$limit,$limit_cgi,$limit_desc,$stopwords_removed,$query_type);
> 
> sub _input_cgi_parse {
>     my @elements;
>@@ -528,11 +550,12 @@ if ($tag) {
>     $pasarParams .= '&simple_query=' . $simple_query;
>     $pasarParams .= '&query_type=' . $query_type if ($query_type);
>     eval {
>-        ($error, $results_hashref, $facets) = getRecords($query,$simple_query,\@sort_by,\@servers,$results_per_page,$offset,$expanded_facet,$branches,$itemtypes,$query_type,$scan,1);
>+        ($error, $results_hashref, $facets) = $searcher->search_compat($query,$simple_query,\@sort_by,\@servers,$results_per_page,$offset,$expanded_facet,$branches,$itemtypes,$query_type,$scan,1);
>     };
> }
>+
> # This sorts the facets into alphabetical order
>-if ($facets) {
>+if ($facets && @$facets) {
>     foreach my $f (@$facets) {
>         $f->{facets} = [ sort { uc($a->{facet_title_value}) cmp uc($b->{facet_title_value}) } @{ $f->{facets} } ];
>     }
>diff --git a/t/Koha/ItemType.pm b/t/Koha/ItemType.pm
>new file mode 100755
>index 0000000..e47d5e2
>--- /dev/null
>+++ b/t/Koha/ItemType.pm
>@@ -0,0 +1,46 @@
>+#!/usr/bin/perl
>+#
>+# Copyright 2014 Catalyst IT
>+#
>+# This file is part of Koha.
>+#
>+# Koha is free software; you can redistribute it and/or modify it under the
>+# terms of the GNU General Public License as published by the Free Software
>+# Foundation; either version 3 of the License, or (at your option) any later
>+# version.
>+#
>+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
>+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
>+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
>+#
>+# You should have received a copy of the GNU General Public License along
>+# with Koha; if not, write to the Free Software Foundation, Inc.,
>+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
>+
>+use Modern::Perl;
>+
>+use Test::More tests => 8;
>+
>+BEGIN {
>+    use_ok('Koha::ItemType');
>+}
>+
>+my $data = {
>+    itemtype       => 'CODE',
>+    description    => 'description',
>+    rentalcharge   => 'rentalcharge',
>+    imageurl       => 'imageurl',
>+    summary        => 'summary',
>+    checkinmsg     => 'checkinmsg',
>+    checkinmsgtype => 'checkinmsgtype',
>+};
>+
>+my $type = Koha::ItemType->new($data);
>+
>+is( $type->code, 'CODE', 'itemtype/code' );
>+is( $type->description, 'description', 'description' );
>+is( $type->rentalcharge, 'rentalcharge', 'rentalcharge' );
>+is( $type->imageurl, 'imageurl', 'imageurl' );
>+is( $type->summary, 'summary', 'summary' );
>+is( $type->checkinmsg, 'checkinmsg', 'checkinmsg' );
>+is( $type->checkinmsgtype, 'checkinmsgtype', 'checkinmsgtype' );
>diff --git a/t/Koha_ElasticSearch.t b/t/Koha_ElasticSearch.t
>new file mode 100644
>index 0000000..8888f2c
>--- /dev/null
>+++ b/t/Koha_ElasticSearch.t
>@@ -0,0 +1,23 @@
>+#
>+#===============================================================================
>+#
>+#         FILE: Koha_ElasticSearch.t
>+#
>+#  DESCRIPTION:
>+#
>+#        FILES: ---
>+#         BUGS: ---
>+#        NOTES: ---
>+#       AUTHOR: Chris Cormack (rangi), chrisc@catalyst.net.nz
>+# ORGANIZATION: Koha Development Team
>+#      VERSION: 1.0
>+#      CREATED: 09/12/13 08:56:44
>+#     REVISION: ---
>+#===============================================================================
>+
>+use strict;
>+use warnings;
>+
>+use Test::More tests => 1;    # last test to print
>+
>+use_ok('Koha::ElasticSearch');
>diff --git a/t/Koha_ElasticSearch_Indexer.t b/t/Koha_ElasticSearch_Indexer.t
>new file mode 100644
>index 0000000..6de6a32
>--- /dev/null
>+++ b/t/Koha_ElasticSearch_Indexer.t
>@@ -0,0 +1,50 @@
>+#
>+#===============================================================================
>+#
>+#         FILE: Koha_ElasticSearch_Indexer.t
>+#
>+#  DESCRIPTION:
>+#
>+#        FILES: ---
>+#         BUGS: ---
>+#        NOTES: ---
>+#       AUTHOR: Chris Cormack (rangi), chrisc@catalyst.net.nz
>+# ORGANIZATION: Koha Development Team
>+#      VERSION: 1.0
>+#      CREATED: 09/12/13 08:57:25
>+#     REVISION: ---
>+#===============================================================================
>+
>+use strict;
>+use warnings;
>+
>+use Test::More tests => 5;    # last test to print
>+use MARC::Record;
>+
>+use_ok('Koha::ElasticSearch::Indexer');
>+
>+ok(
>+    my $indexer = Koha::ElasticSearch::Indexer->new(
>+        {
>+            'nodes' => ['localhost:9200'],
>+            'index' => 'mydb'
>+        }
>+    ),
>+    'Creating new indexer object'
>+);
>+
>+my $marc_record = MARC::Record->new();
>+my $field = MARC::Field->new( '001', '1234567' );
>+$marc_record->append_fields($field);
>+$field = MARC::Field->new( '020', '', '', 'a' => '1234567890123' );
>+$marc_record->append_fields($field);
>+$field = MARC::Field->new( '245', '', '', 'a' => 'Title' );
>+$marc_record->append_fields($field);
>+
>+my $records = [$marc_record];
>+ok( my $converted = $indexer->convert_marc_to_json($records),
>+    'Convert some records' );
>+
>+is( $converted->count, 1, 'One converted record' );
>+
>+ok( $indexer->update_index($records), 'Update Index' );
>diff --git a/t/Koha_ElasticSearch_Search.t b/t/Koha_ElasticSearch_Search.t
>new file mode 100644
>index 0000000..081b162
>--- /dev/null
>+++ b/t/Koha_ElasticSearch_Search.t
>@@ -0,0 +1,38 @@
>+#
>+#===============================================================================
>+#
>+#         FILE: Koha_ElasticSearch_Search.t
>+#
>+#  DESCRIPTION:
>+#
>+#        FILES: ---
>+#         BUGS: ---
>+#        NOTES: ---
>+#       AUTHOR: Chris Cormack (rangi), chrisc@catalyst.net.nz
>+# ORGANIZATION: Koha Development Team
>+#      VERSION: 1.0
>+#      CREATED: 09/12/13 09:43:29
>+#     REVISION: ---
>+#===============================================================================
>+
>+use strict;
>+use warnings;
>+
>+use Test::More tests => 6;    # last test to print
>+
>+use_ok('Koha::ElasticSearch::Search');
>+
>+ok(
>+    my $searcher = Koha::ElasticSearch::Search->new(
>+        { 'nodes' => ['localhost:9200'], 'index' => 'mydb' }
>+    ),
>+    'Creating a Koha::ElasticSearch::Search object'
>+);
>+
>+is( $searcher->index, 'mydb', 'Testing basic accessor' );
>+
>+ok( $searcher->connect, 'Connect to ElasticSearch server' );
>+ok( my $results = $searcher->search( { record => 'easy' } ), 'Do a search ' );
>+
>+ok( my $marcresults = $searcher->marc_search( { record => 'Fish' } ),
>+    'Do a marc search' );
>diff --git a/t/db_dependent/Koha/ItemTypes.pm b/t/db_dependent/Koha/ItemTypes.pm
>new file mode 100755
>index 0000000..dfefd08
>--- /dev/null
>+++ b/t/db_dependent/Koha/ItemTypes.pm
>@@ -0,0 +1,64 @@
>+#!/usr/bin/perl
>+#
>+# Copyright 2014 Catalyst IT
>+#
>+# This file is part of Koha.
>+#
>+# Koha is free software; you can redistribute it and/or modify it under the
>+# terms of the GNU General Public License as published by the Free Software
>+# Foundation; either version 3 of the License, or (at your option) any later
>+# version.
>+#
>+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY
>+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
>+# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
>+#
>+# You should have received a copy of the GNU General Public License along
>+# with Koha; if not, write to the Free Software Foundation, Inc.,
>+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
>+
>+# XXX This doesn't work because I need to figure out how to do transactions
>+# in a test-case with DBIx::Class
>+
>+use Modern::Perl;
>+
>+use Test::More tests => 17;
>+use Data::Dumper;
>+
>+BEGIN {
>+    use_ok('Koha::ItemTypes');
>+}
>+
>+my $dbh = C4::Context->dbh;
>+
>+# Start transaction
>+$dbh->{AutoCommit} = 0;
>+$dbh->{RaiseError} = 1;
>+
>+my $prep = $dbh->prepare('INSERT INTO itemtypes (itemtype, description, rentalcharge, imageurl, summary, checkinmsg, checkinmsgtype) VALUES (?,?,?,?,?,?,?)');
>+$prep->execute('type1', 'description', 'rentalcharge', 'imageurl', 'summary', 'checkinmsg', 'checkinmsgtype');
>+$prep->execute('type2', 'description', 'rentalcharge', 'imageurl', 'summary', 'checkinmsg', 'checkinmsgtype');
>+
>+my $itypes = Koha::ItemTypes->new();
>+
>+my @types = $itypes->get_itemtype('type1', 'type2');
>+
>+my $type = $types[0];
>+ok(defined($type), 'first result');
>+is( $type->code, 'type1', 'itemtype/code' );
>+is( $type->description, 'description', 'description' );
>+is( $type->rentalcharge, 'rentalcharge', 'rentalcharge' );
>+is( $type->imageurl, 'imageurl', 'imageurl' );
>+is( $type->summary, 'summary', 'summary' );
>+is( $type->checkinmsg, 'checkinmsg', 'checkinmsg' );
>+is( $type->checkinmsgtype, 'checkinmsgtype', 'checkinmsgtype' );
>+
>+$type = $types[1];
>+ok(defined($type), 'second result');
>+is( $type->code, 'type2', 'itemtype/code' );
>+is( $type->description, 'description', 'description' );
>+is( $type->rentalcharge, 'rentalcharge', 'rentalcharge' );
>+is( $type->imageurl, 'imageurl', 'imageurl' );
>+is( $type->summary, 'summary', 'summary' );
>+is( $type->checkinmsg, 'checkinmsg', 'checkinmsg' );
>+is( $type->checkinmsgtype, 'checkinmsgtype', 'checkinmsgtype' );
>-- 
>1.9.1
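A note on what the QueryBuilder helpers in this patch produce: they assemble plain Lucene-style strings for Elasticsearch's query_string query. Here is a minimal, self-contained sketch of how the _modify_string_by_type and _join_queries logic composes; the subroutines below are local re-implementations for illustration only, not the patch's API:

    use Modern::Perl;

    # Apply an index "type" to an operand, as _modify_string_by_type does:
    # right-truncation appends a wildcard, phrase wraps the term in quotes.
    sub apply_type {
        my (%idx) = @_;
        my $str = $idx{operand};
        return $str unless $str;
        $str .= '*'       if ( $idx{type} // '' ) eq 'right-truncate';
        $str = qq{"$str"} if ( $idx{type} // '' ) eq 'phrase';
        return $str;
    }

    # Parenthesise each part and AND them together, as _join_queries does.
    sub join_parts {
        my @parts = grep { defined $_ && $_ ne '' } @_;
        return () unless @parts;
        return $parts[0] if @parts < 2;
        return join ' AND ', map { "($_)" } @parts;
    }

    my $title  = 'title:' . apply_type( type => 'phrase', operand => 'black cat' );
    my $author = 'author:' . apply_type( type => 'right-truncate', operand => 'poe' );
    say join_parts( $title, $author );
    # prints: (title:"black cat") AND (author:poe*)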
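The indexing side, as misc/search_tools/rebuild_elastic_search.pl implements it, reduces to: iterate over the biblios, buffer (biblionumber, record) pairs, and flush each batch with update_index. A condensed sketch, assuming (as the script does) that the iterator's next() returns such pairs and that Koha::ElasticSearch::Indexer->new takes an index name:

    use Modern::Perl;
    use Koha::Biblio;
    use Koha::ElasticSearch::Indexer;

    my $indexer  = Koha::ElasticSearch::Indexer->new( { index => 'biblios' } );
    my $iterator = Koha::Biblio->get_all_biblios_iterator();

    my ( @bibnums, @records );
    while ( my ( $bibnum, $record ) = $iterator->next() ) {
        push @bibnums, $bibnum;
        push @records, $record;
        if ( @records >= 100 ) {    # same batching the --commit option controls
            $indexer->update_index( \@bibnums, \@records );
            @bibnums = ();
            @records = ();
        }
    }
    # Flush whatever is left in the buffers.
    $indexer->update_index( \@bibnums, \@records ) if @records;

In practice one would simply run the script itself, e.g. misc/search_tools/rebuild_elastic_search.pl --delete --verbose, rather than driving the indexer by hand.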
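On the query side, opac/elasticsearch.pl shows the intended pairing of the Elasticsearch query builder with the searcher. A minimal sketch, assuming the search($query, $page, $count), total, and each behaviours that script relies on; the fields available on each hit depend on the elasticsearch_mapping rows above:

    use Modern::Perl;
    use Koha::SearchEngine::Elasticsearch::QueryBuilder;
    use Koha::ElasticSearch::Search;

    my $builder  = Koha::SearchEngine::Elasticsearch::QueryBuilder->new();
    my $searcher = Koha::ElasticSearch::Search->new( { index => 'biblios' } );

    my $query   = $builder->build_query('black cat');
    my $results = $searcher->search( $query, 1, 20 );    # page 1, 20 hits per page

    say 'Found ' . $results->total . ' records';
    $results->each(
        sub {
            my ($hit) = @_;    # one result, as exposed to the template as _source
            say $hit->{title} // '(no title)';
        }
    );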