Bugzilla – Attachment 50254 Details for Bug 10662 - Build OAI-PMH Harvesting Client
Description: Bug 10662 - Build OAI-PMH Harvesting Client
Filename: Bug-10662---Build-OAI-PMH-Harvesting-Client.patch
MIME Type: text/plain
Creator: David Cook
Created: 2016-04-15 07:02:03 UTC
Size: 149.22 KB
Flags: patch, obsolete
>From 40144278dba416f4233e677d3ae57c074e112692 Mon Sep 17 00:00:00 2001
>From: David Cook <dcook@prosentient.com.au>
>Date: Tue, 8 Sep 2015 12:52:51 +1000
>Subject: [PATCH] Bug 10662 - Build OAI-PMH Harvesting Client
>
>_TEST PLAN_
>
>In the following steps replace "/home/user" with whatever your Koha dev install is
>
>_CONFIGURATION_
>1) Apply Bugzilla patches 15555, 15541, then 10662
>2) Upgrade your Koha dev install so that it uses new Zebra configuration files and updates your koha-conf.xml; try something like the following:
>    perl Makefile.PL --prev-install-log "$INSTALL_LOG"
>    make
>    make test
>    make upgrade
>3) Run upgrade database:
>    perl installer/data/mysql/updatedatabase.pl
>4) At this stage, in your koha-conf.xml, you should see something like the following at the end of the file:
><icarus>
>    <socket>unix:/home/user/koha-dev/var/run/icarus.sock</socket>
>    <pidfile>/home/user/koha-dev/var/run/icarus.pid</pidfile>
>    <log>/home/user/koha-dev/var/log/icarus.log</log>
>    <task_plugin>Koha::Icarus::Task::Enqueue::OAIPMH::Biblio</task_plugin>
>    <task_plugin>Koha::Icarus::Task::Dequeue::OAIPMH::Biblio</task_plugin>
>    <max_tasks>30</max_tasks>
></icarus>
>5) Set PERL5LIB to include /home/user/koha (i.e. path to your C4 and Koha module directories)
>    export PERL5LIB=/home/user/koha
>6) In Koha, create a record matching rule:
>    Code = OAI
>    Match threshold = 100
>    Record type = Bibliographic
>
>    Match point 1
>    Search index = control-number
>    Score = 100
>    Tag = 001
>
>    Match point 2
>    Search index = id-other,st-urx
>    Score = 100
>    Tag = 024
>    Subfields = a
>    Normalization rule = raw
>
>_TESTING_
>7) Activate Icarus using the following command:
>    perl /home/user/koha/misc/bin/icarusd.pl -f /home/user/koha-dev/etc/koha-conf.xml
>    NOTE: It will send output to your terminal window. To write to the log file, you can daemonize by adding the "-d" or "--daemon" options.
>    NOTE: If you want lots of debugging info in your logs add "-v 9"
>
>_USER DOWNLOAD TASK_
>7a) Go to Koha administration > Saved tasks (http://KOHA/cgi-bin/koha/admin/saved_tasks.pl)
>7b) Click "New saved task"
>7c) Leave it on "Koha::Icarus::Task::Download::OAIPMH::Biblio" and click "Next"
>7d) Choose a "Start time" in the past using the calendar pop-up
>7e) Choose a "Repeat interval" of at least 30 seconds (for initial troubleshooting purposes)
>7f) Choose a URL of an OAI-PMH repository that you want to harvest (also include a username, password, and realm if necessary)
>7g) Fill out your OAI-PMH repositories as you like
>7h) Fill out "Queue" with something like file:///home/user/koha/icarus_test
>7i) Click "Save"
>
>8a) Check that the Icarus dashboard has a "Status" of "Online"
>8b) Click "Send to Icarus" next to your new saved task entry
>8c) A task should now appear under "Active Icarus tasks"; click "Start"
>8d) Go back to your terminal to check the Icarus server output (or 'tail -f' the log if you daemonized)
>8e) You should notice activity; you can also check /home/user/koha/icarus_test to see if records are being downloaded and stored there.
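
Steps 8a-8c drive Icarus through the staff client, but the same "add task" and "start task" commands can also be sent from a short script against the Koha::Icarus client API that this patch introduces; the task hashref below mirrors the structure built by Koha::SavedTask->serialize. This is a minimal sketch (not part of the attached patch), assuming the patch's modules are on PERL5LIB, icarusd.pl is listening on the socket configured in step 4, and the repository URL and parameter values are placeholders:

    #!/usr/bin/perl
    # Illustrative sketch: drive the Icarus listener from a one-off script
    # instead of the "Send to Icarus"/"Start" buttons (steps 8a-8c).
    use Modern::Perl;
    use Data::Dumper;
    use Koha::Icarus;

    # The socket URI must match the <socket> element in koha-conf.xml (step 4).
    my $icarus = Koha::Icarus->new({
        socket_uri => 'unix:/home/user/koha-dev/var/run/icarus.sock',
    });
    die "Could not connect to Icarus\n" unless $icarus->connect;

    # Same shape as a serialized saved task: type, start, repeat_interval,
    # plus the plugin-specific params read by the download task plugin.
    my $task = {
        type            => 'Koha::Icarus::Task::Download::OAIPMH::Biblio',
        start           => '2016-04-15 00:00:00',  # '%F %T' local time or a UNIX epoch
        repeat_interval => 30,                      # seconds
        params          => {
            url            => 'http://oai.example.org/oai',  # placeholder repository
            verb           => 'ListRecords',
            metadataPrefix => 'marcxml',
            from           => '',
            until          => '',
            sets           => '',
            queue          => 'file:///home/user/koha/icarus_test',
        },
    };

    my $added = $icarus->add_task({ task => $task });
    if ( $added && $added->{action} && $added->{action} eq 'added' ) {
        my $started = $icarus->start_task({ task_id => $added->{task_id} });
        say "Task $added->{task_id}: " . ( $started->{action} // 'unknown' );
    }
    print Dumper( $icarus->list_tasks );

On success the listener replies with JSON over the same socket, and list_tasks returns the arrayref of active tasks that the "Active Icarus tasks" table in step 8c displays.
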
>
>_USER IMPORT TASK_
>9a) Go back to Koha administration > Saved tasks (http://KOHA/cgi-bin/koha/admin/saved_tasks.pl)
>9b) Click "New saved task"
>9c) Choose Koha::Icarus::Task::Upload::OAIPMH::Biblio, and click "Next"
>9d) Repeat the same steps for "Start time" and "Repeat interval" as these are common to all tasks
>9e) "Queue" should be the same as before, so try your path of file:///home/user/koha/icarus_test
>9f) Provide a username and password for the API authentication; you should be able to use the default URL
>9g) You should be able to trust the "Import target parameters" URL
>9h) Write "OAI" for "Record matching rule code", if you added it earlier in the configuration steps
>9i) Leave the "Action..." defaults...
>9j) Change "Filter" to "file:///home/user/koha/koha-tmpl/intranet-tmpl/prog/en/xslt/OAI2MARC21slim.xsl"
>9k) Follow the same steps as above for sending the task to Icarus and telling Icarus to start it
>9l) You can either watch Icarus's server output, or look at http://KOHA/cgi-bin/koha/tools/manage-oai-import.pl,
>or look directly at your database's "import_oai" and "biblio" tables to see how records are imported to Koha
>9m) You'll want to visit http://KOHA/cgi-bin/koha/tools/manage-oai-import.pl in any case, as this is your dashboard
>for seeing incoming OAI-PMH records. You can view the MARC import batch, see the OAI-PMH record itself, and see
>if the record was imported OK or if there was an ERROR.
>9n) Resolving ERRORs is a work in progress...
>---
> Koha/Icarus.pm | 177 +++++++++++
> Koha/Icarus/Base.pm | 32 ++
> Koha/Icarus/Listener.pm | 328 +++++++++++++++++++
> Koha/Icarus/Task.pm | 315 +++++++++++++++++++
> Koha/Icarus/Task/Base.pm | 24 ++
> Koha/Icarus/Task/Download/OAIPMH/Biblio.pm | 316 +++++++++++++++++++
> Koha/Icarus/Task/Upload/OAIPMH/Biblio.pm | 118 +++++++
> Koha/OAI/Client/Record.pm | 249 +++++++++++++++
> Koha/SavedTask.pm | 86 +++++
> Koha/SavedTasks.pm | 62 ++++
> Koha/Schema/Result/ImportOai.pm | 152 +++++++++
> Koha/Schema/Result/SavedTask.pm | 98 ++++++
> Makefile.PL | 19 +-
> admin/saved_tasks.pl | 347 +++++++++++++++++++++
> docs/Icarus/README | 64 ++++
> etc/koha-conf.xml | 8 +
> .../bug_10662-Build_import_oai_table.sql | 25 ++
> installer/data/mysql/kohastructure.sql | 31 ++
> .../intranet-tmpl/prog/en/includes/admin-menu.inc | 1 +
> .../tasks/KohaIcarusTaskDownloadOAIPMHBiblio.inc | 87 ++++++
> .../tasks/KohaIcarusTaskUploadOAIPMHBiblio.inc | 143 +++++++++
> .../prog/en/modules/admin/admin-home.tt | 2 +
> .../prog/en/modules/admin/saved_tasks.tt | 338 ++++++++++++++++++++
> .../prog/en/modules/tools/manage-oai-import.tt | 122 ++++++++
> .../intranet-tmpl/prog/en/xslt/OAI2MARC21slim.xsl | 74 +++++
> misc/bin/icarusd.pl | 181 +++++++++++
> rewrite-config.PL | 2 +
> skel/var/run/koha/icarus/README | 1 +
> svc/import_oai | 143 +++++++++
> tools/manage-oai-import.pl | 128 ++++++++
> 30 files changed, 3672 insertions(+), 1 deletion(-)
> create mode 100755 Koha/Icarus.pm
> create mode 100755 Koha/Icarus/Base.pm
> create mode 100755 Koha/Icarus/Listener.pm
> create mode 100755 Koha/Icarus/Task.pm
> create mode 100755 Koha/Icarus/Task/Base.pm
> create mode 100755 Koha/Icarus/Task/Download/OAIPMH/Biblio.pm
> create mode 100755 Koha/Icarus/Task/Upload/OAIPMH/Biblio.pm
> create mode 100755 Koha/OAI/Client/Record.pm
> create mode 100755 Koha/SavedTask.pm
> create mode 100755 Koha/SavedTasks.pm
> create mode 100755 Koha/Schema/Result/ImportOai.pm
> create mode 100755 Koha/Schema/Result/SavedTask.pm
> create mode 100755 admin/saved_tasks.pl
> create
mode 100755 docs/Icarus/README > create mode 100644 installer/data/mysql/atomicupdate/bug_10662-Build_import_oai_table.sql > create mode 100755 koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskDownloadOAIPMHBiblio.inc > create mode 100755 koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskUploadOAIPMHBiblio.inc > create mode 100644 koha-tmpl/intranet-tmpl/prog/en/modules/admin/saved_tasks.tt > create mode 100755 koha-tmpl/intranet-tmpl/prog/en/modules/tools/manage-oai-import.tt > create mode 100755 koha-tmpl/intranet-tmpl/prog/en/xslt/OAI2MARC21slim.xsl > create mode 100755 misc/bin/icarusd.pl > create mode 100644 skel/var/run/koha/icarus/README > create mode 100755 svc/import_oai > create mode 100755 tools/manage-oai-import.pl > >diff --git a/Koha/Icarus.pm b/Koha/Icarus.pm >new file mode 100755 >index 0000000..b57e691 >--- /dev/null >+++ b/Koha/Icarus.pm >@@ -0,0 +1,177 @@ >+package Koha::Icarus; >+ >+# Copyright 2016 Prosentient Systems >+# >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it under the >+# terms of the GNU General Public License as published by the Free Software >+# Foundation; either version 3 of the License, or (at your option) any later >+# version. >+# >+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY >+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR >+# A PARTICULAR PURPOSE. See the GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License along >+# with Koha; if not, write to the Free Software Foundation, Inc., >+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. >+ >+use Modern::Perl; >+use IO::Socket::UNIX; >+use IO::Select; >+use URI; >+use JSON; >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ return bless ($args, $class); >+} >+ >+sub connected { >+ my ($self) = @_; >+ if ($self->{_connected}){ >+ return 1; >+ } >+} >+ >+sub connect { >+ my ($self) = @_; >+ my $socket_uri = $self->{socket_uri}; >+ if ($socket_uri){ >+ my $uri = URI->new($socket_uri); >+ if ($uri && $uri->scheme eq 'unix'){ >+ my $socket_path = $uri->path; >+ my $socket = IO::Socket::UNIX->new( >+ Type => IO::Socket::UNIX::SOCK_STREAM(), >+ Peer => $socket_path, >+ ); >+ if ($socket){ >+ my $socketio = new IO::Select(); >+ $socketio->add($socket); >+ #FIXME: Should probably fix these return values... 
>+ $self->{_socketio} = $socketio; >+ $self->{_socket} = $socket; >+ my $message = $self->_read(); >+ if ($message eq 'HELLO'){ >+ $self->{_connected} = 1; >+ return 1; >+ } >+ } >+ } >+ } >+ return 0; >+} >+ >+sub add_task { >+ my ($self, $args) = @_; >+ my $task = $args->{task}; >+ if ($task && %$task){ >+ my $response = $self->command("add task", undef, $task); >+ if ($response){ >+ return $response; >+ } >+ } >+} >+ >+sub start_task { >+ my ($self, $args) = @_; >+ my $task_id = $args->{task_id}; >+ if ($task_id){ >+ my $response = $self->command("start task", $task_id); >+ if ($response){ >+ return $response; >+ } >+ } >+} >+ >+sub remove_task { >+ my ($self, $args) = @_; >+ my $task_id = $args->{task_id}; >+ if ($task_id){ >+ my $response = $self->command("remove task", $task_id); >+ if ($response){ >+ return $response; >+ } >+ } >+} >+ >+sub list_tasks { >+ my ($self) = @_; >+ my $response = $self->command("list tasks"); >+ if ($response){ >+ if (my $tasks = $response->{tasks}){ >+ return $tasks; >+ } >+ } >+} >+ >+sub shutdown { >+ my ($self) = @_; >+ my $response = $self->command("shutdown"); >+ if ($response){ >+ return $response; >+ } >+} >+ >+ >+ >+ >+ >+sub command { >+ my ($self, $command, $task_id, $task) = @_; >+ my $serialized = $self->_serialize({ "command" => $command, "task_id" => $task_id, "task" => $task }); >+ if ($serialized){ >+ $self->_write({ serialized => $serialized }); >+ my $json = $self->_read(); >+ if ($json){ >+ my $response = from_json($json); >+ if ($response){ >+ return $response; >+ } >+ } >+ } >+} >+ >+sub _serialize { >+ my ($self, $output) = @_; >+ my $serialized = to_json($output); >+ return $serialized; >+} >+ >+sub _write { >+ my ($self, $args) = @_; >+ my $socket = $self->{_socket}; >+ my $output = $args->{serialized}; >+ if ($output){ >+ if (my $socketio = $self->{_socketio}){ >+ if (my @filehandles = $socketio->can_write(5)){ >+ foreach my $filehandle (@filehandles){ >+ #Localize output record separator as null >+ local $\ = "\x00"; >+ print $socket $output; >+ } >+ } >+ } >+ } >+} >+ >+sub _read { >+ my ($self) = @_; >+ if (my $socketio = $self->{_socketio}){ >+ if (my @filehandles = $socketio->can_read(5)){ >+ foreach my $filehandle (@filehandles){ >+ #Localize input record separator as null >+ local $/ = "\x00"; >+ my $message = <$filehandle>; >+ chomp($message) if $message; >+ return $message; >+ } >+ } >+ } >+} >+ >+ >+ >+1; >\ No newline at end of file >diff --git a/Koha/Icarus/Base.pm b/Koha/Icarus/Base.pm >new file mode 100755 >index 0000000..8e6423f >--- /dev/null >+++ b/Koha/Icarus/Base.pm >@@ -0,0 +1,32 @@ >+package Koha::Icarus::Base; >+ >+use Modern::Perl; >+use DateTime; >+ >+use constant DEBUG => 9; >+use constant SILENT => 0; >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ return bless ($args, $class); >+} >+ >+sub debug { >+ my ($self,$message) = @_; >+ if ($self->{Verbosity} == DEBUG){ >+ $self->log($message); >+ } >+} >+ >+sub log { >+ my ($self,$message) = @_; >+ my $id = $self->{_id}; >+ my $component = $self->{_component} // "component"; >+ if ( ($self->{Verbosity}) && ($self->{Verbosity} > SILENT) ){ >+ my $now = DateTime->now(time_zone => "local"); >+ say "[$now] [$component $id] $message"; >+ } >+} >+ >+1; >\ No newline at end of file >diff --git a/Koha/Icarus/Listener.pm b/Koha/Icarus/Listener.pm >new file mode 100755 >index 0000000..5abab3b >--- /dev/null >+++ b/Koha/Icarus/Listener.pm >@@ -0,0 +1,328 @@ >+package Koha::Icarus::Listener; >+ >+use Modern::Perl; >+use parent 
'Koha::Icarus::Base'; >+ >+use POE qw(Wheel::ReadWrite Wheel::SocketFactory Wheel::Run); >+use IO::Socket qw(AF_UNIX); >+use URI; >+use Koha::Icarus::Task; >+use JSON; #For "on_client_input" >+ >+my $null_filter = POE::Filter::Line->new( >+ Literal => chr(0), >+); >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ $args->{_component} = "server"; >+ $args->{_id} = "undefined"; >+ return bless ($args, $class); >+} >+ >+#NOTE: "spawn" inspired by http://poe.perl.org/?POE_Cookbook/Object_Methods >+sub spawn { >+ my ($class, $args) = @_; >+ my $self = $class->new($args); >+ POE::Session->create( >+ object_states => [ >+ $self => { >+ _start => "on_server_start", >+ shutdown => "shutdown", >+ set_verbosity => "set_verbosity", >+ _child => "on_task_event", >+ got_list_tasks => "on_list_tasks", >+ graceful_shutdown => "graceful_shutdown", >+ got_client_accept => "on_client_accept", >+ got_client_error => "on_client_error", >+ got_server_error => "on_server_error", >+ got_add_task => "on_add_task", >+ got_client_input => "on_client_input", >+ }, >+ ], >+ ); >+} >+ >+#Methods for POE::Session >+ >+sub on_server_start { >+ my ($self, $kernel,$heap,$session) = @_[OBJECT, KERNEL,HEAP,SESSION]; >+ my $server_id = $session->ID; >+ $self->{_id} = $server_id; #Set internal id for logging purposes >+ >+ my $bind_address_uri = $self->{Socket}; >+ my $max_tasks = $self->{MaxTasks}; >+ >+ $kernel->sig(INT => "graceful_shutdown"); >+ $kernel->sig(TERM => "graceful_shutdown"); >+ >+ $heap->{max_tasks} = $max_tasks // 25; #Default maximum of 25 unless otherwise specified >+ >+ $self->log("Maximum number of tasks allowed: $heap->{max_tasks}"); >+ $self->log("Starting server..."); >+ >+ my %server_params = ( >+ SuccessEvent => "got_client_accept", >+ FailureEvent => "got_server_error", >+ ); >+ >+ #TODO: At this time, only "unix" sockets are supported. In future, perhaps TCP/IP sockets could also be supported. >+ my $uri = URI->new($bind_address_uri); >+ my $scheme = $uri->scheme; >+ >+ if ($scheme eq 'unix'){ >+ my $bind_address = $uri->path; >+ $server_params{SocketDomain} = AF_UNIX; >+ $server_params{BindAddress} = $bind_address; >+ #When starting a unix socket server, you need to remove any existing references to that socket file. >+ if ($bind_address && (-e $bind_address) ){ >+ unlink $bind_address; >+ } >+ } >+ >+ $heap->{server} = POE::Wheel::SocketFactory->new(%server_params); >+ >+ if ($scheme eq 'unix'){ >+ #FIXME/DEBUGGING: This is a way to force a permission denied error... 
>+ #chmod 0755, $uri->path; >+ #Make the socket writeable to other users like Apache >+ chmod 0666, $uri->path; >+ } >+ >+} >+ >+sub shutdown { >+ my ($self,$heap,$session,$kernel) = @_[OBJECT, HEAP,SESSION,KERNEL]; >+ >+ if ($heap->{server}){ >+ $self->log("Shutting down server..."); >+ #Delete the server, so that you can't get any new connections >+ delete $heap->{server} if $heap->{server}; >+ } >+ >+ if ($heap->{client}){ >+ $self->log("Shutting down any remaining clients..."); >+ #Delete the clients, so that you bring down the existing connections >+ delete $heap->{client}; #http://www.perlmonks.org/?node_id=176971 >+ } >+} >+ >+sub on_task_event { >+ my ($self, $kernel, $heap,$session) = @_[OBJECT,KERNEL, HEAP,SESSION]; >+ my ($action,$child_session,$task) = @_[ARG0,ARG1,ARG2]; >+ >+ my $child_id = $child_session->ID; >+ >+ $self->debug("$action child $child_id"); >+ >+ >+ if ($action eq 'create'){ >+ #NOTE: The $task variable is returned by the child POE session's _start event >+ my $task_id = $child_session->ID; >+ $heap->{tasks}->{$task_id}->{task} = $task; >+ >+ } elsif ($action eq 'lose'){ >+ my $task_id = $child_session->ID; >+ delete $heap->{tasks}->{$task_id}; >+ } >+} >+ >+#TODO: Put this in a parent class? >+sub set_verbosity { >+ my ($self,$session,$kernel,$new_verbosity) = @_[OBJECT,SESSION,KERNEL,ARG0]; >+ if (defined $new_verbosity){ >+ $self->{Verbosity} = $new_verbosity; >+ } >+} >+ >+sub on_list_tasks { >+ my ($self, $kernel, $heap,$session) = @_[OBJECT, KERNEL, HEAP,SESSION]; >+ >+ #DEBUG: You can access the POE::Kernel's sessions with "$POE::Kernel::poe_kernel->[POE::Kernel::KR_SESSIONS]". >+ #While it's black magic you shouldn't touch, it can be helpful when debugging. >+ >+ my @tasks = (); >+ foreach my $task_id (keys %{$heap->{tasks}} ){ >+ push(@tasks,{ task_id => $task_id, task => $heap->{tasks}->{$task_id}->{task} }); >+ } >+ return \@tasks; >+} >+ >+sub graceful_shutdown { >+ my ($self, $heap,$session,$kernel,$signal) = @_[OBJECT, HEAP,SESSION,KERNEL,ARG0]; >+ >+ #Tell the kernel that you're handling the signal sent to this session >+ $kernel->sig_handled(); >+ $kernel->sig($signal); >+ >+ my $tasks = $kernel->call($session,"got_list_tasks"); >+ >+ >+ if ( $heap->{tasks} && %{$heap->{tasks}} ){ >+ $self->log("Waiting for tasks to finish..."); >+ foreach my $task_id (keys %{$heap->{tasks}}){ >+ $self->log("Task $task_id still exists..."); >+ $kernel->post($task_id,"got_task_stop"); >+ } >+ } else { >+ $self->log("All tasks have finished"); >+ $kernel->yield("shutdown"); >+ return; >+ } >+ >+ $self->log("Attempting graceful shutdown in 1 second..."); >+ #NOTE: Basically, we just try another graceful shutdown on the next tick. >+ $kernel->delay("graceful_shutdown" => 1); >+} >+ >+#Accept client connection to listener >+sub on_client_accept { >+ my ($self, $client_socket, $server_wheel_id, $heap, $session) = @_[OBJECT, ARG0, ARG3, HEAP,SESSION]; >+ >+ my $client_wheel = POE::Wheel::ReadWrite->new( >+ Handle => $client_socket, >+ InputEvent => "got_client_input", >+ ErrorEvent => "got_client_error", >+ InputFilter => $null_filter, >+ OutputFilter => $null_filter, >+ ); >+ >+ $client_wheel->put("HELLO"); >+ $heap->{client}->{ $client_wheel->ID() } = $client_wheel; >+ >+ $self->debug("Connection ".$client_wheel->ID()." 
started."); >+ >+} >+ >+#Handle server error - shutdown server >+sub on_server_error { >+ my ($self, $operation, $errnum, $errstr, $heap, $session) = @_[OBJECT, ARG0, ARG1, ARG2,HEAP, SESSION]; >+ $self->debug("Server $operation error $errnum: $errstr\n"); >+ delete $heap->{server}; >+} >+ >+#Handle client error - including disconnect >+sub on_client_error { >+ my ($self, $wheel_id,$heap,$session) = @_[OBJECT, ARG3,HEAP,SESSION]; >+ >+ $self->debug("Connection $wheel_id failed or ended."); >+ >+ delete $heap->{client}->{$wheel_id}; >+ >+} >+ >+sub on_add_task { >+ my ($self, $message, $kernel, $heap, $session) = @_[OBJECT, ARG0, KERNEL, HEAP,SESSION]; >+ >+ #Fetch a list of all tasks >+ my @task_keys = keys %{$heap->{tasks}}; >+ >+ #If the number in the list is less than the max, add a new task >+ #else die. >+ if (scalar @task_keys < $heap->{max_tasks}){ >+ my $server_id = $session->ID; >+ my $task_session = Koha::Icarus::Task->spawn({ message => $message, server_id => $server_id, Verbosity => $self->{Verbosity}, }); >+ return $task_session->ID; >+ } else { >+ #This die should be caught by the event caller... >+ die "Maximum number of tasks already reached.\n"; >+ } >+} >+ >+sub on_client_input { >+ my ($self, $input, $wheel_id, $session, $kernel, $heap) = @_[OBJECT, ARG0, ARG1, SESSION, KERNEL, HEAP]; >+ >+ #Store server id more explicitly >+ my $server_id = $session->ID; >+ >+ #Server listener has received input from client >+ my $client = $heap->{client}->{$wheel_id}; >+ >+ #Parse input from client >+ my $message = from_json($input); >+ >+ if ( ref $message eq 'HASH' ){ >+ #Read "command" from client >+ if (my $command = $message->{command}){ >+ $self->log("Message received with command \"$command\"."); >+ if ($command eq 'add task'){ >+ my $output = {}; >+ >+ #Create a task session >+ eval { >+ #NOTE: The server automatically keeps track of its child tasks >+ my $task_id = $kernel->call($server_id,"got_add_task",$message); >+ >+ $output->{action} = "added"; >+ $output->{task_id} = $task_id; >+ }; >+ if ($@){ >+ $self->debug("$@"); >+ chomp($@); >+ $output->{action} = "error"; >+ $output->{error_message} = $@; >+ } >+ my $server_output = to_json($output); >+ $client->put($server_output); >+ return; >+ >+ } elsif ( ($command eq 'remove task') || ($command eq 'start task' ) ){ >+ >+ my $task_id = $message->{task_id}; >+ >+ my $output = { >+ task_id => $task_id, >+ }; >+ >+ if ($command eq 'remove task'){ >+ $kernel->call($task_id,"got_task_stop"); >+ $output->{action} = "removed"; >+ } elsif ($command eq 'start task'){ >+ my $response = $kernel->call($task_id, "on_task_init"); >+ $output->{action} = $response; >+ } >+ >+ if ($!){ >+ $output->{action} = "error"; >+ $output->{error_message} = $!; >+ } >+ >+ #FIXME: What do we actually want to send back to the client? >+ my $server_output = to_json($output); >+ $client->put($server_output); >+ return; >+ >+ } elsif ($command eq 'list tasks'){ >+ >+ #Get tasks from listener (ie self) >+ my $tasks = $kernel->call($server_id, "got_list_tasks"); >+ >+ #Prepare output for client >+ my $server_output = to_json({tasks => $tasks}, {pretty => 1}); >+ >+ #Send output to client >+ $client->put($server_output); >+ return; >+ >+ } elsif ($command eq 'shutdown'){ >+ $kernel->post($server_id, "graceful_shutdown"); >+ my $server_output = to_json({action => 'shutting down'}); >+ $client->put($server_output); >+ return; >+ } else { >+ $self->log("The message contained an invalid command!"); >+ $client->put("Sorry! 
That is an invalid command!"); >+ return; >+ } >+ } else { >+ $self->log("The message was missing a command!"); >+ } >+ } else { >+ $self->log("The message was malformed!"); >+ } >+ $client->put("Sorry! That is an invalid message!"); >+ return; >+} >+ >+1; >diff --git a/Koha/Icarus/Task.pm b/Koha/Icarus/Task.pm >new file mode 100755 >index 0000000..2e8fe78 >--- /dev/null >+++ b/Koha/Icarus/Task.pm >@@ -0,0 +1,315 @@ >+package Koha::Icarus::Task; >+ >+use Modern::Perl; >+use parent 'Koha::Icarus::Base'; >+ >+use POE qw(Wheel::Run); >+use DateTime; >+use DateTime::Format::Strptime; >+use JSON; >+use Module::Load::Conditional qw/can_load/; >+ >+my $datetime_pattern = DateTime::Format::Strptime->new( >+ pattern => '%F %T', >+ time_zone => 'local', >+); >+my $epoch_pattern = DateTime::Format::Strptime->new( >+ pattern => '%s', >+); >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ $args->{_component} = "task"; >+ $args->{_id} = "undefined"; >+ return bless ($args, $class); >+} >+ >+#NOTE: "spawn" inspired by http://poe.perl.org/?POE_Cookbook/Object_Methods >+sub spawn { >+ my ($class, $args) = @_; >+ my $self = $class->new($args); >+ my $task_session = POE::Session->create( >+ object_states => [ >+ $self => { >+ _start => "on_task_create", >+ "got_child_stdout" => "on_child_stdout", >+ "got_child_stderr" => "on_child_stderr", >+ "got_child_close" => "on_child_close", >+ "got_child_signal" => "on_child_signal", >+ "got_terminal_signal" => "on_terminal_signal", >+ "child_process_success" => "child_process_success", >+ "got_task_stop" => "on_task_stop", >+ "on_task_init" => "on_task_init", >+ "on_task_start" => "on_task_start", >+ }, >+ ], >+ ); >+ return $task_session; >+} >+ >+sub on_task_create { >+ my ($self, $session, $kernel, $heap) = @_[OBJECT, SESSION, KERNEL, HEAP]; >+ >+ #Trap terminal signals so that the task can stop gracefully. >+ $kernel->sig(INT => "got_terminal_signal"); >+ $kernel->sig(TERM => "got_terminal_signal"); >+ >+ my $task_id = $session->ID; >+ if ($task_id){ >+ #Tell the kernel that this task is waiting for an external action (ie keepalive counter) >+ $kernel->refcount_increment($task_id,"waiting task"); >+ $self->{_id} = $task_id; #Set internal id for logging purposes >+ } >+ >+ my $server_id = $self->{server_id}; >+ if ($server_id){ >+ $heap->{server_id} = $server_id; >+ } >+ >+ my $task = undef; >+ my $message = $self->{message}; >+ if ($message){ >+ $task = $message->{task}; >+ if ($task){ >+ $task->{status} = 'new'; >+ $heap->{task} = $task; >+ } >+ } >+ return $task; #This return value is used by the parent POE session's _child handler >+} >+ >+#This sub is just to start it now, or set it to start in the future... if the time is now or in the past, it starts now... if it's in the future, it starts in the future... >+sub on_task_init { >+ my ($self, $session, $kernel, $heap) = @_[OBJECT, SESSION, KERNEL, HEAP]; >+ my $response = 'pending'; >+ my $task = $heap->{task}; >+ my $status = $task->{status}; >+ if ($status){ >+ if ($status eq 'started'){ >+ $response = 'already started'; >+ } elsif ($status eq 'pending'){ >+ $response = 'already pending'; >+ } else { >+ $task->{status} = 'pending'; >+ >+ my $start = $task->{start}; >+ my $start_message = $start; >+ >+ >+ my $dt; >+ if ( $dt = $datetime_pattern->parse_datetime($start) ){ >+ #e.g. 2016-04-06 00:00:00 >+ } elsif ( $dt = $epoch_pattern->parse_datetime($start) ){ >+ #e.g. 
1459837498 or apparently 0000-00-00 00:00:00 >+ } else { >+ #If we don't match the datetime_pattern or epoch_pattern, then we start right now. >+ $dt = DateTime->now( time_zone => 'local', ); >+ } >+ if ($dt){ >+ $start = $dt->epoch; >+ $start_message = $dt; >+ } >+ >+ >+ $self->log("Start task at $start_message"); >+ #NOTE: $start must be in UNIX epoch time (ie number of seconds that have elapsed since 00:00:00 UTC Thursday 1 January 1970) >+ $kernel->alarm("on_task_start",$start); >+ } >+ } >+ return $response; >+} >+ >+sub on_task_start { >+ my ($self, $session, $kernel, $heap) = @_[OBJECT, SESSION, KERNEL, HEAP]; >+ my $task = $heap->{task}; >+ $task->{status} = 'started'; >+ >+ if (my $repeat_interval = $task->{repeat_interval}){ >+ #NOTE: Reset the start time with a human readable timestamp >+ my $dt = DateTime->now( time_zone => 'local', ); >+ $dt->add( seconds => $repeat_interval ); >+ $task->{start} = $dt->strftime("%F %T"); >+ } >+ #FIXME: You need to impose child process limits here! How many child processes are allowed to be running at any given time? Well, you can only have one child process per task... >+ #so it's really more of a limit on the number of tasks... you probably need to have an internal task queue... that's easy enough though. >+ my $child = POE::Wheel::Run->new( >+ ProgramArgs => [ $task, ], >+ Program => sub { >+ my ($task) = @_; >+ >+ #Perform some last minute POE calls before running the task module plugin >+ my $session = $poe_kernel->get_active_session(); >+ if ($session){ >+ my $heap = $session->get_heap(); >+ $poe_kernel->call($heap->{server_id},"set_verbosity",0); #This turns off the server logging in this forked process, so the following call() doesn't mess up our logs >+ $poe_kernel->call($heap->{server_id},"shutdown"); #Shutdown the socket listener on the child process, so there's zero chance of writing to or reading from the socket in the child process >+ } >+ >+ #NOTE: I don't know if this really needs to be run, but it shouldn't hurt. >+ $poe_kernel->stop(); >+ >+ #Try to load the task type module. >+ my $task_type = $task->{type}; >+ if ( can_load ( modules => { $task_type => undef, }, ) ){ >+ #Create the object >+ my $task_object = $task_type->new({task => $task, Verbosity => $self->{Verbosity}, }); >+ if ($task_object){ >+ #Synchronous action: run the task module >+ $task_object->run; >+ } >+ } else { >+ die "Couldn't load module $task_type: $Module::Load::Conditional::ERROR" >+ } >+ }, >+ StdoutEvent => "got_child_stdout", >+ StderrEvent => "got_child_stderr", >+ CloseEvent => "got_child_close", >+ NoSetPgrp => 1, #Keep child processes in same group as parent. This is especially useful when using Ctrl+C to kill the whole group. >+ ); >+ >+ $kernel->sig_child($child->PID, "got_child_signal"); >+ # Wheel events include the wheel's ID. >+ $_[HEAP]{children_by_wid}{$child->ID} = $child; >+ # Signal events include the process ID. >+ $_[HEAP]{children_by_pid}{$child->PID} = $child; >+ >+ $self->debug("child pid ".$child->PID." 
started as wheel ".$child->ID); >+} >+ >+sub on_task_stop { >+ my ($self, $session, $kernel, $heap) = @_[OBJECT, SESSION, KERNEL, HEAP]; >+ my $task = $heap->{task}; >+ $task->{status} = 'stopping'; >+ my $task_id = $session->ID; >+ my $server_id = $heap->{server_id}; >+ >+ if ($heap->{stopping}){ >+ $self->debug("Task is already in the process of stopping..."); >+ >+ } else { >+ >+ $self->log("Trying to stop task."); >+ >+ >+ #Mark this task as stopping >+ $heap->{stopping} = 1; >+ >+ #Stop the task from spawning new jobs >+ $kernel->alarm("on_task_start"); >+ >+ my $children_by_pid = $heap->{children_by_pid}; >+ if ($children_by_pid && %$children_by_pid){ >+ >+ $self->debug("Child processes in progres..."); >+ my $child_processes = $heap->{children_by_pid}; >+ foreach my $child_pid (keys %$child_processes){ >+ my $child = $child_processes->{$child_pid}; >+ $self->debug("Telling child pid $child_pid to stop"); >+ $child->put("quit"); >+ #TODO: Perhaps it would be worthwhile having a kill switch too? >+ # my $rv = $child->kill("TERM"); >+ } >+ } >+ >+ $self->log("Removing task keepalive."); >+ >+ $kernel->refcount_decrement($task_id,"waiting task"); >+ } >+} >+ >+sub on_terminal_signal { >+ my ($self, $signal,$session,$kernel) = @_[OBJECT, ARG0,SESSION,KERNEL]; >+ $self->debug("Trapped SIGNAL: $signal."); >+ #Gracefully stop the task >+ $kernel->call($session, "got_task_stop"); >+} >+ >+sub child_process_success { >+ my ($self, $heap,$session,$kernel) = @_[OBJECT, HEAP,SESSION,KERNEL]; >+ my $task = $heap->{task}; >+ if (my $repeat_interval = $task->{repeat_interval}){ >+ if ($heap->{stopping}){ >+ $self->log("Will skip repeating the task, as task is stopping."); >+ } else { >+ $self->log("Will repeat the task"); >+ $task->{status} = "restarting"; >+ $kernel->yield("on_task_init"); >+ } >+ } else { >+ $self->debug("I'm going to stop this task"); >+ $kernel->yield("got_task_stop"); >+ } >+} >+ >+############################################################# >+# # >+# Methods for communicating with child processes # >+# # >+############################################################# >+# Originally inspired by the POE::Wheel::Run perldoc example >+ >+# Wheel event, including the wheel's ID >+sub on_child_stdout { >+ my ($self, $stdout_line, $wheel_id, $session) = @_[OBJECT, ARG0, ARG1, SESSION]; >+ my $child = $_[HEAP]{children_by_wid}{$wheel_id}; >+ #NOTE: Log everything child process sends to STDOUT >+ $self->log("[pid ".$child->PID."] STDOUT: $stdout_line"); >+ >+ #If the child outputs a line to STDOUT which starts with UPDATE_PARAMS=, we capture the data, >+ #and update the task params. >+ if ($stdout_line =~ /^UPDATE_PARAMS=(.*)$/){ >+ my $json_string = $1; >+ my $json = from_json($json_string); >+ my $task = $_[HEAP]->{task}; >+ my $params = $task->{params}; >+ foreach my $key (%$json){ >+ if (defined $params->{$key}){ >+ #FIXME: Don't just overwrite? Only update differences? >+ $params->{$key} = $json->{$key}; >+ } >+ } >+ $_[HEAP]->{task} = $task; >+ } >+} >+ >+# Wheel event, including the wheel's ID. >+sub on_child_stderr { >+ my ($self, $stderr_line, $wheel_id, $session) = @_[OBJECT, ARG0, ARG1,SESSION]; >+ my $child = $_[HEAP]{children_by_wid}{$wheel_id}; >+ #NOTE: Log everything child process sends to STDERR >+ $self->log("[pid ".$child->PID."] STDERR: $stderr_line"); >+} >+ >+# Wheel event, including the wheel's ID. 
>+sub on_child_close { >+ my ($self, $wheel_id,$session,$kernel) = @_[OBJECT, ARG0,SESSION,KERNEL]; >+ >+ my $child = delete $_[HEAP]{children_by_wid}{$wheel_id}; >+ >+ # May have been reaped by on_child_signal(). >+ unless (defined $child) { >+ $self->debug("[wid $wheel_id] closed all pipes."); >+ return; >+ } >+ $self->debug("[pid ".$child->PID."] closed all pipes."); >+ delete $_[HEAP]{children_by_pid}{$child->PID}; >+} >+ >+sub on_child_signal { >+ my ($self, $heap,$kernel,$pid,$exit_code,$session) = @_[OBJECT, HEAP,KERNEL,ARG1,ARG2,SESSION]; >+ >+ #If the child's exit code is 0, handle this successful exit status >+ if ($exit_code == 0){ >+ $kernel->yield("child_process_success"); >+ } >+ $self->debug("pid $pid exited with status $exit_code."); >+ my $child = delete $_[HEAP]{children_by_pid}{$pid}; >+ >+ # May have been reaped by on_child_close(). >+ return unless defined $child; >+ >+ delete $_[HEAP]{children_by_wid}{$child->ID}; >+} >+ >+1; >diff --git a/Koha/Icarus/Task/Base.pm b/Koha/Icarus/Task/Base.pm >new file mode 100755 >index 0000000..8de774c >--- /dev/null >+++ b/Koha/Icarus/Task/Base.pm >@@ -0,0 +1,24 @@ >+package Koha::Icarus::Task::Base; >+ >+use Modern::Perl; >+use IO::Select; >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ return bless ($args, $class); >+} >+ >+sub listen_for_instruction { >+ my ($self) = @_; >+ my $select = $self->{_select} ||= IO::Select->new(\*STDIN); >+ if (my @ready_FHs = $select->can_read(0) ){ >+ foreach my $FH (@ready_FHs){ >+ my $line = $FH->getline(); >+ chomp($line); >+ return $line; >+ } >+ } >+} >+ >+1; >diff --git a/Koha/Icarus/Task/Download/OAIPMH/Biblio.pm b/Koha/Icarus/Task/Download/OAIPMH/Biblio.pm >new file mode 100755 >index 0000000..13f2662 >--- /dev/null >+++ b/Koha/Icarus/Task/Download/OAIPMH/Biblio.pm >@@ -0,0 +1,316 @@ >+package Koha::Icarus::Task::Download::OAIPMH::Biblio; >+ >+use Modern::Perl; >+use parent 'Koha::Icarus::Task::Base'; >+ >+use DateTime; >+use DateTime::Format::Strptime; >+use HTTP::OAI; >+use File::Path qw(make_path); >+use Digest::MD5; >+use JSON; >+use URI; >+ >+my $strp = DateTime::Format::Strptime->new( >+ pattern => '%Y%m%dT%H%M%S.%NZ', >+); >+ >+my $oai_second_granularity = DateTime::Format::Strptime->new( >+ pattern => '%Y-%m-%dT%H:%M:%SZ', >+); >+ >+my $oai_day_granularity = DateTime::Format::Strptime->new( >+ pattern => '%Y-%m-%d', >+); >+ >+sub validate_parameter_names { >+ >+} >+sub validate_repeat_interval { >+ my ($self,$repeat_interval) = @_; >+ if (defined $repeat_interval && $repeat_interval =~ /^\d+$/){ >+ return undef; >+ } >+ $self->{invalid_data}++; >+ return { not_numeric => 1, }; >+} >+ >+sub validate_url { >+ my ($self,$url) = @_; >+ my $response = {}; >+ if (my $url_obj = URI->new($url)){ >+ if ($url_obj->scheme ne "http"){ >+ $response->{not_http} = 1; >+ $self->{invalid_data}++; >+ } >+ if ( ! $url_obj->path){ >+ $response->{no_path} = 1; >+ $self->{invalid_data}++; >+ } >+ } else { >+ $response->{not_a_url} = 1; >+ $self->{invalid_data}++; >+ } >+ >+ return $response; >+} >+ >+sub validate { >+ my ($self, $args) = @_; >+ #Reset the invalid data counter... 
>+ $self->{invalid_data} = 0; >+ my $errors = { }; >+ my $task = $self->{task}; >+ my $tests = $args->{tests}; >+ if ($task){ >+ if ($tests && $tests eq 'all'){ >+ #warn "PARAMS = ".$task->{params}; >+ } >+ } >+ my $params = $task->{params}; >+ >+ #validate_start_time >+ $errors->{"repeat_interval"} = $self->validate_repeat_interval($task->{repeat_interval}); >+ >+ $errors->{"url"} = $self->validate_url($params->{url}); >+ >+ #NOTE: You don't need to validate these 3 HTTP Basic Auth parameters >+ #validate_username >+ #validate_password >+ #validate_realm >+ >+ #OAI-PMH parameters >+ #validate_verb >+ #validate_sets >+ #validate_marcxml >+ #validate_from >+ #validate_until >+ >+ #Download parameters >+ #validate_queue >+ >+ return $errors; >+} >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ $args->{invalid_data} = 0; >+ return bless ($args, $class); >+} >+ >+sub validate_queue { >+ my ( $self ) = @_; >+ my $task = $self->{task}; >+ if (my $queue = $task->{params}->{queue}){ >+ >+ my $queue_uri = URI->new($queue); >+ #TODO: In theory, you could even use a DBI DSN like DBI:mysql:database=koha;host=koha.db;port=3306. >+ #Then you could provide the table, username, and password in the params as well... >+ >+ #NOTE: If the queue directory doesn't exist on the filesystem, we try to make it and change to it. >+ if ($queue_uri->scheme eq 'file'){ >+ my $filepath = $queue_uri->file; >+ if ( ! -d $filepath ){ >+ make_path($filepath,{ mode => 0755 }); >+ } >+ if ( -d $filepath ){ >+ chdir $filepath or die "$!"; >+ } >+ } >+ >+ } >+} >+ >+sub run { >+ my ( $self ) = @_; >+ $self->validate_queue; >+ >+ my $task = $self->{task}; >+ >+ #DEBUGGING/FIXME: Remove these lines >+ if ($self->{Verbosity} && $self->{Verbosity} == 9){ >+ use Data::Dumper; >+ warn Dumper($task); >+ } >+ >+ my $params = $task->{params}; >+ >+ my $now = DateTime->now(); #This is in UTC time, which is required by the OAI-PMH protocol. >+ if ( $oai_second_granularity->parse_datetime($params->{from}) ){ >+ $now->set_formatter($oai_second_granularity); >+ } else { >+ $now->set_formatter($oai_day_granularity); >+ } >+ >+ $params->{until} = "$now" if $task->{repeat_interval}; >+ >+ $self->{digester} = Digest::MD5->new(); >+ $self->create_harvester; >+ my $sets = $self->prepare_sets; >+ >+ #Send a OAI-PMH request for each set >+ foreach my $set (@{$sets}){ >+ my $response = $self->send_request({set => $set}); >+ $self->handle_response({ response => $response, set => $set,}); >+ } >+ >+ #FIXME: Do you want to update the task only when the task is finished, or >+ #also after each resumption? >+ #Update the task params in Icarus after the task is finished... >+ #TODO: This really does make it seem like you should be handling the repeat_interval within the child process rather than the parent... >+ if ($task->{repeat_interval}){ >+ $params->{from} = "$now"; >+ $params->{until} = ""; >+ my $json_update = to_json($params); >+ say STDOUT "UPDATE_PARAMS=$json_update"; >+ } >+ >+} >+ >+#FIXME: I wonder if it would be faster to send your own HTTP requests and not use HTTP::OAI... >+sub send_request { >+ my ( $self, $args ) = @_; >+ >+ #NOTE: This is plugin specific as the plugins define when they stop to listen for instructions... >+ #NOTE: Before sending a new request, check if Icarus has already asked us to quit. 
>+ my $instruction = $self->listen_for_instruction(); >+ if ($instruction eq 'quit'){ >+ warn "I was asked to quit!"; >+ return; >+ } >+ >+ my $set = $args->{set}; >+ my $resumptionToken = $args->{resumptionToken}; >+ >+ my $response; >+ my $task_params = $self->{task}->{params}; >+ >+ my $harvester = $self->{harvester}; >+ my $verb = $task_params->{verb}; >+ if ($verb eq 'GetRecord'){ >+ $response = $harvester->GetRecord( >+ metadataPrefix => $task_params->{metadataPrefix}, >+ identifier => $task_params->{identifier}, >+ ); >+ } elsif ($verb eq 'ListRecords'){ >+ $response = $harvester->ListRecords( >+ metadataPrefix => $task_params->{metadataPrefix}, >+ from => $task_params->{from}, >+ until => $task_params->{until}, >+ set => $set, >+ resumptionToken => $resumptionToken, >+ ); >+ } >+ return $response; >+} >+ >+sub create_harvester { >+ my ( $self ) = @_; >+ my $task_params = $self->{task}->{params}; >+ >+ #FIXME: DEBUGGING >+ #use HTTP::OAI::Debug qw(+); >+ >+ #Create HTTP::OAI::Harvester object >+ my $harvester = new HTTP::OAI::Harvester( baseURL => $task_params->{url} ); >+ if ($harvester){ >+ $harvester->timeout(5); #NOTE: the default timeout is 180 >+ #Set HTTP Basic Authentication Credentials >+ my $uri = URI->new($task_params->{url}); >+ my $host = $uri->host; >+ my $port = $uri->port; >+ $harvester->credentials($host.":".$port, $task_params->{realm}, $task_params->{username}, $task_params->{password}); >+ } >+ $self->{harvester} = $harvester; >+} >+ >+sub prepare_sets { >+ my ( $self ) = @_; >+ my $task_params = $self->{task}->{params}; >+ my @sets = (); >+ if ($task_params->{sets}){ >+ @sets = split(/\|/, $task_params->{sets}); >+ } >+ #If no sets are defined, create a null element to force the foreach loop to run once >+ if (!@sets){ >+ push(@sets,undef) >+ } >+ return \@sets; >+} >+ >+sub handle_response { >+ my ( $self, $args ) = @_; >+ my $params = $self->{task}->{params}; >+ my $response = $args->{response}; >+ my $set = $args->{set}; >+ if ($response){ >+ #NOTE: We have options at this point >+ #Option 1: Use $response->toDOM() to handle the XML response as a single document >+ #Option 2: Use $response->next() to handle each record individually. You would need to create a new document using $rec->header->dom() and $rec->metadata->dom() anyway. >+ >+ #NOTE: I wonder which option would be the fastest. For now, we're going with Option 1: >+ my $dom = $response->toDOM; >+ my $root = $dom->documentElement; >+ >+ #FIXME: Provide these as arguments so you're not re-creating them for each response >+ my $xpc = XML::LibXML::XPathContext->new(); >+ $xpc->registerNs('oai','http://www.openarchives.org/OAI/2.0/'); >+ my $xpath = XML::LibXML::XPathExpression->new("(oai:GetRecord|oai:ListRecords)/oai:record"); >+ >+ >+ my @records = $xpc->findnodes($xpath,$root); >+ my $now_pretty = DateTime->now(); >+ >+ $now_pretty->set_formatter($strp); >+ print "Downloaded ".scalar @records." records at $now_pretty\n"; >+ foreach my $record (@records) { >+ >+ #FIXME: This is where you could put a filter to prevent certain records from being saved... >+ >+ #Create a new XML document from the XML fragment >+ my $document = XML::LibXML::Document->new( "1.0", "UTF-8" ); >+ $document->setDocumentElement($record); >+ my $record_string = $document->toString; >+ >+ #NOTE: We have options at this point. 
>+ #Option 1: Write documents to disk, and have a separate importer upload the documents >+ #Option 2: Use AnyEvent::HTTP or POE::Component::Client::HTTP to send to a HTTP API asynchronously >+ #Option 3: Write records to a database, and have a separate importer upload the documents >+ #Option 4: Shared memory, although that seems fragile if nothing else >+ #Option 5: Write the records to a socket/pipe >+ >+ #NOTE: I wonder which option would be the fastest. For now, we're going to go with Option 1: >+ $self->{digester}->add($record_string); >+ my $digest = $self->{digester}->hexdigest; >+ #FIXME: If a record appears more than once during the download signified by $now, you'll >+ #overwrite the former with the latter. While this acts as a sort of heavy-handed de-duplication, >+ #you need to take into account the importer daemon... >+ >+ require Time::HiRes; >+ my $epoch = Time::HiRes::time(); >+ my $now = DateTime->from_epoch(epoch => $epoch); >+ $now->set_formatter($strp); >+ >+ my $filename = "$now-$digest"; >+ #NOTE: Here is where we write the XML out to disk >+ my $state = $document->toFile($filename); >+ } >+ >+ >+ #NOTE: Check if object has method due to bug in HTTP::OAI which causes fatal error on $response->resumptionToken if no real response is fetched from the OAI-PMH server >+ if ($response->can("resumptionToken")){ >+ my $resumption_token = $response->resumptionToken->resumptionToken if $response->resumptionToken && $response->resumptionToken->resumptionToken; >+ if ($resumption_token){ >+ #warn "Resumption Token = $resumption_token"; >+ my $resumed_response = $self->send_request({set => $set, resumptionToken => $resumption_token}); >+ $self->handle_response({ response => $resumed_response, set => $set,}); >+ } >+ } >+ >+ #In theory $response->resume(resumptionToken => resumptionToken) should kick off another response... >+ warn $response->message if $response->is_error; >+ } >+} >+ >+1; >diff --git a/Koha/Icarus/Task/Upload/OAIPMH/Biblio.pm b/Koha/Icarus/Task/Upload/OAIPMH/Biblio.pm >new file mode 100755 >index 0000000..c2bfb80 >--- /dev/null >+++ b/Koha/Icarus/Task/Upload/OAIPMH/Biblio.pm >@@ -0,0 +1,118 @@ >+package Koha::Icarus::Task::Upload::OAIPMH::Biblio; >+ >+use Modern::Perl; >+use parent 'Koha::Icarus::Task::Base'; >+use URI; >+use LWP::UserAgent; >+use HTTP::Status qw(:constants); >+ >+my $ua = LWP::UserAgent->new; >+ >+#FIXME: If we store the cookie jar on disk, we can prevent unnecessary HTTP requests... >+#We would need to make sure that it's stored on a private per-instance basis though... >+$ua->cookie_jar({}); >+ >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ return bless ($args, $class); >+} >+ >+sub run { >+ my ( $self ) = @_; >+ >+ my $task = $self->{task}; >+ >+ if ($self->{Verbosity} && $self->{Verbosity} == 9){ >+ use Data::Dumper; >+ warn Dumper($task); >+ } >+ >+ my $params = $task->{params}; >+ >+ >+ >+ >+ my $queue = $params->{queue}; >+ my $queue_uri = URI->new($queue); >+ >+ if ($queue_uri->scheme eq 'file'){ >+ >+ my $path = $queue_uri->path; >+ opendir(my $dh, $path); >+ my @files = sort readdir($dh); >+ foreach my $file (@files){ >+ #NOTE: This is plugin specific as the plugins define when they stop to listen for instructions... 
>+ my $instruction = $self->listen_for_instruction(); >+ if ($instruction eq 'quit'){ >+ warn "I was asked to quit!"; >+ return; >+ } >+ >+ next if $file =~ /^\.+$/; >+ my $filepath = "$path/$file"; >+ if ( -d $filepath ){ >+ #Do nothing for directories >+ } elsif ( -e $filepath ){ >+ print "File: $file\n"; >+ >+ #Slurp mode >+ local $/; >+ #TODO: Check flock on $filepath first >+ open( my $fh, '<', $filepath ); >+ my $data = <$fh>; >+ >+ #TODO: Improve this section... >+ #Send to Koha API... (we could speed this up using Asynchronous HTTP requests with AnyEvent::HTTP...) >+ my $resp = post_to_api($data,$params); >+ >+ my $status = $resp->code; >+ >+ if ($status == HTTP_UNAUTHORIZED || $status == HTTP_FORBIDDEN) { >+ $resp = remote_authenticate($params); >+ $resp = post_to_api($data,$params) if $resp->is_success; >+ } >+ >+ if ($resp->code == HTTP_OK){ >+ print "Success.\n"; >+ print $resp->decoded_content; >+ print "\n"; >+ unlink $filepath; >+ } >+ } >+ } >+ } >+} >+ >+sub post_to_api { >+ my ($data, $params) = @_; >+ print "Posting to API...\n"; >+ my $resp = $ua->post( $params->{target_uri}, >+ {'nomatch_action' => $params->{nomatch_action}, >+ 'overlay_action' => $params->{overlay_action}, >+ 'match' => $params->{match}, >+ 'import_mode' => $params->{import_mode}, >+ 'framework' => $params->{framework}, >+ 'item_action' => $params->{item_action}, >+ 'filter' => $params->{filter}, >+ 'xml' => $data} >+ ); >+ return $resp; >+} >+ >+sub remote_authenticate { >+ my ($params) = @_; >+ print "Authenticating...\n"; >+ >+ my $auth_uri = $params->{auth_uri}; >+ my $user = $params->{auth_username}; >+ my $password = $params->{auth_password}; >+ my $resp = $ua->post( $auth_uri, { userid => $user, password => $password } ); >+ if ($resp->code == HTTP_OK){ >+ print "Authenticated.\n"; >+ } >+ return $resp >+} >+ >+1; >diff --git a/Koha/OAI/Client/Record.pm b/Koha/OAI/Client/Record.pm >new file mode 100755 >index 0000000..678d311 >--- /dev/null >+++ b/Koha/OAI/Client/Record.pm >@@ -0,0 +1,249 @@ >+package Koha::OAI::Client::Record; >+ >+# Copyright 2016 Prosentient Systems >+# >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it >+# under the terms of the GNU General Public License as published by >+# the Free Software Foundation; either version 3 of the License, or >+# (at your option) any later version. >+# >+# Koha is distributed in the hope that it will be useful, but >+# WITHOUT ANY WARRANTY; without even the implied warranty of >+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the >+# GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License >+# along with Koha; if not, see <http://www.gnu.org/licenses>. >+# >+ >+use Modern::Perl; >+use XML::LibXML; >+use XML::LibXSLT; >+use MARC::Record; >+ >+use C4::Context; >+use C4::Biblio; >+use C4::ImportBatch; >+use C4::Matcher; >+ >+use constant MAX_MATCHES => 99999; #NOTE: This is an arbitrary value. We want to get all matches. >+ >+sub new { >+ my ($class, $args) = @_; >+ $args = {} unless defined $args; >+ >+ if (my $inxml = $args->{xml_string}){ >+ >+ #Parse the XML string into a XML::LibXML object >+ my $doc = XML::LibXML->load_xml(string => $inxml, { no_blanks => 1 }); >+ $args->{doc} = $doc; >+ #NOTE: Don't load blank nodes... 
>+ >+ #Get the root element >+ my $root = $doc->documentElement; >+ >+ #Register namespaces for searching purposes >+ my $xpc = XML::LibXML::XPathContext->new(); >+ $xpc->registerNs('oai','http://www.openarchives.org/OAI/2.0/'); >+ >+ my $xpath_identifier = XML::LibXML::XPathExpression->new("oai:header/oai:identifier"); >+ my $identifier = $xpc->findnodes($xpath_identifier,$root)->shift; >+ #my $identifier_string = $identifier->textContent; >+ $args->{header_identifier} = $identifier->textContent; >+ >+ my $xpath_datestamp = XML::LibXML::XPathExpression->new("oai:header/oai:datestamp"); >+ my $datestamp = $xpc->findnodes($xpath_datestamp,$root)->shift; >+ #my $datestamp_string = $datestamp->textContent; >+ $args->{header_datestamp} = $datestamp->textContent; >+ >+ my $xpath_status = XML::LibXML::XPathExpression->new(q{oai:header/@status}); >+ my $status_node = $xpc->findnodes($xpath_status,$root)->shift; >+ #my $status_string = $status_node ? $status_node->textContent : ""; >+ $args->{header_status} = $status_node ? $status_node->textContent : ""; >+ } >+ >+ return bless ($args, $class); >+} >+ >+sub is_deleted_upstream { >+ my ($self, $args) = @_; >+ if ($self->{header_status}){ >+ if ($self->{header_status} eq "deleted"){ >+ return 1; >+ } >+ } >+ return 0; >+} >+ >+sub filter { >+ my ($self, $args) = @_; >+ my $doc = $self->{doc}; >+ my $filter = $args->{filter}; >+ $self->{filter} = $filter; #FIXME >+ #FIXME: Check that it's an XSLT here... >+ if ( -f $filter ){ >+ #Filter is a valid filepath >+ >+ #FIXME: Ideally, it would be good to use Koha::XSLT_Handler here... (especially for persistent environments...) >+ my $xslt = XML::LibXSLT->new(); >+ my $style_doc = XML::LibXML->load_xml(location => $filter); >+ my $stylesheet = $xslt->parse_stylesheet($style_doc); >+ if ($stylesheet){ >+ my $results = $stylesheet->transform($doc); >+ my $metadata_xml = $stylesheet->output_as_bytes($results); >+ #If the XSLT outputs nothing, then we don't meet the following condition, and we'll return 0 instead. >+ if ($metadata_xml){ >+ $self->{filtered_record} = $metadata_xml; >+ return 1; >+ } >+ } >+ } >+ return 0; >+} >+ >+ >+ >+ >+ >+ >+sub import_record { >+ my ($self, $args) = @_; >+ my $koha_record_numbers = ""; >+ my $errors = []; >+ my $import_status = "error"; >+ my $match_status = "no_match"; >+ >+ my $batch_id = $args->{import_batch_id}; >+ $self->{import_batch_id} = $batch_id; #FIXME >+ my $matcher = $args->{matcher}; >+ my $framework = $args->{framework}; >+ my $import_mode = $args->{import_mode}; >+ >+ my $metadata_xml = $self->{filtered_record}; >+ >+ if ($metadata_xml){ >+ #Convert MARCXML into MARC::Record object >+ my $marcflavour = C4::Context->preference('marcflavour') || 'MARC21'; >+ my $marc_record = eval {MARC::Record::new_from_xml( $metadata_xml, "utf8", $marcflavour)}; >+ if ($@) { >+ warn "Error converting OAI-PMH filtered metadata into MARC::Record object: $@"; >+ #FIXME: Improve error handling >+ } >+ >+ if ($self->is_deleted_upstream){ >+ >+=pod >+ my @matches = $matcher->get_matches($marc_record, MAX_MATCHES); >+ if (@matches){ >+ $match_status = "matched"; >+ } >+ my $delete_error; >+ foreach my $match (@matches){ >+ if (my $record_id = $match->{record_id}){ >+ #FIXME: This is biblio specific... what about authority records? >+ my $error = C4::Biblio::DelBiblio($record_id); >+ if ($error){ >+ $delete_error++; >+ $koha_record_numbers = []; >+ push(@$koha_record_numbers,$record_id); >+ >+ #FIXME: Find a better way of sending the errors in a predictable way... 
>+ push(@$errors,{"record_id" => $record_id, "error" => $error, }); >+ } >+ } >+ >+ } >+ >+ #If there are no delete errors, then the import was ok >+ if ( ! $delete_error){ >+ $import_status = "ok"; >+ } >+ #Deleted records will never actually have an records in them, so always mark them as cleaned so that other imports don't try to pick up the same batch. >+ C4::ImportBatch::SetImportBatchStatus($batch_id, 'cleaned'); >+=cut >+ my $import_record_id = AddBiblioToBatch($batch_id, 0, $marc_record, "utf8", int(rand(99999))); >+ my $number_of_matches = BatchFindDuplicates($batch_id, $matcher, MAX_MATCHES); >+ if ($number_of_matches > 0){ >+ $match_status = "auto_match"; #See `import_records` table for other options... but this should be the right one. >+ } >+ my $results = GetImportRecordMatches($import_record_id); #Only works for biblio... >+ my $delete_error; >+ foreach my $result (@$results){ >+ if (my $record_id = $result->{biblionumber}){ >+ #FIXME: This is biblio specific... what about authority records? >+ my $error = C4::Biblio::DelBiblio($record_id); >+ if ($error){ >+ $delete_error++; >+ >+ $koha_record_numbers = []; >+ push(@$koha_record_numbers,$record_id); >+ >+ #FIXME: Find a better way of sending the errors in a predictable way... >+ push(@$errors,{"record_id" => $record_id, "error_msg" => $error, }); >+ } >+ } >+ } >+ >+ if ($delete_error){ >+ $import_status = "error"; >+ C4::ImportBatch::SetImportBatchStatus($batch_id, 'importing'); >+ } else { >+ $import_status = "ok"; >+ #Ideally, it would be nice to say what records were deleted, but Koha doesn't have that capacity at the moment, so just clean the batch. >+ CleanBatch($batch_id); >+ } >+ >+ >+ >+ >+ } else { >+ #Import the MARCXML record into Koha >+ my $import_record_id = AddBiblioToBatch($batch_id, 0, $marc_record, "utf8", int(rand(99999))); >+ #FIXME: Don't allow item imports do to the nature of OAI-PMH records updating over time... >+ #my @import_items_ids = AddItemsToImportBiblio($batch_id, $import_record_id, $marc_record, 'UPDATE COUNTS'); >+ my $number_of_matches = BatchFindDuplicates($batch_id, $matcher); >+ >+ # XXX we are ignoring the result of this; >+ BatchCommitRecords($batch_id, $framework) if lc($import_mode) eq 'direct'; >+ >+ my $dbh = C4::Context->dbh(); >+ my $sth = $dbh->prepare("SELECT matched_biblionumber FROM import_biblios WHERE import_record_id =?"); >+ $sth->execute($import_record_id); >+ $koha_record_numbers = $sth->fetchrow_arrayref->[0] || ''; >+ $sth = $dbh->prepare("SELECT overlay_status FROM import_records WHERE import_record_id =?"); >+ $sth->execute($import_record_id); >+ >+ $match_status = $sth->fetchrow_arrayref->[0] || 'no_match'; >+ $import_status = "ok"; >+ } >+ } else { >+ #There's no filtered metadata... >+ #Clean the batch, so future imports don't use the same batch. 
>+ CleanBatch($batch_id); >+ } >+ $self->{status} = $import_status; #FIXME >+ #$self->save_to_database(); >+ return ($import_status,$match_status,$koha_record_numbers, $errors); >+} >+ >+sub save_to_database { >+ my ($self,$args) = @_; >+ >+ my $header_identifier = $self->{header_identifier}; >+ my $header_datestamp = $self->{header_datestamp}; >+ my $header_status = $self->{header_status}; >+ my $metadata = $self->{doc}->toString(1); >+ my $import_batch_id = $self->{import_batch_id}; >+ my $filter = $self->{filter}; >+ my $status = $self->{status}; >+ >+ my $dbh = C4::Context->dbh; >+ my $sql = "INSERT INTO import_oai (header_identifier, header_datestamp, header_status, metadata, import_batch_id, filter, status) VALUES (?, ?, ?, ?, ?, ?, ?)"; >+ my $sth = $dbh->prepare($sql); >+ $sth->execute($header_identifier,$header_datestamp,$header_status,$metadata, $import_batch_id, $filter, $status); >+} >+ >+ >+1; >\ No newline at end of file >diff --git a/Koha/SavedTask.pm b/Koha/SavedTask.pm >new file mode 100755 >index 0000000..89cbc51 >--- /dev/null >+++ b/Koha/SavedTask.pm >@@ -0,0 +1,86 @@ >+package Koha::SavedTask; >+ >+# Copyright Prosentient Systems 2016 >+# >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it under the >+# terms of the GNU General Public License as published by the Free Software >+# Foundation; either version 3 of the License, or (at your option) any later >+# version. >+# >+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY >+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR >+# A PARTICULAR PURPOSE. See the GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License along >+# with Koha; if not, write to the Free Software Foundation, Inc., >+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. >+ >+use Modern::Perl; >+ >+use Carp; >+ >+use JSON; >+ >+use base qw(Koha::Object); >+ >+ >+ >+=head1 NAME >+ >+Koha::SavedTask - >+ >+=head1 API >+ >+=head2 Class Methods >+ >+=cut >+ >+ >+ >+=head3 _type >+ >+=cut >+ >+sub _type { >+ return 'SavedTask'; >+} >+ >+sub params_as_perl { >+ my ($self) = @_; >+ my $perl = from_json($self->params); >+ return $perl; >+} >+ >+sub serialize { >+ my ($self,$args) = @_; >+ my $for = $args->{for}; >+ my $type = $args->{type}; >+ if ($for eq 'icarus'){ >+ my $json_params = $self->params; >+ my $perl_params = from_json($json_params); >+ >+ my $icarus_task = { >+ type => $self->task_type, >+ start => $self->start_time, >+ repeat_interval => $self->repeat_interval, >+ params => $perl_params, >+ }; >+ if ($type eq 'perl'){ >+ return $icarus_task; >+ } elsif ($type eq 'json'){ >+ my $json = to_json($icarus_task); >+ return $json; >+ } >+ } >+ return undef; >+} >+ >+=head1 AUTHOR >+ >+David Cook <dcook@prosentient.com.au> >+ >+=cut >+ >+1; >diff --git a/Koha/SavedTasks.pm b/Koha/SavedTasks.pm >new file mode 100755 >index 0000000..a1ae9c5 >--- /dev/null >+++ b/Koha/SavedTasks.pm >@@ -0,0 +1,62 @@ >+package Koha::SavedTasks; >+ >+# Copyright Prosentient Systems 2016 >+# >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it under the >+# terms of the GNU General Public License as published by the Free Software >+# Foundation; either version 3 of the License, or (at your option) any later >+# version. 
>+# >+# Koha is distributed in the hope that it will be useful, but WITHOUT ANY >+# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR >+# A PARTICULAR PURPOSE. See the GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License along >+# with Koha; if not, write to the Free Software Foundation, Inc., >+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. >+ >+use Modern::Perl; >+ >+use Carp; >+ >+use Koha::Database; >+ >+use Koha::SavedTask; >+ >+use base qw(Koha::Objects); >+ >+=head1 NAME >+ >+Koha::SavedTasks - >+ >+=head1 API >+ >+=head2 Class Methods >+ >+=cut >+ >+=head3 _type >+ >+=cut >+ >+sub _type { >+ return 'SavedTask'; >+} >+ >+=head3 object_class >+ >+=cut >+ >+sub object_class { >+ return 'Koha::SavedTask'; >+} >+ >+=head1 AUTHOR >+ >+David Cook <dcook@prosentient.com.au> >+ >+=cut >+ >+1; >diff --git a/Koha/Schema/Result/ImportOai.pm b/Koha/Schema/Result/ImportOai.pm >new file mode 100755 >index 0000000..bfa1a25 >--- /dev/null >+++ b/Koha/Schema/Result/ImportOai.pm >@@ -0,0 +1,152 @@ >+use utf8; >+package Koha::Schema::Result::ImportOai; >+ >+# Created by DBIx::Class::Schema::Loader >+# DO NOT MODIFY THE FIRST PART OF THIS FILE >+ >+=head1 NAME >+ >+Koha::Schema::Result::ImportOai >+ >+=cut >+ >+use strict; >+use warnings; >+ >+use base 'DBIx::Class::Core'; >+ >+=head1 TABLE: C<import_oai> >+ >+=cut >+ >+__PACKAGE__->table("import_oai"); >+ >+=head1 ACCESSORS >+ >+=head2 import_oai_id >+ >+ data_type: 'integer' >+ extra: {unsigned => 1} >+ is_auto_increment: 1 >+ is_nullable: 0 >+ >+=head2 header_identifier >+ >+ data_type: 'varchar' >+ is_nullable: 0 >+ size: 45 >+ >+=head2 header_datestamp >+ >+ data_type: 'datetime' >+ datetime_undef_if_invalid: 1 >+ is_nullable: 0 >+ >+=head2 header_status >+ >+ data_type: 'varchar' >+ is_nullable: 1 >+ size: 45 >+ >+=head2 metadata >+ >+ data_type: 'longtext' >+ is_nullable: 0 >+ >+=head2 last_modified >+ >+ data_type: 'timestamp' >+ datetime_undef_if_invalid: 1 >+ default_value: current_timestamp >+ is_nullable: 0 >+ >+=head2 status >+ >+ data_type: 'varchar' >+ is_nullable: 0 >+ size: 45 >+ >+=head2 import_batch_id >+ >+ data_type: 'integer' >+ is_foreign_key: 1 >+ is_nullable: 0 >+ >+=head2 filter >+ >+ data_type: 'text' >+ is_nullable: 0 >+ >+=cut >+ >+__PACKAGE__->add_columns( >+ "import_oai_id", >+ { >+ data_type => "integer", >+ extra => { unsigned => 1 }, >+ is_auto_increment => 1, >+ is_nullable => 0, >+ }, >+ "header_identifier", >+ { data_type => "varchar", is_nullable => 0, size => 45 }, >+ "header_datestamp", >+ { >+ data_type => "datetime", >+ datetime_undef_if_invalid => 1, >+ is_nullable => 0, >+ }, >+ "header_status", >+ { data_type => "varchar", is_nullable => 1, size => 45 }, >+ "metadata", >+ { data_type => "longtext", is_nullable => 0 }, >+ "last_modified", >+ { >+ data_type => "timestamp", >+ datetime_undef_if_invalid => 1, >+ default_value => \"current_timestamp", >+ is_nullable => 0, >+ }, >+ "status", >+ { data_type => "varchar", is_nullable => 0, size => 45 }, >+ "import_batch_id", >+ { data_type => "integer", is_foreign_key => 1, is_nullable => 0 }, >+ "filter", >+ { data_type => "text", is_nullable => 0 }, >+); >+ >+=head1 PRIMARY KEY >+ >+=over 4 >+ >+=item * L</import_oai_id> >+ >+=back >+ >+=cut >+ >+__PACKAGE__->set_primary_key("import_oai_id"); >+ >+=head1 RELATIONS >+ >+=head2 import_batch >+ >+Type: belongs_to >+ >+Related object: L<Koha::Schema::Result::ImportBatch> >+ >+=cut >+ >+__PACKAGE__->belongs_to( 
>+ "import_batch", >+ "Koha::Schema::Result::ImportBatch", >+ { import_batch_id => "import_batch_id" }, >+ { is_deferrable => 1, on_delete => "RESTRICT", on_update => "RESTRICT" }, >+); >+ >+ >+# Created by DBIx::Class::Schema::Loader v0.07042 @ 2016-04-12 11:02:33 >+# DO NOT MODIFY THIS OR ANYTHING ABOVE! md5sum:QmCetOjXql0gsAi+wZ74Ng >+ >+ >+# You can replace this text with custom code or comments, and it will be preserved on regeneration >+1; >diff --git a/Koha/Schema/Result/SavedTask.pm b/Koha/Schema/Result/SavedTask.pm >new file mode 100755 >index 0000000..948804f >--- /dev/null >+++ b/Koha/Schema/Result/SavedTask.pm >@@ -0,0 +1,98 @@ >+use utf8; >+package Koha::Schema::Result::SavedTask; >+ >+# Created by DBIx::Class::Schema::Loader >+# DO NOT MODIFY THE FIRST PART OF THIS FILE >+ >+=head1 NAME >+ >+Koha::Schema::Result::SavedTask >+ >+=cut >+ >+use strict; >+use warnings; >+ >+use base 'DBIx::Class::Core'; >+ >+=head1 TABLE: C<saved_tasks> >+ >+=cut >+ >+__PACKAGE__->table("saved_tasks"); >+ >+=head1 ACCESSORS >+ >+=head2 task_id >+ >+ data_type: 'integer' >+ extra: {unsigned => 1} >+ is_auto_increment: 1 >+ is_nullable: 0 >+ >+=head2 start_time >+ >+ data_type: 'datetime' >+ datetime_undef_if_invalid: 1 >+ is_nullable: 0 >+ >+=head2 repeat_interval >+ >+ data_type: 'integer' >+ extra: {unsigned => 1} >+ is_nullable: 0 >+ >+=head2 task_type >+ >+ data_type: 'varchar' >+ is_nullable: 0 >+ size: 255 >+ >+=head2 params >+ >+ data_type: 'text' >+ is_nullable: 0 >+ >+=cut >+ >+__PACKAGE__->add_columns( >+ "task_id", >+ { >+ data_type => "integer", >+ extra => { unsigned => 1 }, >+ is_auto_increment => 1, >+ is_nullable => 0, >+ }, >+ "start_time", >+ { >+ data_type => "datetime", >+ datetime_undef_if_invalid => 1, >+ is_nullable => 0, >+ }, >+ "repeat_interval", >+ { data_type => "integer", extra => { unsigned => 1 }, is_nullable => 0 }, >+ "task_type", >+ { data_type => "varchar", is_nullable => 0, size => 255 }, >+ "params", >+ { data_type => "text", is_nullable => 0 }, >+); >+ >+=head1 PRIMARY KEY >+ >+=over 4 >+ >+=item * L</task_id> >+ >+=back >+ >+=cut >+ >+__PACKAGE__->set_primary_key("task_id"); >+ >+ >+# Created by DBIx::Class::Schema::Loader v0.07042 @ 2016-01-27 13:35:22 >+# DO NOT MODIFY THIS OR ANYTHING ABOVE! md5sum:gnoi7I9fiXM3IfDysMTm+A >+ >+ >+# You can replace this text with custom code or comments, and it will be preserved on regeneration >+1; >diff --git a/Makefile.PL b/Makefile.PL >index 24d2ad3..b2cedf0 100644 >--- a/Makefile.PL >+++ b/Makefile.PL >@@ -198,6 +198,10 @@ Directory for Zebra's data files. > > Directory for Zebra's UNIX-domain sockets. > >+=item ICARUS_RUN_DIR >+ >+Directory for Icarus's UNIX-domain socket and pid file. >+ > =item MISC_DIR > > Directory for for miscellaenous scripts, among other >@@ -316,6 +320,7 @@ my $target_map = { > './skel/var/log/koha' => { target => 'LOG_DIR', trimdir => -1 }, > './skel/var/spool/koha' => { target => 'BACKUP_DIR', trimdir => -1 }, > './skel/var/run/koha/zebradb' => { target => 'ZEBRA_RUN_DIR', trimdir => -1 }, >+ './skel/var/run/koha/icarus' => { target => 'ICARUS_RUN_DIR', trimdir => 6 }, > './skel/var/lock/koha/zebradb/authorities' => { target => 'ZEBRA_LOCK_DIR', trimdir => 6 }, > './skel/var/lib/koha/zebradb/authorities/key' => { target => 'ZEBRA_DATA_DIR', trimdir => 6 }, > './skel/var/lib/koha/zebradb/authorities/register' => { target => 'ZEBRA_DATA_DIR', trimdir => 6 }, >@@ -413,6 +418,10 @@ System user account that will own Koha's files. > > System group that will own Koha's files. 
> >+=item ICARUS_MAX_TASKS >+ >+Maximum number of tasks allowed by Icarus. >+ > =back > > =cut >@@ -447,7 +456,8 @@ my %config_defaults = ( > 'USE_MEMCACHED' => 'no', > 'MEMCACHED_SERVERS' => '127.0.0.1:11211', > 'MEMCACHED_NAMESPACE' => 'KOHA', >- 'FONT_DIR' => '/usr/share/fonts/truetype/ttf-dejavu' >+ 'FONT_DIR' => '/usr/share/fonts/truetype/ttf-dejavu', >+ 'ICARUS_MAX_TASKS' => '30', > ); > > # set some default configuration options based on OS >@@ -1091,6 +1101,10 @@ Memcached namespace?); > Path to DejaVu fonts?); > $config{'FONT_DIR'} = _get_value('FONT_DIR', $msg, $defaults->{'FONT_DIR'}, $valid_values, $install_log_values); > >+ $msg = q( >+Maximum number of tasks allowed by Icarus?); >+ $config{'ICARUS_MAX_TASKS'} = _get_value('ICARUS_MAX_TASKS', $msg, $defaults->{'ICARUS_MAX_TASKS'}, $valid_values, $install_log_values); >+ > > $msg = q( > Would you like to run the database-dependent test suite?); >@@ -1239,6 +1253,7 @@ sub get_target_directories { > $dirmap{'PLUGINS_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'lib', 'koha', 'plugins'); > $dirmap{'ZEBRA_DATA_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'lib', 'zebradb'); > $dirmap{'ZEBRA_RUN_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'run', 'zebradb'); >+ $dirmap{'ICARUS_RUN_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'run', 'icarus'); > } elsif ($mode eq 'dev') { > my $curdir = File::Spec->rel2abs(File::Spec->curdir()); > $dirmap{'INTRANET_CGI_DIR'} = File::Spec->catdir($curdir); >@@ -1272,6 +1287,7 @@ sub get_target_directories { > $dirmap{'PLUGINS_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'lib', 'plugins'); > $dirmap{'ZEBRA_DATA_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'lib', 'zebradb'); > $dirmap{'ZEBRA_RUN_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'run', 'zebradb'); >+ $dirmap{'ICARUS_RUN_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'run', 'icarus'); > } else { > # mode is standard, i.e., 'fhs' > $dirmap{'INTRANET_CGI_DIR'} = File::Spec->catdir(@basedir, $package, 'intranet', 'cgi-bin'); >@@ -1295,6 +1311,7 @@ sub get_target_directories { > $dirmap{'PLUGINS_DIR'} = File::Spec->catdir(File::Spec->rootdir(), 'var', 'lib', $package, 'plugins'); > $dirmap{'ZEBRA_DATA_DIR'} = File::Spec->catdir(File::Spec->rootdir(), 'var', 'lib', $package, 'zebradb'); > $dirmap{'ZEBRA_RUN_DIR'} = File::Spec->catdir(File::Spec->rootdir(), 'var', 'run', $package, 'zebradb'); >+ $dirmap{'ICARUS_RUN_DIR'} = File::Spec->catdir(File::Spec->rootdir(), 'var', 'run', $package, 'icarus'); > } > > _get_env_overrides(\%dirmap); >diff --git a/admin/saved_tasks.pl b/admin/saved_tasks.pl >new file mode 100755 >index 0000000..940adcc >--- /dev/null >+++ b/admin/saved_tasks.pl >@@ -0,0 +1,347 @@ >+#!/usr/bin/perl >+ >+# Copyright Prosentient Systems 2016 >+# >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it >+# under the terms of the GNU General Public License as published by >+# the Free Software Foundation; either version 3 of the License, or >+# (at your option) any later version. >+# >+# Koha is distributed in the hope that it will be useful, but >+# WITHOUT ANY WARRANTY; without even the implied warranty of >+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the >+# GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License >+# along with Koha; if not, see <http://www.gnu.org/licenses>. 
>+ >+=head1 NAME >+ >+saved_tasks.pl >+ >+=head1 DESCRIPTION >+ >+Admin page to manage saved tasks >+ >+=cut >+ >+use Modern::Perl; >+use CGI qw ( -utf8 ); >+use C4::Auth; >+use C4::Output; >+use C4::Context; >+ >+use Koha::SavedTasks; >+use Koha::Icarus; >+use Module::Load::Conditional qw/can_load check_install/; >+use JSON; >+ >+my $input = new CGI; >+my ($template, $loggedinuser, $cookie, $flags) = get_template_and_user( { >+ template_name => 'admin/saved_tasks.tt', >+ query => $input, >+ type => 'intranet', >+ authnotrequired => 0, >+ flagsrequired => { 'parameters' => 'parameters_remaining_permissions' }, >+} ); >+ >+my $filename = "saved_tasks.pl"; >+$template->param( >+ filename => $filename, >+); >+ >+my $context = C4::Context->new(); >+ >+ >+my $task_server = $input->param("task_server") // "icarus"; >+ >+ >+my $socket_uri = $context->{"icarus"}->{"socket"}; >+ >+my @available_plugins = (); >+my $task_plugins = $context->{"icarus"}->{"task_plugin"}; >+if ($task_plugins && ref $task_plugins eq 'ARRAY'){ >+ #FIXME: This should probably be a module method... validation that a plugin is installed... >+ foreach my $task_plugin (@$task_plugins){ >+ #Check that plugin module is installed >+ if ( check_install( module => $task_plugin ) ){ >+ push(@available_plugins,$task_plugin); >+ } >+ } >+} >+ >+$template->param( >+ available_plugins => \@available_plugins, >+); >+ >+#Server action and task id >+my $server_action = $input->param("server_action"); >+my $server_task_id = $input->param('server_task_id'); >+ >+#Saved task op >+my $op = $input->param('op'); >+my $step = $input->param('step'); >+ >+#Saved task id >+my $saved_task_id = $input->param('saved_task_id'); >+ >+ >+#Create Koha-Icarus interface object >+my $icarus = Koha::Icarus->new({ socket_uri => $socket_uri }); >+my $daemon_status = ""; >+ >+#Connect to Icarus >+if ( $icarus->connect() ){ >+ $daemon_status = "online"; >+ if ($server_action){ >+ if ($server_action eq 'shutdown'){ >+ my $response = $icarus->shutdown; >+ if ( $response && (my $action = $response->{action}) ){ >+ $daemon_status = $action; >+ } >+ } elsif ($server_action eq 'start' && $server_task_id){ >+ my $response = $icarus->start_task({ task_id => $server_task_id }); >+ $template->param( >+ task_response => $response, >+ ); >+ } elsif ($server_action eq 'remove' && $server_task_id){ >+ my $response = $icarus->remove_task({ task_id => $server_task_id }); >+ $template->param( >+ task_response => $response, >+ ); >+ } >+ } >+} else { >+ $daemon_status = $!; >+} >+$template->param( >+ daemon_status => $daemon_status, >+); >+ >+ >+ >+my $params = $input->param("params"); >+ >+#NOTE: Parse the parameters manually, so that you can "name[]" style of parameter, which we use in the special plugin templates... 
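>+#For example, a form field named "params[url]" is captured below as $saved_params->{url},
>+#and the whole hash is then re-serialized to JSON before being stored as the saved task's params.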
>+my $saved_params = {}; >+#Fetch the names of all the parameters passed to your script >+my @parameter_names = $input->param; >+#Iterate through these parameter names and look for "params[]" >+foreach my $parameter_name (@parameter_names){ >+ if ($parameter_name =~ /^params\[(.*)\]$/){ >+ #Capture the hash key >+ my $key = $1; >+ #Fetch the actual individual value >+ my $parameter_value = $input->param($parameter_name); >+ if ($parameter_value){ >+ $saved_params->{$key} = $parameter_value; >+ } >+ } >+} >+if (%$saved_params){ >+ my $json = to_json($saved_params, { pretty => 1, }); >+ if ($json){ >+ $params = $json; >+ } >+} >+ >+my $start_time = $input->param("start_time"); >+my $repeat_interval = $input->param("repeat_interval"); >+my $task_type = $input->param("task_type"); >+if ($task_type){ >+ my $task_template = $task_type; >+ #Create the template name by stripping the colons out of the task type text >+ $task_template =~ s/://g; >+ $template->param( >+ task_template => "tasks/$task_template.inc", >+ ); >+} >+ >+ >+if ($op){ >+ if ($op eq 'new'){ >+ >+ } elsif ($op eq 'create'){ >+ >+ #Validate the $task here >+ if ($step){ >+ if ($step eq "one"){ >+ >+ $op = "new"; >+ $template->param( >+ step => "two", >+ task_type => $task_type, >+ ); >+ } elsif ($step eq "two"){ >+ my $new_task = Koha::SavedTask->new({ >+ start_time => $start_time, >+ repeat_interval => $repeat_interval, >+ task_type => $task_type, >+ params => $params, >+ }); >+ >+ #Serialize the data as an Icarus task >+ my $icarus_task = $new_task->serialize({ for => "icarus", type => "perl", }); >+ >+ my $valid = 1; >+ #Load the plugin module, and create an object instance in order to validate user-entered data >+ if ( can_load( modules => { $task_type => undef, }, ) ){ >+ my $plugin = $task_type->new({ task => $icarus_task, }); >+ if ($plugin->can("validate")){ >+ my $errors = $plugin->validate({ >+ "tests" => "all", >+ }); >+ if (%$errors){ >+ $template->param( >+ errors => $errors, >+ ); >+ } >+ if ($plugin->{invalid_data} > 0){ >+ $valid = 0; >+ } >+ } >+ } >+ >+ if ($valid){ >+ $new_task->store(); >+ $op = "list"; >+ } else { >+ $op = "new"; >+ #Create a Perl data structure from the JSON >+ my $editable_params = from_json($params); >+ $template->param( >+ step => "two", >+ task_type => $task_type, >+ saved_task => $new_task, >+ params => $editable_params, >+ ); >+ } >+ } >+ } >+ >+ } elsif ($op eq 'edit'){ >+ my $task = Koha::SavedTasks->find($saved_task_id); >+ if ($task){ >+ #Check if the task's saved task type is actually available... >+ #FIXME: This should be a Koha::Icarus method... >+ my $task_type_is_valid = grep { $task->task_type eq $_ } @available_plugins; >+ $template->param( >+ task_type_is_valid => $task_type_is_valid, >+ saved_task => $task, >+ ); >+ } >+ } elsif ($op eq 'update'){ >+ if ($step){ >+ my $task = Koha::SavedTasks->find($saved_task_id); >+ if ($task){ >+ if ($step eq "one"){ >+ #We've completed step one, which is choosing the task type, >+ #so now we're going to populate the form for editing the rest of the values >+ $op = "edit"; >+ #This is the JSON string that we've saved in the database >+ my $current_params_string = $task->params; >+ my $editable_params = from_json($current_params_string); >+ >+ $template->param( >+ step => "two", >+ task_type => $task_type, >+ saved_task => $task, >+ params => $editable_params, >+ >+ ); >+ } elsif ($step eq "two"){ >+ #We've completed step two, so we're storing the data now... 
>+ $task->set({ >+ start_time => $start_time, >+ repeat_interval => $repeat_interval, >+ task_type => $task_type, >+ params => $params, >+ }); >+ $task->store; >+ #FIXME: Validate the $task here... >+ if (my $valid = 1){ >+ $op = "list"; >+ } else { >+ $op = "edit"; >+ $template->param( >+ step => "two", >+ task_type => $task_type, >+ saved_task => $task, >+ ); >+ } >+ } >+ } >+ } >+ } elsif ($op eq 'send'){ >+ my $sent_response; >+ if ($icarus->connected){ >+ if ($saved_task_id){ >+ #Look up task >+ my $task = Koha::SavedTasks->find($saved_task_id); >+ if ($task){ >+ #Create a task for Icarus, and send it to Icarus >+ my $icarus_task = $task->serialize({ for => "icarus", type => "perl", }); >+ if ($icarus_task){ >+ $icarus->add_task({ task => $icarus_task, }); >+ $op = "list"; >+ } >+ } >+ } >+ } else { >+ $sent_response = "icarus_offline"; >+ $template->param( >+ sent_response => $sent_response, >+ ); >+ $op = "list"; >+ } >+ } elsif ($op eq 'delete'){ >+ my $saved_response = "delete_failure"; >+ if ($saved_task_id){ >+ #Look up task >+ my $task = Koha::SavedTasks->find($saved_task_id); >+ if ($task){ >+ if (my $something = $task->delete){ >+ $saved_response = "delete_success"; >+ } >+ } >+ } >+ $template->param( >+ saved_response => $saved_response, >+ ); >+ $op = "list"; >+ } else { >+ #Don't recognize $op, so fallback to list >+ $op = "list"; >+ } >+} else { >+ #No $op, so fallback to list >+ $op = "list"; >+} >+ >+if ($op eq 'list'){ >+ #Get active tasks from Icarus >+ if ($icarus->connected){ >+ my $tasks = $icarus->list_tasks(); >+ if ($tasks && @$tasks){ >+ #Sort tasks that come from Icarus, since it returns an unsorted list of hashrefs >+ my @sorted_tasks = sort { $a->{task_id} <=> $b->{task_id} } @$tasks; >+ $template->param( >+ tasks => \@sorted_tasks, >+ ); >+ } >+ } >+ >+ #Get saved tasks from Koha >+ my @saved_tasks = Koha::SavedTasks->as_list(); >+ $template->param( >+ saved_tasks => \@saved_tasks, >+ ); >+} >+ >+$template->param( >+ op => $op, >+); >+ >+output_html_with_http_headers $input, $cookie, $template->output; >diff --git a/docs/Icarus/README b/docs/Icarus/README >new file mode 100755 >index 0000000..8694edc >--- /dev/null >+++ b/docs/Icarus/README >@@ -0,0 +1,64 @@ >+TODO: >+- Add paging to tools/manage-oai-imports >+ - Do a simple version... most recent first and "more/less" buttons... >+ >+- Data validation: >+ "Koha::Icarus::Task::Upload::OAIPMH::Biblio": >+ - Validate HTTP URLs and filepaths... >+ - MAKE IT SO YOU HAVE TO USE A RECORD MATCHING RULE! To at the very least strip the OAI wrapper... >+ - Add PLUGIN->validate("parameter_names") >+ - Add PLUGIN->validate("parameter_values") >+ - For the downloader, this would validate HTTP && OAI-PMH parameters... >+ >+- tools/manage-oai-import >+ - Improve error resolution >+ - For deleted records, just try to re-run the import >+ - For normal records, perhaps the ability to change filters and try again, or delete the OAI record all together? >+ - NOTE: there's a problem with re-adding records to the batch... will have to write some SQL and maybe even just a import_record("retry") or a retry_import(). >+ >+- Update kohastructure.sql in accordance with installer/data/mysql/atomicupdate/bug_10662-Build_import_oai_table.sql >+ >+ >+ >+ >+ >+ >+ >+ >+ >+ >+- admin/saved_tasks.pl >+ - Add a clone button to ease task creation >+- Make the "Task type" prettier (and translateable) on saved_tasks.pl. 
>+- Provide more options for the Icarus dashboard (start, restart, etc.)
>+- Add the ability to "edit" and "pause" active Icarus tasks
>+ - A pause function would make debugging much easier.
>+
>+- Add help pages for the web GUI
>+- Add documentation to all code...
>+- Add unit tests
>+
>+- Add a "file_name" to make the batch easier to identify, which makes it more useful
>+- Add default OAI record matching rule
>+ - I thought about adding a SQL atomic update 'bug_10662-Add_oai_record_matching_rule.sql', but adding matching rules seems complex. This needs to be done in Perl.
>+ - Should the matching rule include other fields like 022, 020, 245 rather than just 001 and 024a?
>+- Add entry to the cleanup_database.pl cronjob
>+ - You could remove all import_oai rows older than a certain age?
>+
>+- Make "Koha::Icarus::Task::Upload::OAIPMH::Biblio" use asynchronous HTTP requests to speed up the import
>+- Add support for authority records and possibly holdings records (add record type to svc/import_oai parameters)
>+- Instead of using file:///home/koha/koha-dev/var/spool/oaipmh, use something like file:///tmp/koha-instance/koha-dev/oaipmh
>+ - How is the user going to specify file:///tmp/koha-instance/koha-dev/oaipmh? Or do you put this in koha-conf.xml and then make a user-defined relative path?
>+
>+- WEB UI:
>+ - Add `name` to saved_tasks?
>+- Move "Saved tasks" from Administration to Tools?
>+ - Look at existing bugs for schedulers:
>+ - https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=14712
>+ - https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=1993
>+- Handle datestamp granularity better for OAI-PMH download tasks?
>+- Change `import_oai` database table?
>+ - import_oai's "metadata" should actually be "oai_record"... so it's not so confusing... it's NOT the metadata element... but rather the whole OAI record.
>+- Resolve all TODO/FIXME comments in the code
>+- Clean up the code
>+
>diff --git a/etc/koha-conf.xml b/etc/koha-conf.xml
>index c361846..3b6705f 100644
>--- a/etc/koha-conf.xml
>+++ b/etc/koha-conf.xml
>@@ -137,4 +137,12 @@ __PAZPAR2_TOGGLE_XML_POST__
> </ttf>
>
> </config>
>+<icarus>
>+ <socket>unix:__ICARUS_RUN_DIR__/icarus.sock</socket>
>+ <pidfile>__ICARUS_RUN_DIR__/icarus.pid</pidfile>
>+ <log>__LOG_DIR__/icarus.log</log>
>+ <task_plugin>Koha::Icarus::Task::Download::OAIPMH::Biblio</task_plugin>
>+ <task_plugin>Koha::Icarus::Task::Upload::OAIPMH::Biblio</task_plugin>
>+ <max_tasks>__ICARUS_MAX_TASKS__</max_tasks>
>+</icarus>
> </yazgfs>
>diff --git a/installer/data/mysql/atomicupdate/bug_10662-Build_import_oai_table.sql b/installer/data/mysql/atomicupdate/bug_10662-Build_import_oai_table.sql
>new file mode 100644
>index 0000000..3f7cd87
>--- /dev/null
>+++ b/installer/data/mysql/atomicupdate/bug_10662-Build_import_oai_table.sql
>@@ -0,0 +1,25 @@
>+DROP TABLE IF EXISTS import_oai;
>+CREATE TABLE import_oai (
>+ import_oai_id int(10) unsigned NOT NULL AUTO_INCREMENT,
>+ header_identifier varchar(45) CHARACTER SET utf8 NOT NULL,
>+ header_datestamp datetime NOT NULL,
>+ header_status varchar(45) CHARACTER SET utf8 DEFAULT NULL,
>+ metadata longtext CHARACTER SET utf8 NOT NULL,
>+ last_modified timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
>+ status varchar(45) CHARACTER SET utf8 NOT NULL,
>+ import_batch_id int(11) NOT NULL,
>+ filter text COLLATE utf8_unicode_ci NOT NULL,
>+ PRIMARY KEY (import_oai_id),
>+ KEY FK_import_oai_1 (import_batch_id),
>+ CONSTRAINT FK_import_oai_1 FOREIGN KEY (import_batch_id) REFERENCES import_batches (import_batch_id)
>+) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
>+
>+DROP TABLE IF EXISTS saved_tasks;
>+CREATE TABLE saved_tasks (
>+ task_id int(10) unsigned NOT NULL AUTO_INCREMENT,
>+ start_time datetime NOT NULL,
>+ repeat_interval int(10) unsigned NOT NULL,
>+ task_type varchar(255) CHARACTER SET utf8 NOT NULL,
>+ params text CHARACTER SET utf8 NOT NULL,
>+ PRIMARY KEY (task_id) USING BTREE
>+) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
>diff --git a/installer/data/mysql/kohastructure.sql b/installer/data/mysql/kohastructure.sql
>index e6e3142..4d64abc 100644
>--- a/installer/data/mysql/kohastructure.sql
>+++ b/installer/data/mysql/kohastructure.sql
>@@ -3723,6 +3723,37 @@ CREATE TABLE IF NOT EXISTS edifact_ean (
> CONSTRAINT efk_branchcode FOREIGN KEY ( branchcode ) REFERENCES branches ( branchcode )
> ) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
>
>+--
>+-- Table structure for table 'import_oai'
>+--
>+
>+DROP TABLE IF EXISTS import_oai;
>+CREATE TABLE import_oai (
>+ import_oai_id int(10) unsigned NOT NULL AUTO_INCREMENT,
>+ header_identifier varchar(45) CHARACTER SET utf8 NOT NULL,
>+ header_datestamp datetime NOT NULL,
>+ header_status varchar(45) CHARACTER SET utf8 DEFAULT NULL,
>+ metadata longtext CHARACTER SET utf8 NOT NULL,
>+ last_modified timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
>+ status varchar(45) CHARACTER SET utf8 NOT NULL,
>+ PRIMARY KEY (import_oai_id)
>+) ENGINE=InnoDB AUTO_INCREMENT=297 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
>+
>+--
>+-- Table structure for table 'saved_tasks'
>+--
>+
>+DROP TABLE IF EXISTS saved_tasks;
>+CREATE TABLE saved_tasks (
>+ task_id int(10) unsigned NOT NULL AUTO_INCREMENT,
>+ start_time datetime NOT NULL,
>+ repeat_interval int(10) unsigned NOT NULL,
>+
task_type varchar(255) CHARACTER SET utf8 NOT NULL,
>+ params text CHARACTER SET utf8 NOT NULL,
>+ PRIMARY KEY (task_id) USING BTREE
>+) ENGINE=InnoDB AUTO_INCREMENT=13 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
>+
>+
> /*!40103 SET TIME_ZONE=@OLD_TIME_ZONE */;
> /*!40101 SET SQL_MODE=@OLD_SQL_MODE */;
> /*!40014 SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS */;
>diff --git a/koha-tmpl/intranet-tmpl/prog/en/includes/admin-menu.inc b/koha-tmpl/intranet-tmpl/prog/en/includes/admin-menu.inc
>index 8a21712..88e39e7 100644
>--- a/koha-tmpl/intranet-tmpl/prog/en/includes/admin-menu.inc
>+++ b/koha-tmpl/intranet-tmpl/prog/en/includes/admin-menu.inc
>@@ -78,6 +78,7 @@
> [% IF Koha.Preference('SMSSendDriver') == 'Email' %]
> <li><a href="/cgi-bin/koha/admin/sms_providers.pl">SMS cellular providers</a></li>
> [% END %]
>+ <li><a href="/cgi-bin/koha/admin/saved_tasks.pl">Saved tasks</a></li>
> </ul>
> </div>
> </div>
>diff --git a/koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskDownloadOAIPMHBiblio.inc b/koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskDownloadOAIPMHBiblio.inc
>new file mode 100755
>index 0000000..a66af5c
>--- /dev/null
>+++ b/koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskDownloadOAIPMHBiblio.inc
>@@ -0,0 +1,87 @@
>+[%# USE CGI %]
>+[%# server_name = CGI.server_name; server_port = CGI.server_port; server = server_name _ ":" _ server_port; %]
>+
>+<fieldset class="rows">
>+ <legend>HTTP parameters:</legend>
>+ <ol>
>+ <li>
>+ <label for="url">URL: </label>
>+ [% IF ( params.url ) %]
>+ <input type="text" id="url" name="params[url]" value="[% params.url %]" size="30" />
>+ [% ELSE %]
>+ <input type="text" id="url" name="params[url]" value="http://" size="30" />
>+ [% END %]
>+ [% IF (errors.url.no_path) %]<span class="error">[The URL must have a path after "http://" like "koha-community.org/cgi-bin/koha/oai.pl".]</span>[% END %]
>+ [% IF (errors.url.not_http) %]<span class="error">[The URL must begin with a scheme of "http://" like "http://koha-community.org/cgi-bin/koha/oai.pl".]</span>[% END %]
>+ [% IF (errors.url.not_a_url) %]<span class="error">[The value of this field must be a URL like "http://koha-community.org/cgi-bin/koha/oai.pl".]</span>[% END %]
>+
>+ </li>
>+ </ol>
>+ <span class="help">The following parameters are not required by all OAI-PMH repositories, so they may be optional for this task.</span>
>+ <ol>
>+ <li>
>+ <label for="username">Username: </label>
>+ <input type="text" id="username" name="params[username]" value="[% params.username %]" size="30" />
>+ </li>
>+ <li>
>+ <label for="password">Password: </label>
>+ <input type="text" id="password" name="params[password]" value="[% params.password %]" size="30" />
>+ </li>
>+ <li>
>+ <label for="realm">Realm: </label>
>+ <input type="text" id="realm" name="params[realm]" value="[% params.realm %]" size="30" />
>+ </li>
>+ </ol>
>+</fieldset>
>+<fieldset class="rows">
>+ <legend>OAI-PMH parameters:</legend>
>+ <ol>
>+ <li>
>+ <label for="verb">Verb: </label>
>+ <select id="verb" name="params[verb]">
>+ [% FOREACH verb IN [ 'GetRecord', 'ListRecords' ] %]
>+ [% IF ( params.verb ) && ( verb == params.verb ) %]
>+ <option selected="selected" value="[% verb %]">[% verb %]</option>
>+ [% ELSE %]
>+ <option value="[% verb %]">[% verb %]</option>
>+ [% END %]
>+ [% END %]
>+ </select>
>+ </li>
>+ <li>
>+ <label for="identifier">Identifier: </label>
>+ <input type="text" id="identifier" name="params[identifier]" value="[% params.identifier %]" size="30" />
>+ <span class="help">This
identifier will only be used with the GetRecord verb.</span> >+ </li> >+ <li> >+ <label for="sets">Sets: </label> >+ <input type="text" id="sets" name="params[sets]" value="[% params.sets %]" size="30" /><span class="help">You may specify several sets by separating the sets with a pipe (e.g. set1|set2 )</span> >+ </li> >+ <li> >+ <label for="metadataPrefix">Metadata Prefix: </label> >+ <input type="text" id="metadataPrefix" name="params[metadataPrefix]" value="[% params.metadataPrefix %]" size="30" /> >+ </li> >+ <li> >+ <label for="opt_from">From: </label> >+ <input type="text" class="datetime_utc" id="opt_from" name="params[from]" value="[% params.from %]" size="30" /><span class="help">This value will be treated as UTC time. Note that some repositories only support YYYY-MM-DD datestamps.</span> >+ </li> >+ <li> >+ <label for="opt_until">Until: </label> >+ <input type="text" class="datetime_utc" id="opt_until" name="params[until]" value="[% params.until %]" size="30" /><span class="help">This value will be treated as UTC time. Note that some repositories only support YYYY-MM-DD datestamps.</span> >+ </li> >+ </ol> >+</fieldset> >+<fieldset class="rows"> >+ <legend>Download parameters:</legend> >+ <ol> >+ <li> >+ <label for="queue">Queue: </label> >+ [% IF ( params.queue ) %] >+ <input type="text" id="queue" name="params[queue]" value="[% params.queue %]" size="30" /> >+ [% ELSE %] >+ <input type="text" id="queue" name="params[queue]" value="file://" size="30" /> >+ [% END %] >+ <span class="help">This is a filepath on your system like file:///var/spool/koha/libraryname/oaipmh</span> >+ </li> >+ </ol> >+</fieldset> >diff --git a/koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskUploadOAIPMHBiblio.inc b/koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskUploadOAIPMHBiblio.inc >new file mode 100755 >index 0000000..415e1c1 >--- /dev/null >+++ b/koha-tmpl/intranet-tmpl/prog/en/includes/tasks/KohaIcarusTaskUploadOAIPMHBiblio.inc >@@ -0,0 +1,143 @@ >+[%# Use CGI plugin to create a default target URI %] >+[%# TODO: Test if this works with Plack... 
%]
>+[% USE CGI %]
>+[% server = CGI.virtual_host %]
>+[% IF ( server_port = CGI.virtual_port ) %]
>+ [% IF ( server_port != '80' ) && ( server_port != '443' ) %]
>+ [% server = server _ ':' _ server_port %]
>+ [% END %]
>+[% END %]
>+[% default_auth_uri = 'http://' _ server _ '/cgi-bin/koha/svc/authentication' %]
>+[% default_target_uri = 'http://' _ server _ '/cgi-bin/koha/svc/import_oai' %]
>+<fieldset class="rows">
>+ <legend>Import source parameters:</legend>
>+ <ol>
>+ <li>
>+ <label for="queue">Queue: </label>
>+ [% IF ( params.queue ) %]
>+ <input type="text" id="queue" name="params[queue]" value="[% params.queue %]" size="30" />
>+ [% ELSE %]
>+ <input type="text" id="queue" name="params[queue]" value="file://" size="30" />
>+ [% END %]
>+ <span class="help">This is a filepath on your system like file:///var/spool/koha/libraryname/oaipmh</span>
>+ </li>
>+ </ol>
>+</fieldset>
>+<fieldset class="rows">
>+ <legend>API authentication parameters:</legend>
>+ <ol>
>+ <li>
>+ <label for="auth_uri">URL: </label>
>+ [% IF ( params.auth_uri ) %]
>+ <input type="text" id="auth_uri" name="params[auth_uri]" value="[% params.auth_uri %]" size="30" />
>+ [% ELSE %]
>+ <input type="text" id="auth_uri" name="params[auth_uri]" value="[% default_auth_uri %]" size="30" />
>+ [% END %]
>+ [% IF (errors.auth_uri.no_path) %]<span class="error">[The URL must have a path after "http://" like "koha-community.org/cgi-bin/koha/svc/authentication".]</span>[% END %]
>+ [% IF (errors.auth_uri.not_http) %]<span class="error">[The URL must begin with a scheme of "http://" like "http://koha-community.org/cgi-bin/koha/svc/authentication".]</span>[% END %]
>+ [% IF (errors.auth_uri.not_a_url) %]<span class="error">[The value of this field must be a URL like "http://koha-community.org/cgi-bin/koha/svc/authentication".]</span>[% END %]
>+ <span class="help">This is a Koha authentication URL. The default value is built from this server's address.</span>
>+ </li>
>+ <li>
>+ <label for="auth_username">Username: </label>
>+ <input type="text" id="auth_username" name="params[auth_username]" value="[% params.auth_username %]" size="30" />
>+ <span class="help">This user must have permission to edit the catalogue.</span>
>+ </li>
>+ <li>
>+ <label for="auth_password">Password: </label>
>+ <input type="text" id="auth_password" name="params[auth_password]" value="[% params.auth_password %]" size="30" />
>+ </li>
>+ </ol>
>+</fieldset>
>+<fieldset class="rows">
>+ <legend>Import target parameters:</legend>
>+ <ol>
>+ <li>
>+ <label for="target_uri">URL: </label>
>+ [% IF ( params.target_uri ) %]
>+ <input type="text" id="target_uri" name="params[target_uri]" value="[% params.target_uri %]" size="30" />
>+ [% ELSE %]
>+ <input type="text" id="target_uri" name="params[target_uri]" value="[% default_target_uri %]" size="30" />
>+ [% END %]
>+ [% IF (errors.target_uri.no_path) %]<span class="error">[The URL must have a path after "http://" like "koha-community.org/cgi-bin/koha/svc/import_oai".]</span>[% END %]
>+ [% IF (errors.target_uri.not_http) %]<span class="error">[The URL must begin with a scheme of "http://" like "http://koha-community.org/cgi-bin/koha/svc/import_oai".]</span>[% END %]
>+ [% IF (errors.target_uri.not_a_url) %]<span class="error">[The value of this field must be a URL like "http://koha-community.org/cgi-bin/koha/svc/import_oai".]</span>[% END %]
>+ </li>
>+
>+ <li>
>+ <label for="match">Record matching rule code</label>
>+ <input type="text" id="match" name="params[match]" value="[% params.match %]" size="30" />
>+ <span class="help">This code must exist in "Record matching rules" in Administration for record matching to work. (Example code: OAI)</span>
>+ </li>
>+ <li>
>+ [%# TODO: Ideally, I'd like to use 'tools-overlay-action.inc' but the logic doesn't work here. Perhaps it would be better as a TT plugin. %]
>+ <label for="overlay_action">Action if matching record found</label>
>+ <select name="params[overlay_action]" id="overlay_action">
>+ [% IF ( params.overlay_action == "replace" ) %]
>+ <option value="replace" selected="selected">
>+ [% ELSE %]
>+ <option value="replace">
>+ [% END %]
>+ Replace existing record with incoming record</option>
>+ [% IF ( params.overlay_action == "create_new" ) %]
>+ <option value="create_new" selected="selected">
>+ [% ELSE %]
>+ <option value="create_new">
>+ [% END %]
>+ Add incoming record</option>
>+ [% IF ( params.overlay_action == "ignore" ) %]
>+ <option value="ignore" selected="selected">
>+ [% ELSE %]
>+ <option value="ignore">
>+ [% END %]
>+ Ignore incoming record</option>
>+ </select>
>+ </li>
>+ <li>
>+ [%# TODO: Ideally, I'd like to use 'tools-nomatch-action.inc' but the logic doesn't work here. Perhaps it would be better as a TT plugin. %]
>+ <label for="nomatch_action">Action if no match is found</label>
>+ <select name="params[nomatch_action]" id="nomatch_action">
>+ [% IF ( params.nomatch_action == "create_new" ) %]
>+ <option value="create_new" selected="selected">
>+ [% ELSE %]
>+ <option value="create_new">
>+ [% END %]
>+ Add incoming record</option>
>+ [% IF ( params.nomatch_action == "ignore" ) %]
>+ <option value="ignore" selected="selected">
>+ [% ELSE %]
>+ <option value="ignore">
>+ [% END %]
>+ Ignore incoming record</option>
>+ </select>
>+ </li>
>+ <li>
>+ <label for="item_action">Item action</label>
>+ [%# TODO: Will you ever have a different mode than ignore?
%] >+ <input type="text" id="item_action" value="ignore" size="30" disabled="disabled"/> >+ <input type="hidden" name="params[item_action]" value="ignore" /> >+ </li> >+ <li> >+ <label for="import_mode">Import mode: </label> >+ [%# TODO: Will you ever have a different mode than direct? %] >+ <input type="text" id="import_mode" value="direct" size="30" disabled="disabled"/> >+ <input type="hidden" name="params[import_mode]" value="direct" /> >+ </li> >+ <li> >+ <label>Framework</label> >+ </li> >+ <li> >+ <label for="filter">Filter</label> >+ [% IF ( params.filter ) %] >+ <input type="text" id="filter" name="params[filter]" value="[% params.filter %]" size="30" /> >+ [% ELSE %] >+ <input type="text" id="filter" name="params[filter]" value="file://" size="30" /> >+ [% END %] >+ <span class="help">This is a filepath on your system like file:///etc/koha/sites/libraryname/OAI2MARC21slim.xsl or file:///usr/share/koha/intranet/htdocs/intranet-tmpl/prog/en/xslt/OAI2MARC21slim.xsl</span> >+ </li> >+ </li> >+ <li> >+ <label>Record type</label> >+ </li> >+ </ol> >+</fieldset> >diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/admin-home.tt b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/admin-home.tt >index 32bca5a..0e1760f 100644 >--- a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/admin-home.tt >+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/admin-home.tt >@@ -120,6 +120,8 @@ > <dt><a href="/cgi-bin/koha/admin/sms_providers.pl">SMS cellular providers</a></dt> > <dd>Define a list of cellular providers for sending SMS messages via email.</dd> > [% END %] >+ <dt><a href="/cgi-bin/koha/admin/saved_tasks.pl">Saved tasks</a></dt> >+ <dd>Define tasks which may be run in the background</dd> > </dl> > </div> > </div> >diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/admin/saved_tasks.tt b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/saved_tasks.tt >new file mode 100644 >index 0000000..dd92ec8 >--- /dev/null >+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/admin/saved_tasks.tt >@@ -0,0 +1,338 @@ >+[% INCLUDE 'doc-head-open.inc' %] >+<title>Koha › Administration › Saved tasks</title> >+[% INCLUDE 'doc-head-close.inc' %] >+[% INCLUDE 'calendar.inc' %] >+<script type="text/javascript" src="[% interface %]/lib/jquery/plugins/jquery-ui-timepicker-addon.min.js"></script> >+[% INCLUDE 'timepicker.inc' %] >+[% IF ( op == "list" ) %] >+ <link rel="stylesheet" type="text/css" href="[% themelang %]/css/datatables.css" /> >+ [% INCLUDE 'datatables.inc' %] >+ <script type="text/javascript"> >+ //<![CDATA[ >+ $(document).ready(function() { >+ $("#taskst").dataTable($.extend(true, {}, dataTablesDefaults, { >+ "aoColumnDefs": [ >+ { "aTargets": [3,4,5,6], "bSortable": false }, >+ ], >+ "sPaginationType": "four_button" >+ })); >+ }); >+ //]]> >+ </script> >+[% ELSIF ( op == "edit" ) || ( op == "new" ) %] >+ <script type="text/javascript"> >+ //<![CDATA[ >+ $(document).ready(function() { >+ [%# Ideally, it would be nice to record the timezone here too, but currently we use MySQL's DATETIME field which doesn't store ISO 8601 timezone designators... 
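>+ (For example, a start time entered as 2016-04-20 03:00:00 is simply interpreted in the server's local time zone.)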
%] >+ $(".datetime_local").datetimepicker({ >+ dateFormat: "yy-mm-dd", >+ timeFormat: "HH:mm:ss", >+ hour: 0, >+ minute: 0, >+ second: 0, >+ showSecond: 1, >+ }); >+ $(".datetime_utc").datetimepicker({ >+ separator: "T", >+ timeSuffix: 'Z', >+ dateFormat: "yy-mm-dd", >+ timeFormat: "HH:mm:ss", >+ hour: 0, >+ minute: 0, >+ second: 0, >+ showSecond: 1, >+ // timezone doesn't work with the "Now" button in v1.4.3 although it appears to in v1.6.1 >+ // timezone: +000, >+ }); >+ >+ }); >+ //]]> >+ </script> >+ <style type="text/css"> >+ /* Override staff-global.css which hides second, millisecond, and microsecond sliders */ >+ .ui_tpicker_second { >+ display: block; >+ } >+ .test-success { >+ /* same color as .text-success in Bootstrap 2.2.2 */ >+ color:#468847; >+ } >+ </style> >+[% END %] >+</head> >+ >+<body id="admin_saved_tasks" class="admin"> >+[% INCLUDE 'header.inc' %] >+[% INCLUDE 'cat-search.inc' %] >+<div id="breadcrumbs"><a href="/cgi-bin/koha/mainpage.pl">Home</a> › <a href="/cgi-bin/koha/admin/admin-home.pl">Administration</a> › Saved tasks</div> >+ >+<div id="doc3" class="yui-t2"> >+ >+<div id="bd"> >+ <div id="yui-main"> >+ <div class="yui-b"> >+ [% IF ( op ) %] >+ [% IF ( op == "new" ) || ( op == "edit" ) %] >+ [%# If step is undefined, force it to be step one %] >+ [% IF ( ! step ); step = "one"; END; %] >+ >+ >+ >+ [%# HEADING %] >+ [% IF ( op == "new" ) %] >+ <h1>New saved task</h1> >+ [% ELSIF ( op == "edit" ) %] >+ <h1>Modify saved task</h1> >+ [% END %] >+ [%# /HEADING %] >+ >+ [%# TODO: Get this working properly... <div class="alert">Validation failed.</div> #] >+ >+ [%# FORM %] >+ <form action="/cgi-bin/koha/admin/[% filename %]" name="detail-form" method="post" id="saved-task-details" novalidate="novalidate"> >+ [% IF ( op == "new" ) %] >+ <input type="hidden" name="op" value="create" /> >+ [% ELSIF ( op == "edit" ) %] >+ <input type="hidden" name="op" value="update" /> >+ <input type="hidden" name="saved_task_id" value="[% saved_task.task_id %]" /> >+ [% END %] >+ <input type="hidden" name="step" value="[% step %]" /> >+ <fieldset class="rows"> >+ <ol> >+ [% IF ( op == "edit") && ( step == "one" ) && (! task_type_is_valid ) %] >+ <li> >+ <label for="invalid_task_type">Current invalid task type:</label> >+ <input id="invalid_task_type" type="text" disabled="disabled" value="[% saved_task.task_type %]" size="60" /> >+ <span class="error">Sorry! This task type is invalid. 
Please choose a new one from the following list.</span> >+ <li> >+ [% END %] >+ <li> >+ <label for="task_type">Task type: </label> >+ [% IF ( step == "one" ) %] >+ [% IF ( available_plugins ) %] >+ <select id="task_type" name="task_type"> >+ [% IF ( op == "new") %] >+ [% FOREACH plugin IN available_plugins %] >+ <option value="[% plugin %]">[% plugin %]</option> >+ [% END %] >+ [% ELSIF ( op == "edit" ) %] >+ [% FOREACH plugin IN available_plugins %] >+ [% IF ( saved_task.task_type == plugin ) %] >+ <option selected="selected" value="[% plugin %]">[% plugin %]</option> >+ [% ELSE %] >+ <option value="[% plugin %]">[% plugin %]</option> >+ [% END %] >+ [% END %] >+ [% END %] >+ </select> >+ [% END %] >+ >+ [% ELSIF ( step == "two" ) %] >+ <input type="text" disabled="disabled" value="[% task_type %]" size="60" /> >+ <input type="hidden" name="task_type" value="[% task_type %]" /> >+ [% END %] >+ </li> >+ </ol> >+ </fieldset> >+ >+ [% IF ( step == "one" ) %] >+ <fieldset class="action"> >+ <input type="submit" value="Next"> >+ <a class="cancel" href="/cgi-bin/koha/admin/[% filename %]">Cancel</a> >+ </fieldset> >+ [% ELSIF ( step == "two" ) %] >+ <fieldset class="rows"> >+ <legend>Task:</legend> >+ <ol> >+ <li> >+ <label for="start_time">Start time: </label> >+ <input type="text" id="start_time" class="datetime_local" name="start_time" value="[% saved_task.start_time %]" size="30" /> >+ <span class="help">This value will be treated as local server time, and times in the past will start immediately.</span> >+ </li> >+ <li> >+ <label for="repeat_interval">Repeat interval: </label> >+ <input type="text" id="repeat_interval" name="repeat_interval" value="[% saved_task.repeat_interval %]" size="4" /> >+ <span class="help">seconds</span> >+ [% IF (errors.repeat_interval.not_numeric) %]<span class="error">[The repeat interval must be a purely numeric value.]</span>[% END %] >+ </li> >+ </ol> >+ </fieldset> >+ [%# Try to include the template, but if it fails, fallback to a regular text view %] >+ [% TRY %] >+ [% INCLUDE $task_template %] >+ [% CATCH %] >+ <fieldset class="rows"> >+ <legend>Plugin parameters:</legend> >+ <ol> >+ <li> >+ <label for="params">Params: </label> >+ <textarea id="params" name="params" cols="60" rows="20">[% saved_task.params %]</textarea> >+ </li> >+ </ol> >+ </fieldset> >+ [% END %] >+ <fieldset class="action"> >+ <input type="submit" value="Save"> >+ <a class="cancel" href="/cgi-bin/koha/admin/[% filename %]">Cancel</a> >+ </fieldset> >+ [% END %] >+ </form> >+ [%# /FORM %] >+ [% END #/edit or new %] >+ >+ >+ [% IF ( op == "list" ) %] >+ <div id="toolbar" class="btn-toolbar"> >+ <a id="newserver" class="btn btn-small" href="/cgi-bin/koha/admin/[% filename %]?op=new"><i class="icon-plus"></i> New saved task</a> >+ </div> >+ <h1>Saved tasks</h1> >+ [% IF ( saved_response ) %] >+ [% IF ( saved_response == 'delete_success' ) %] >+ <div class="alert">Deletion successful.</div> >+ [% ELSIF ( saved_response == 'delete_failure' ) %] >+ <div class="alert">Deletion failed.</div> >+ [% END %] >+ [% END %] >+ [% IF ( sent_response ) %] >+ [% IF ( sent_response == 'icarus_offline' ) %] >+ <div class="alert">Send failed. 
Icarus is currently offline.</div> >+ [% END %] >+ [% END %] >+ <table id="taskst"> >+ <thead> >+ <tr> >+ <th>Start time</th> >+ <th>Repeat interval</th> >+ <th>Task type</th> >+ <th>Params</th> >+ <th></th> >+ <th></th> >+ <th></th> >+ </tr> >+ </thead> >+ <tbody> >+ [% FOREACH saved_task IN saved_tasks %] >+ <tr> >+ <td>[% IF ( saved_task.start_time ) != "0000-00-00 00:00:00"; saved_task.start_time; END; %]</td> >+ <td>[% saved_task.repeat_interval %]</td> >+ <td>[% saved_task.task_type %]</td> >+ <td> >+ <ul> >+ [% FOREACH pair IN saved_task.params_as_perl.pairs %] >+ <li>[% pair.key %] => [% pair.value %]</li> >+ [% END %] >+ </ul> >+ </td> >+ <td><a href="/cgi-bin/koha/admin/[% filename %]?op=edit&saved_task_id=[% saved_task.task_id %]">Edit</a></td> >+ <td><a href="/cgi-bin/koha/admin/[% filename %]?op=send&saved_task_id=[% saved_task.task_id %]">Send to Icarus</a></td> >+ <td><a href="/cgi-bin/koha/admin/[% filename %]?op=delete&saved_task_id=[% saved_task.task_id %]">Delete</a></td> >+ </tr> >+ [% END %] >+ </tbody> >+ </table> >+ <div id="daemon_controls"> >+ <h1>Icarus dashboard</h1> >+ <table> >+ <tr> >+ <th>Status</th> >+ <th></th> >+ </tr> >+ <tr> >+ <td> >+ >+ [% IF ( daemon_status == 'Permission denied' ) #Apache doesn't have permission to write to socket >+ || ( daemon_status == 'Connection refused' ) #Socket exists, but server is down >+ || ( daemon_status == 'No such file or directory' ) #Socket doesn't exist at all >+ %] >+ <span id="icarus_status">Unable to contact</span> >+ [% ELSIF ( daemon_status == 'online' ) %] >+ <span id="icarus_status">Online</span> >+ [% ELSIF ( daemon_status == 'shutting down' ) %] >+ <span id="icarus_status">Shutting down</span> >+ [% ELSE %] >+ <span id="icarus_status">[% daemon_status %]</span> >+ [% END %] >+ </td> >+ [%# TODO: Also provide controls for starting/restarting Icarus? 
%] >+ <td><a href="/cgi-bin/koha/admin/[% filename %]?server_action=shutdown">Shutdown Icarus</a></td> >+ </tr> >+ </table> >+ </div> >+ <div id="tasks"> >+ <h1>Active Icarus tasks</h1> >+ [% IF ( task_response ) %] >+ [% IF ( task_response.action == 'error' ) %] >+ [% IF ( task_response.error_message ) %] >+ [% IF ( task_response.error_message == 'No such process' ) %] >+ <div class="alert">Task [% task_response.task_id %] does not exist.</div> >+ [% END %] >+ [% END %] >+ [% ELSIF ( task_response.action == 'pending' ) %] >+ <div class="alert">Initialising task [% task_response.task_id %].</div> >+ [% ELSIF ( task_response.action == 'already pending' ) %] >+ <div class="alert">Already initialised task [% task_response.task_id %].</div> >+ [% ELSIF ( task_response.action == 'already started' ) %] >+ <div class="alert">Already started task [% task_response.task_id %].</div> >+ [% ELSIF ( task_response.action == 'removed' ) %] >+ <div class="alert">Removing task [% task_response.task_id %].</div> >+ [% END %] >+ [% END %] >+ [% IF ( tasks ) %] >+ <table> >+ <thead> >+ <tr> >+ <th>Task id</th> >+ <th>Status</th> >+ <th>Next start time (local server time)</th> >+ <th>Repeat interval</th> >+ <th>Task type</th> >+ <th>Params</th> >+ <th></th> >+ <th></th> >+ </tr> >+ </thead> >+ <tbody> >+ [% FOREACH task IN tasks %] >+ <tr> >+ <td>[% task.task_id %]</td> >+ <td> >+ [% SWITCH task.task.status %] >+ [% CASE 'new' %] >+ <span>New</span> >+ [% CASE 'pending' %] >+ <span>Pending</span> >+ [% CASE 'started' %] >+ <span>Started</span> >+ [% CASE 'stopping' %] >+ <span>Stopping</span> >+ [% CASE %] >+ <span>[% task.task.status %]</span> >+ [% END %] >+ </td> >+ <td>[% task.task.start %]</td> >+ <td>[% task.task.repeat_interval %]</td> >+ <td>[% task.task.type %]</td> >+ <td> >+ <ul> >+ [% FOREACH pair IN task.task.params.pairs %] >+ <li>[% pair.key %] => [% pair.value %]</li> >+ [% END %] >+ </ul> >+ </td> >+ <td><a href="/cgi-bin/koha/admin/[% filename %]?server_action=start&server_task_id=[% task.task_id %]">Start</a></td> >+ <td><a href="/cgi-bin/koha/admin/[% filename %]?server_action=remove&server_task_id=[% task.task_id %]">Remove</a></td> >+ </tr> >+ [% END %] >+ </tbody> >+ </table> >+ [% END %] >+ </div> >+ [% END #/list %] >+ [% END #/op %] >+ </div> >+ </div> >+ <div class="yui-b"> >+ [% INCLUDE 'admin-menu.inc' %] >+ </div> >+</div> >+[% INCLUDE 'intranet-bottom.inc' %] >diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/tools/manage-oai-import.tt b/koha-tmpl/intranet-tmpl/prog/en/modules/tools/manage-oai-import.tt >new file mode 100755 >index 0000000..145bbdc >--- /dev/null >+++ b/koha-tmpl/intranet-tmpl/prog/en/modules/tools/manage-oai-import.tt >@@ -0,0 +1,122 @@ >+[% INCLUDE 'doc-head-open.inc' %] >+<title>Koha › Tools › Manage OAI-PMH record imports >+[% IF ( import_oai_id ) %] >+ › Record [% import_oai_id %] >+[% END %] >+</title> >+[% INCLUDE 'doc-head-close.inc' %] >+<link rel="stylesheet" type="text/css" href="[% themelang %]/css/datatables.css" /> >+[% INCLUDE 'datatables.inc' %] >+</head> >+ >+<body id="tools_manage-oai-import" class="tools"> >+[% INCLUDE 'header.inc' %] >+[% INCLUDE 'cat-search.inc' %] >+ >+ <div id="breadcrumbs"><a href="/cgi-bin/koha/mainpage.pl">Home</a> › <a href="/cgi-bin/koha/tools/tools-home.pl">Tools</a> >+ [% IF ( import_oai_id ) %] >+ › >+ <a href="[% script_name %]">Manage OAI-PMH record imports</a> >+ › Record [% import_oai_id %] >+ [% ELSE %] >+ › Manage OAI-PMH record imports >+ [% END %] >+ </div> >+ >+ <div id="doc3" class="yui-t2"> >+ <div 
id="bd"> >+ <div id="yui-main"> >+ <div class="yui-b"> >+ [% IF ( import_oai_id ) %] >+ [% IF ( view_record ) %] >+ <h1>Record [% import_oai_id %]</h1> >+ [% IF ( oai_record.metadata ) %] >+ <div style="white-space:pre">[% oai_record.metadata | xml %]</div> >+ [% END %] >+ [% ELSIF ( retry ) %] >+ <fieldset class="rows"> >+ <ol> >+ <li> >+ <span class="label">Import status:</span> >+ [% IF ( import_status ) %] >+ [% IF ( import_status == "ok" ) %] >+ OK >+ [% ELSIF ( import_status == "error" ) %] >+ ERROR >+ [% END %] >+ [% END %] >+ </li> >+ [% IF ( errors ) %] >+ [% FOREACH error IN errors %] >+ <li> >+ <span class="label">Error:</span> >+ [%# FIXME: These English messages come straight from C4::Biblio... %] >+ [% error.error_msg %] >+ [% IF ( record_type ) && ( record_type == "biblio" ) %] >+ <a title="View biblio record" href="/cgi-bin/koha/catalogue/detail.pl?biblionumber=[% error.record_id %]">(View biblio record)</a> >+ [% END %] >+ </li> >+ [% END %] >+ [% END %] >+ </ol> >+ </fieldset> >+ [% END %] >+ [% ELSE %] >+ <h1>Manage OAI-PMH record imports</h1> >+ <table> >+ <thead> >+ <tr> >+ <th>Record identifier</th> >+ <th>Record datestamp</th> >+ <th>Provider status</th> >+ <th>Import status</th> >+ <th>Import batch</th> >+ <th>OAI-PMH record</th> >+ [%# <th>Filter</th> %] >+ </tr> >+ </thead> >+ <tbody> >+ [% WHILE (oai_record = oai_records.next) %] >+ <tr> >+ <td>[% oai_record.header_identifier %]</td> >+ <td>[% oai_record.header_datestamp %]</td> >+ <td> >+ [% IF ( oai_record.header_status ) %] >+ [% IF ( oai_record.header_status == "deleted" ) %] >+ DELETED >+ [% END %] >+ [% END %] >+ </td> >+ <td> >+ [% IF ( oai_record.status ) %] >+ [% IF ( oai_record.status == "ok" ) %] >+ OK >+ [% ELSIF ( oai_record.status == "error" ) %] >+ <a title="Retry import" href="[% script_name %]?op=retry&import_oai_id=[% oai_record.import_oai_id %]">ERROR - Click to retry</a> >+ [% END %] >+ >+ [% ELSE %] >+ Unknown >+ [% END %] >+ </td> >+ <td> >+ [% IF ( oai_record.import_batch_id ) %] >+ <a title="View import batch" href="/cgi-bin/koha/tools/manage-marc-import.pl?import_batch_id=[% oai_record.import_batch_id %]">View batch [% oai_record.import_batch_id %]</a> >+ [% END %] >+ </td> >+ [%# oai_record.filter %] >+ <td><a title="View OAI-PMH record" href="[% script_name %]?op=view_record&import_oai_id=[% oai_record.import_oai_id %]">View record [% oai_record.import_oai_id %]</a></td> >+ </tr> >+ [% END %] >+ >+ </tbody> >+ </table> >+ [% END %] >+ </div> >+ </div> >+ <div class="yui-b"> >+ [% INCLUDE 'tools-menu.inc' %] >+ </div> >+ </div> >+ </div> >+[% INCLUDE 'intranet-bottom.inc' %] >diff --git a/koha-tmpl/intranet-tmpl/prog/en/xslt/OAI2MARC21slim.xsl b/koha-tmpl/intranet-tmpl/prog/en/xslt/OAI2MARC21slim.xsl >new file mode 100755 >index 0000000..0f1d6f0 >--- /dev/null >+++ b/koha-tmpl/intranet-tmpl/prog/en/xslt/OAI2MARC21slim.xsl >@@ -0,0 +1,74 @@ >+<?xml version="1.0" encoding="UTF-8"?> >+<xsl:stylesheet version="1.0" >+ xmlns:marc="http://www.loc.gov/MARC21/slim" >+ xmlns:oai="http://www.openarchives.org/OAI/2.0/" >+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"> >+ <xsl:output method="xml" encoding="UTF-8" indent="yes"/> >+ <!-- NOTE: This XSLT strips the OAI-PMH wrapper from the metadata. 
--> >+ <!-- NOTE: This XSLT also adds the OAI-PMH identifier back in as a MARC field --> >+ >+ <!-- Match the root oai:record element --> >+ <xsl:template match="oai:record"> >+ <!-- Apply templates only when the oai record is for a deleted item --> >+ <xsl:apply-templates select="oai:header[@status='deleted']" /> >+ <!-- Apply templates only to the child metadata element(s) --> >+ <xsl:apply-templates select="oai:metadata" /> >+ </xsl:template> >+ >+ <!-- Matches an oai:metadata element --> >+ <xsl:template match="oai:metadata"> >+ <!-- Only apply further templates to marc:record elements --> >+ <!-- This prevents the identity transformation from outputting other non-MARC metadata formats --> >+ <xsl:apply-templates select="//marc:record"/> >+ </xsl:template> >+ >+ <!-- We need to create a MARCXML record from OAI records marked "deleted" to handle OAI deletions correctly in Koha --> >+ <xsl:template match="oai:header[@status='deleted']"> >+ <xsl:element name="record" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" >+ xmlns="http://www.loc.gov/MARC21/slim"> >+ <xsl:attribute name="xsi:schemaLocation">http://www.loc.gov/MARC21/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd</xsl:attribute> >+ <xsl:call-template name="add_oai"/> >+ </xsl:element> >+ </xsl:template> >+ >+ <!-- Identity transformation: this template copies attributes and nodes --> >+ <xsl:template match="@* | node()"> >+ <!-- Create a copy of this attribute or node --> >+ <xsl:copy> >+ <!-- Recursively apply this template to the attributes and child nodes of this element --> >+ <xsl:apply-templates select="@* | node()" /> >+ </xsl:copy> >+ </xsl:template> >+ >+ >+ <xsl:template match="marc:record"> >+ <xsl:copy> >+ <!-- Apply all relevant templates for all attributes and elements --> >+ <xsl:apply-templates select="@* | node()"/> >+ >+ <!-- Add new node (or whatever else you want to do after copying the existing record) --> >+ <xsl:call-template name="add_oai"/> >+ >+ <!-- Newline --> >+ <xsl:text>
</xsl:text> >+ </xsl:copy> >+ </xsl:template> >+ >+ <!-- Template for adding the OAI-PMH identifier as 024$a --> >+ <xsl:template name="add_oai"> >+ <xsl:element name="datafield" xmlns="http://www.loc.gov/MARC21/slim"> >+ <xsl:attribute name="ind1"><xsl:text>7</xsl:text></xsl:attribute> >+ <xsl:attribute name="ind2"><xsl:text> </xsl:text></xsl:attribute> >+ <xsl:attribute name="tag">024</xsl:attribute> >+ <xsl:element name="subfield"> >+ <xsl:attribute name="code">a</xsl:attribute> >+ <xsl:value-of select="/oai:record/oai:header/oai:identifier"/> >+ </xsl:element> >+ <xsl:element name="subfield"> >+ <xsl:attribute name="code">2</xsl:attribute> >+ <xsl:text>uri</xsl:text> >+ </xsl:element> >+ </xsl:element> >+ </xsl:template> >+ >+</xsl:stylesheet> >diff --git a/misc/bin/icarusd.pl b/misc/bin/icarusd.pl >new file mode 100755 >index 0000000..f5a3aa8 >--- /dev/null >+++ b/misc/bin/icarusd.pl >@@ -0,0 +1,181 @@ >+#!/usr/bin/perl >+ >+####################################################################### >+ >+use Modern::Perl; >+use POSIX; #For daemonizing >+use Fcntl qw(:flock); #For pidfile >+use Getopt::Long; >+use Pod::Usage; >+ >+#Make the STDOUT filehandle hot, so that you can use shell re-direction. Otherwise, you'll suffer from buffering. >+STDOUT->autoflush(1); >+#Note that STDERR, by default, is already hot. >+ >+####################################################################### >+#FIXME: Debugging signals >+#BEGIN { >+# package POE::Kernel; >+# use constant TRACE_SIGNALS => 1; >+#} >+ >+use POE; >+use JSON; #For Listener messages >+use XML::LibXML; #For configuration files >+ >+use Koha::Icarus::Listener; >+ >+####################################################################### >+ >+my ($filename,$daemon,$log,$help); >+my $verbosity = 1; >+GetOptions ( >+ "f|file|filename=s" => \$filename, #/kohawebs/dev/dcook/koha-dev/etc/koha-conf.xml >+ "l|log=s" => \$log, >+ "d|daemon" => \$daemon, >+ "v=i" => \$verbosity, >+ "h|?" => \$help, >+) or pod2usage(2); >+pod2usage(1) if $help; >+ >+ >+if ( ! $filename || ! -f $filename ){ >+ print "Failed to start.\n"; >+ if ( ! $filename ){ >+ print("You must provide a valid configuration file using the -f switch.\n"); >+ pod2usage(1); >+ } >+ if ( ! -f $filename ){ >+ die(qq{"$filename" is not a file.\n}); >+ } >+} >+ >+#Declare the variable with file scope so the flock stays for the duration of the process's life >+my $pid_filehandle; >+ >+#Read configuration file >+my $config = read_config_file($filename); >+ >+my $SOCK_PATH = $config->{socket}; >+my $pid_file = $config->{pidfile}; >+my $max_tasks = $config->{max_tasks}; >+ >+#Overwrite configuration file with command line options >+if ($log){ >+ $config->{log} = $log; >+} >+ >+#Go into daemon mode, if user has included flag >+if ($daemon){ >+ daemonize(); >+} >+ >+if ($pid_file){ >+ #NOTE: The filehandle needs to have file scope, so that the flock is preserved. >+ $pid_filehandle = make_pid_file($pid_file); >+} >+ >+#FIXME: Do we want to log to file only in daemon mode? $config->{log} should be populated by either the config file or the l|log GetOpt... >+if ($daemon && $config->{log}){ >+ log_to_file($config->{log}); >+} >+ >+ >+#FIXME: 1) In daemon mode, SIGUSR1 or SIGHUP for reloading/restarting? 
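>+#Illustrative invocations (paths are examples only):
>+# perl misc/bin/icarusd.pl -f /path/to/koha-conf.xml -v 9
>+# perl misc/bin/icarusd.pl -f /path/to/koha-conf.xml --daemon --log /path/to/icarus.log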
>+####################################################################### >+ >+#Creates Icarus Listener >+Koha::Icarus::Listener->spawn({ >+ Socket => $SOCK_PATH, >+ MaxTasks => $max_tasks, >+ Verbosity => $verbosity, >+}); >+ >+POE::Kernel->run(); >+ >+exit; >+ >+sub read_config_file { >+ my $filename = shift; >+ my $config = {}; >+ if ( -e $filename ){ >+ eval { >+ my $doc = XML::LibXML->load_xml(location => $filename); >+ if ($doc){ >+ my $root = $doc->documentElement; >+ my $icarus = $root->find('icarus')->shift; >+ if ($icarus){ >+ #Get all child nodes for the 'icarus' element >+ my @childnodes = $icarus->childNodes(); >+ foreach my $node (@childnodes){ >+ #Only consider nodes that are elements >+ if ($node->nodeType == XML_ELEMENT_NODE){ >+ my $config_key = $node->nodeName; >+ my $first_child = $node->firstChild; >+ #Only consider nodes that have a text node as their first child >+ if ($first_child && $first_child->nodeType == XML_TEXT_NODE){ >+ $config->{$config_key} = $first_child->nodeValue; >+ } >+ } >+ } >+ } >+ } >+ }; >+ } >+ return $config; >+} >+ >+####################################################################### >+#NOTE: On Debian, you can use the daemon binary to make a process into a daemon, >+# the following subs are for systems that don't have the daemon binary. >+ >+sub daemonize { >+ my $pid = fork; >+ die "Couldn't fork: $!" unless defined($pid); >+ if ($pid){ >+ exit; #Parent exit >+ } >+ POSIX::setsid() or die "Can't start a new session: $!"; >+} >+ >+sub log_to_file { >+ my $logfile = shift; >+ #Open a filehandle to append to a log file >+ open(LOG, '>>', $logfile) or die "Unable to open a filehandle for $logfile: $!\n"; # --output >+ LOG->autoflush(1); #Make filehandle hot (ie don't buffer) >+ *STDOUT = *LOG; #Re-assign STDOUT to LOG | --stdout >+ *STDERR = *STDOUT; #Re-assign STDERR to STDOUT | --stderr >+} >+ >+sub make_pid_file { >+ my $pidfile = shift; >+ if ( ! -e $pidfile ){ >+ open(my $fh, '>', $pidfile) or die "Unable to write to $pidfile: $!\n"; >+ $fh->close; >+ } >+ >+ open(my $pidfilehandle, '+<', $pidfile) or die "Unable to open a filehandle for $pidfile: $!\n"; >+ if (flock($pidfilehandle, LOCK_EX|LOCK_NB)){ >+ #Write pid to pidfile >+ print "Acquiring lock on $pidfile\n"; >+ #Now that we've acquired a lock, let's truncate the file >+ truncate($pidfilehandle, 0); >+ print $pidfilehandle $$."\n" or die $!; >+ #Flush the filehandle so you're not suffering from buffering >+ $pidfilehandle->flush(); >+ return $pidfilehandle; >+ } else { >+ my $number = <$pidfilehandle>; >+ chomp($number); >+ warn "$0 is already running with pid $number. 
Exiting.\n"; >+ exit(1); >+ } >+} >+ >+__END__ >+ >+=head1 SYNOPSIS >+ >+icarusd.pl -f koha-conf.xml [--log icarus.log] [--daemon] [ -v 0-9 ] [-h] >+ >+=cut >diff --git a/rewrite-config.PL b/rewrite-config.PL >index 3239a59..38b2bb6 100644 >--- a/rewrite-config.PL >+++ b/rewrite-config.PL >@@ -148,6 +148,8 @@ $prefix = $ENV{'INSTALL_BASE'} || "/usr"; > "__MEMCACHED_SERVERS__" => "", > "__MEMCACHED_NAMESPACE__" => "", > "__FONT_DIR__" => "/usr/share/fonts/truetype/ttf-dejavu", >+ "__ICARUS_RUN_DIR__" => "$prefix/var/run/icarus", >+ "__ICARUS_MAX_TASKS__" => "30", > ); > > # Override configuration from the environment >diff --git a/skel/var/run/koha/icarus/README b/skel/var/run/koha/icarus/README >new file mode 100644 >index 0000000..ecb05dd >--- /dev/null >+++ b/skel/var/run/koha/icarus/README >@@ -0,0 +1 @@ >+icarus dir >diff --git a/svc/import_oai b/svc/import_oai >new file mode 100755 >index 0000000..1362b6e >--- /dev/null >+++ b/svc/import_oai >@@ -0,0 +1,143 @@ >+#!/usr/bin/perl >+ >+# Copyright 2016 Prosentient Systems >+# >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it >+# under the terms of the GNU General Public License as published by >+# the Free Software Foundation; either version 3 of the License, or >+# (at your option) any later version. >+# >+# Koha is distributed in the hope that it will be useful, but >+# WITHOUT ANY WARRANTY; without even the implied warranty of >+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the >+# GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License >+# along with Koha; if not, see <http://www.gnu.org/licenses>. >+# >+ >+use Modern::Perl; >+use XML::LibXML; >+use URI; >+use File::Basename; >+ >+use CGI qw ( -utf8 ); >+use C4::Auth qw/check_api_auth/; >+use C4::Context; >+use C4::ImportBatch; >+use C4::Matcher; >+use XML::Simple; >+use C4::Biblio; >+ >+use Koha::OAI::Client::Record; >+ >+my $query = new CGI; >+binmode STDOUT, ':encoding(UTF-8)'; >+ >+my ($status, $cookie, $sessionID) = check_api_auth($query, { editcatalogue => 'edit_catalogue'} ); >+unless ($status eq "ok") { >+ print $query->header(-type => 'text/xml', -status => '403 Forbidden'); >+ print XMLout({ auth_status => $status }, NoAttr => 1, RootName => 'response', XMLDecl => 1); >+ exit 0; >+} >+ >+my $xml; >+if ($query->request_method eq "POST") { >+ $xml = $query->param('xml'); >+} >+if ($xml) { >+ #TODO: You could probably use $query->Vars here instead... >+ my %params = map { $_ => $query->param($_) } $query->param; >+ my $result = import_oai($xml, \%params ); >+ print $query->header(-type => 'text/xml'); >+ print XMLout($result, NoAttr => 1, RootName => 'response', XMLDecl => 1); >+} else { >+ print $query->header(-type => 'text/xml', -status => '400 Bad Request'); >+} >+ >+exit 0; >+ >+sub import_oai { >+ my ($inxml, $params) = @_; >+ >+ my $result = {}; >+ my $status = "error"; >+ >+ my $filter = delete $params->{filter} || ''; >+ my $import_mode = delete $params->{import_mode} || ''; >+ my $framework = delete $params->{framework} || ''; >+ >+ if (my $matcher_code = delete $params->{match}) { >+ $params->{matcher_id} = C4::Matcher::GetMatcherId($matcher_code); >+ } >+ >+ my $batch_id = GetWebserviceBatchId($params); >+ #FIXME: Use the batch_id to create a more useful filename in the import_batches table... 
>+ unless ($batch_id) { >+ $result->{'status'} = "failed"; >+ $result->{'error'} = "Batch create error"; >+ return $result; >+ } >+ >+ #Source a default XSLT to use for filtering >+ my $htdocs = C4::Context->config('intrahtdocs'); >+ my $theme = C4::Context->preference("template"); >+ #FIXME: This doesn't work for UNIMARC! >+ my $xslfilename = "$htdocs/$theme/en/xslt/OAI2MARC21slim.xsl"; >+ >+ #FIXME: There's a better way to do these filters... >+ if ($filter){ >+ my $filter_uri = URI->new($filter); >+ if ($filter_uri){ >+ my $scheme = $filter_uri->scheme; >+ if ($scheme && $scheme eq "file"){ >+ my $path = $filter_uri->path; >+ #Filters may theoretically be .xsl or .pm files >+ my($filename, $dirs, $suffix) = fileparse($path,(".xsl",".pm")); >+ if ($suffix && $suffix eq ".xsl"){ >+ #If this new path exists, change the filter XSLT to it >+ if ( -f $path ){ >+ $xslfilename = $path; >+ } >+ } >+ } >+ } >+ } >+ >+ #Get matching rule matcher >+ my $matcher = C4::Matcher->new($params->{record_type} || 'biblio'); >+ $matcher = C4::Matcher->fetch($params->{matcher_id}); >+ >+ >+ my $oai_record = Koha::OAI::Client::Record->new({ >+ xml_string => $inxml, >+ }); >+ >+ $oai_record->filter({ >+ filter => $xslfilename, >+ }); >+ >+ my ($import_status, $match_status, $koha_record_numbers, $errors) = $oai_record->import_record({ >+ matcher => $matcher, >+ import_batch_id => $batch_id, >+ import_mode => $import_mode, >+ framework => $framework, >+ }); >+ >+ $oai_record->save_to_database(); >+ >+ $result->{'match_status'} = $match_status; >+ $result->{'import_batch_id'} = $batch_id; >+ $result->{'koha_record_numbers'} = $koha_record_numbers; >+ >+ if ($import_status && $import_status eq "ok"){ >+ $result->{'status'} = "ok"; >+ } else { >+ $result->{'status'} = "failed"; >+ $result->{'errors'} = $errors; >+ } >+ >+ return $result; >+} >diff --git a/tools/manage-oai-import.pl b/tools/manage-oai-import.pl >new file mode 100755 >index 0000000..6f6bcfc >--- /dev/null >+++ b/tools/manage-oai-import.pl >@@ -0,0 +1,128 @@ >+#!/usr/bin/perl >+ >+# Copyright 2016 Prosentient Systems >+# >+# This file is part of Koha. >+# >+# Koha is free software; you can redistribute it and/or modify it >+# under the terms of the GNU General Public License as published by >+# the Free Software Foundation; either version 3 of the License, or >+# (at your option) any later version. >+# >+# Koha is distributed in the hope that it will be useful, but >+# WITHOUT ANY WARRANTY; without even the implied warranty of >+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the >+# GNU General Public License for more details. >+# >+# You should have received a copy of the GNU General Public License >+# along with Koha; if not, see <http://www.gnu.org/licenses>. 
>+ >+use Modern::Perl; >+ >+use Koha::Database; >+ >+use C4::Auth; >+use C4::Output; >+use C4::Koha; >+use C4::Context; >+use C4::Matcher; >+use Koha::OAI::Client::Record; >+ >+my $script_name = "/cgi-bin/koha/tools/manage-oai-import.pl"; >+ >+my $input = new CGI; >+my $op = $input->param('op') || ''; >+ >+my $import_oai_id = $input->param('import_oai_id'); >+#my $results_per_page = $input->param('results_per_page') || 25; >+ >+ >+ >+my ($template, $loggedinuser, $cookie) = >+ get_template_and_user({template_name => "tools/manage-oai-import.tt", >+ query => $input, >+ type => "intranet", >+ authnotrequired => 0, >+ flagsrequired => {tools => 'manage_staged_marc'}, >+ debug => 1, >+ }); >+ >+my $schema = Koha::Database->new()->schema(); >+my $resultset = $schema->resultset('ImportOai'); >+my $oai_records = $resultset->search; >+ >+if ($import_oai_id){ >+ my $import_oai_record = $resultset->find($import_oai_id); >+ $template->param( >+ oai_record => $import_oai_record, >+ ); >+ >+ if ($op eq "view_record" && $import_oai_id){ >+ $template->param( >+ view_record => 1, >+ import_oai_id => $import_oai_id, >+ ); >+ } >+ >+ if ($op eq "retry" && $import_oai_record){ >+ my $oai_record = Koha::OAI::Client::Record->new({ >+ xml_string => $import_oai_record->metadata, >+ }); >+ >+ $oai_record->filter({ >+ filter => $import_oai_record->filter, >+ }); >+ my $import_batch_id = $import_oai_record->import_batch_id; >+ if ($import_batch_id){ >+ my $import_batch_rs = $schema->resultset('ImportBatch'); >+ my $import_batch = $import_batch_rs->find($import_batch_id); >+ my $matcher_id = $import_batch->matcher_id; >+ >+ my $record_type = $import_batch->record_type; >+ $template->param( >+ record_type => $record_type, >+ ); >+ >+ >+ #my $matcher = C4::Matcher->new($record_type || 'biblio'); >+ my $matcher = C4::Matcher->fetch($matcher_id); >+ >+ >+ #FIXME >+ my $import_mode = "direct"; >+ #FIXME >+ my $framework = ""; >+ >+ my ($import_status, $match_status, $koha_record_numbers, $errors) = $oai_record->import_record({ >+ matcher => $matcher, >+ import_batch_id => $import_batch_id, >+ import_mode => $import_mode, >+ framework => $framework, >+ }); >+ >+ if ($import_status){ >+ if ($import_status eq 'ok'){ >+ $import_oai_record->status("ok"); >+ $import_oai_record->update(); >+ } else { >+ $template->param( >+ import_status => $import_status, >+ errors => $errors, >+ retry => 1, >+ import_oai_id => $import_oai_id, >+ ); >+ } >+ } >+ } >+ } >+} >+ >+$template->param( >+ script_name => $script_name, >+ oai_records => $oai_records, >+); >+ >+ >+ >+ >+output_html_with_http_headers $input, $cookie, $template->output; >\ No newline at end of file >-- >2.1.4
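For reference, the svc/import_oai service added above expects an authenticated POST: check_api_auth() requires the editcatalogue => edit_catalogue permission, the record itself travels in the "xml" parameter, and "match", "import_mode", "framework" and "filter" are optional parameters consumed by import_oai(). A minimal client sketch follows; it is illustrative only and not part of the patch — the base URL, the CGISESSID value, the matching rule code and the record.xml file name are placeholders, and the staff session is assumed to have been established beforehand.

    use Modern::Perl;
    use LWP::UserAgent;

    my $svc_url = 'http://KOHA/cgi-bin/koha/svc/import_oai';   # placeholder Koha URL

    # Read one harvested OAI-PMH record from disk (placeholder file name)
    open my $fh, '<:encoding(UTF-8)', 'record.xml' or die "record.xml: $!";
    my $xml = do { local $/; <$fh> };
    close $fh;

    my $ua = LWP::UserAgent->new;
    # Assumes a staff session established beforehand; the cookie value is a placeholder
    $ua->default_header( 'Cookie' => 'CGISESSID=0123456789abcdef' );

    my $response = $ua->post( $svc_url, {
        xml         => $xml,      # OAI-PMH record to import
        match       => 'OAI',     # code of the record matching rule to use
        import_mode => 'direct',  # stage and import in one step
        framework   => '',        # default MARC framework
        filter      => '',        # empty: fall back to the default OAI2MARC21slim.xsl
    } );

    print $response->decoded_content, "\n";

On success the service answers with an XML <response> document whose status element is "ok" and which carries the import_batch_id, match_status and any koha_record_numbers; on failure status is "failed" and the errors are returned alongside it.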