Bug 2317 - BiblioAddsAuthorities causes an error when adding certain items
Summary: BiblioAddsAuthorities causes an error when adding certain items
Status: CLOSED FIXED
Alias: None
Product: Koha
Classification: Unclassified
Component: Cataloging
Version: rel_3_0
Hardware: PC All
Importance: PATCH-Sent (DO NOT USE) blocker
Assignee: Galen Charlton
QA Contact:
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2008-07-08 08:39 UTC by Chris Nighswonger
Modified: 2019-06-27 09:24 UTC

See Also:
Change sponsored?: ---
Patch complexity: ---
Documentation contact:
Documentation submission:
Text to go in the release notes:
Version(s) released in:


Attachments
Screenshot of error (960.05 KB, image/bmp)
2008-07-08 08:40 UTC, Chris Cormack

Description Chris Cormack 2010-05-21 00:49:18 UTC


---- Reported by cnighswonger@foundations.edu 2008-07-08 08:39:31 ----

Wide character in null operation at /usr/local/share/perl/5.8.8/MARC/Charset/Table.pm line 96.

This only occurs with certain books. For example:

Diccionario De La Lengua Espanola, 2001 printing

The MARC record imports fine; attempting to add the item is what breaks things.



---- Additional Comments From cnighswonger@foundations.edu 2008-07-08 08:40:28 ----

Created an attachment
Screenshot of error





---- Additional Comments From mjr@ttllp.co.uk 2008-07-08 10:54:33 ----

This looks like a configuration error to me.  Probably the wrong character set is specified for a Z39.50 target.  Can you add

SetEnv KOHA_BACKTRACES 1

to your librarian VirtualHost configuration, retry and paste the full error here?

Also, can you verify that your Z39.50 server settings are correct?
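
For reference, a minimal sketch of where that directive goes, assuming a typical koha-httpd.conf-style VirtualHost for the staff client (the hostname, port, and DocumentRoot below are placeholders; adjust to your install):

    <VirtualHost *:8080>
        ServerName staff.example.org
        DocumentRoot /usr/share/koha/intranet/htdocs
        # Make Koha include a full backtrace in its error output
        SetEnv KOHA_BACKTRACES 1
    </VirtualHost>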




---- Additional Comments From jmf@liblime.com 2008-07-08 20:53:26 ----

I suspect this is a problem with the record, claiming to be one encoding and actually being another. It would help if we could locate the specific target and record and get a copy of it in binary format. Can you provide us with the target info and ISBN or exact reference to find the item? I'm also going to bump this to 3.2 since it's unlikely we'll be able to fit a fix into 3.0.
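
In the meantime, a rough sketch (not Koha code) of how to check what encoding a saved binary record claims to be, using MARC::File::USMARC; the filename is a placeholder, and in MARC21 leader position 09 is 'a' for UCS/Unicode and blank for MARC-8:

    use strict;
    use warnings;
    use MARC::File::USMARC;

    # Read the first record from a binary MARC file saved from the target
    my $file   = MARC::File::USMARC->in('record.mrc');
    my $record = $file->next();
    $file->close();

    # Leader/09 is the declared character coding scheme
    my $scheme = substr($record->leader(), 9, 1);
    print $scheme eq 'a' ? "record claims UCS/Unicode (UTF-8)\n" : "record claims MARC-8\n";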



---- Additional Comments From jmf@liblime.com 2008-07-09 09:20:55 ----

Here's a specific example:

 UNC @ Chapel Hill
afton.lib.unc.edu:210/innopac

utf8 encoding

ISBN 8423968146

The title comes through as: Diccionario de la lengua espa?nola /
(note the ? in place of a diacritic) -- this happens in both the results list in the Z39.50 popup, and in the MARC editor when you choose the item.

I'll test now with marc8 encoding to see if that helps.

This is related to http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=2327



---- Additional Comments From jmf@liblime.com 2008-07-09 09:29:15 ----

OK, switching to MARC8 solves the problem: the encoding comes through just fine and is automatically converted to UTF-8 by Koha when the record is loaded in the editor. This can safely be de-prioritized for 3.0, but should be investigated further for rel_3_2.
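
For context, that conversion is the normal path: MARC::Charset turns genuine MARC-8 bytes into UTF-8 without complaint. A rough sketch (not Koha code; the 0xE4 byte is assumed here to be the ANSEL combining tilde, which in MARC-8 precedes its base letter):

    use strict;
    use warnings;
    use MARC::Charset qw(marc8_to_utf8);

    # MARC-8 bytes for "espanola" with a combining tilde before the 'n'
    my $marc8_bytes = "espa\xE4nola";

    # Converting real MARC-8 input works as expected
    my $utf8 = marc8_to_utf8($marc8_bytes);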



---- Additional Comments From Eric.Begin@inLibro.com 2008-07-11 11:28:49 ----

I was able to reproduce this bug against the Library of Congress server on http://koha.liblime.com, with the encoding set to MARC8.

I tried with the previously mentioned example: ISBN: 8423968146

This seems to be related to the authority handling, since we do not get this error when the BiblioAddsAuthorities syspref is OFF.

I feel that under those circumstances we should re-prioritize this bug for 3.0.




---- Additional Comments From mjr@ttllp.co.uk 2008-07-14 02:00:21 ----


Confirmed on my test server running 3.00.00.094.  Full backtrace:

Wide character in null operation at /home/mjr/perl/share/perl/5.8.8/MARC/Charset/Table.pm line 96.
 at /home/mjr/perl/share/perl/5.8.8/MARC/Charset/Table.pm line 96
	MARC::Charset::Table::get_code('MARC::Charset::Table=HASH(0x8beec2c)', 'B:̃') called at /home/mjr/perl/share/perl/5.8.8/MARC/Charset/Table.pm line 116
	MARC::Charset::Table::lookup_by_marc8('MARC::Charset::Table=HASH(0x8beec2c)', 's', '̃') called at /home/mjr/perl/share/perl/5.8.8/MARC/Charset.pm line 184
	MARC::Charset::marc8_to_utf8('Diccionario de la lengua española (Real Academia Española)') called at /home/mjr/perl/share/perl/5.8.8/MARC/File/XML.pm line 360
	MARC::File::XML::record('MARC::Record=HASH(0x9cc57e8)', 'MARC21', 1, 'UTF-8') called at /home/mjr/perl/share/perl/5.8.8/MARC/File/XML.pm line 485
	MARC::File::XML::encode('MARC::Record=HASH(0x9cc57e8)', 'MARC21', 1) called at /home/mjr/perl/share/perl/5.8.8/MARC/File/XML.pm line 146
	MARC::Record::as_xml_record('MARC::Record=HASH(0x9cc57e8)', 'MARC21') called at /home/mjr/public_html/koha.git/C4/AuthoritiesMarc.pm line 566
	C4::AuthoritiesMarc::AddAuthority('MARC::Record=HASH(0x9cc57e8)', '', 'UNIF_TITLE') called at /home/mjr/public_html/koha.git/cataloguing/addbiblio.pl line 742
	main::BiblioAddAuthorities('MARC::Record=HASH(0x958d3f8)', '') called at /home/mjr/public_html/koha.git/cataloguing/addbiblio.pl line 850
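
For what it's worth, the backtrace suggests marc8_to_utf8() is being handed a heading string that already contains decoded UTF-8 (wide) characters rather than raw MARC-8 bytes. A rough sketch of that mismatch (not Koha code; the exact warning text may vary with the installed MARC::Charset):

    use strict;
    use warnings;
    use Encode qw(decode);
    use MARC::Charset qw(marc8_to_utf8);

    # A heading that is already decoded UTF-8, i.e. contains wide characters,
    # as appears to be happening inside AddAuthority() above
    my $already_decoded = decode('UTF-8', "Diccionario de la lengua espa\xc3\xb1ola");

    # Running the MARC-8 lookup over wide characters appears to be what
    # provokes "Wide character in null operation" in MARC::Charset::Table
    my $converted = marc8_to_utf8($already_decoded);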





---- Additional Comments From jmf@liblime.com 2008-07-14 07:37:34 ----

OK, I've updated the Summary and agree we should investigate further pre-3.0-stable.



---- Additional Comments From Andrew.moore@liblime.com 2008-07-30 13:36:25 ----

Reopening: apparently accidentally closed.




---- Additional Comments From chris.nighswonger@liblime.com 2008-07-31 15:16:23 ----

This same error occurs at apparently random times when hand-building a MARC record from scratch.



---- Additional Comments From gmcharlt@gmail.com 2008-08-05 07:55:25 ----

*** http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=2468 has been marked as a duplicate of this bug. ***



---- Additional Comments From gmcharlt@gmail.com 2008-08-05 19:16:59 ----

Patch submitted to correct this bug.



---- Additional Comments From chris.nighswonger@liblime.com 2008-08-07 11:51:27 ----

This appears to fix the problem: our catalogers were able to import all previously error-causing records.

A note that might be worth adding to the Koha MARC editor screen help: users should take care when entering MARC fields directly, and especially when doing any cut-and-paste, to ensure that the characters are truly UTF-8, or wide-character errors will occur.
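
As a rough illustration of that check (a sketch only, not Koha code; the eval-around-decode idiom is a standard Encode pattern):

    use strict;
    use warnings;
    use Encode ();

    # Returns true if the byte string is well-formed UTF-8.
    # decode() with FB_CROAK modifies its input, so work on a copy.
    sub is_valid_utf8 {
        my ($bytes) = @_;
        my $copy = $bytes;
        return eval { Encode::decode('UTF-8', $copy, Encode::FB_CROAK); 1 } ? 1 : 0;
    }

    # "Espa\xf1ola" is Latin-1 bytes, not UTF-8, so this warns
    my $pasted = "Espa\xf1ola";
    warn "pasted text is not valid UTF-8\n" unless is_valid_utf8($pasted);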

Thanks Galen!



--- Bug imported by chris@bigballofwax.co.nz 2010-05-21 00:49 UTC  ---

This bug was previously known as bug 2317 at http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=2317
Imported an attachment (id=613)

Actual time not defined. Setting to 0.0
CC member ccslibrary@gmail.com does not have an account here
CC member Eric.Begin@inLibro.com does not have an account here
CC member mjr@ttllp.co.uk does not have an account here
The original submitter of attachment 613 is unknown.
   Reassigning to the person who moved it here: chris@bigballofwax.co.nz.