| Summary: | (very) large biblio/item handling | | |
|---|---|---|---|
| Product: | Koha | Reporter: | Paul Poulain <paul.poulain> |
| Component: | Cataloging | Assignee: | Galen Charlton <gmcharlt> |
| Status: | CLOSED FIXED | QA Contact: | Bugs List <koha-bugs> |
| Severity: | critical | | |
| Priority: | P3 | CC: | cnighswonger, koha.sekjal |
| Version: | Main | | |
| Hardware: | PC | | |
| OS: | All | | |
| Change sponsored?: | --- | Patch complexity: | --- |
| Documentation contact: | | Documentation submission: | |
| Text to go in the release notes: | | Version(s) released in: | |
| Circulation function: | | | |
| Bug Depends on: | 5579, 6789 | | |
| Bug Blocks: | | | |
Description
Chris Cormack
2010-05-21 00:51:21 UTC
Is this resolved by removing items from the XML in 3.4?

(17:45:15) kf: I was looking at bug 2453
(17:45:16) huginn: Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=2453 critical, P3, ---, gmcharlt, NEW , (very) large biblio/item handling
(17:45:35) kf: can this be closed?
(17:48:30) sekjal: I think so... if you use -x on the rebuild_zebra.pl script, it should work fine
(17:49:06) gmcharlt: yes
(17:50:40) gmcharlt: there is one more potential improvement that could be made, namely tweaking how the marcxml blobs are handled by C4::Search so that we grab the current item information from mysql rather than zebra when rendering search results
(17:50:51) gmcharlt: but that can certainly be the topic of a separate bug

I was wrong; this is still an issue. If a title has many, many items (enough to break ISO formatting), it can still be passed to Zebra, but any information beyond the character limit seems to be "lost". Since the indexing key is in the 999$c and the items are in 952, and the items are inserted in tag order, a large number of items can push the key off the edge. I'm seeing this produce search results where the link to the details page contains a blank biblionumber. The solution would seem to be to append the 952s instead.

An additional wrinkle comes if you run without --no-sanitize: the 999$c and $d are recalculated right before handoff to Zebra and are inserted in tag order (so at the very end, after the items). That step would need to be moved BEFORE the embedding of items, which would mean doing it in the GetMarcBiblio() subroutine. The embedding of items could also be done there, instead of in rebuild_zebra.pl, since we can pass a flag to GetMarcBiblio() to do that. (A rough sketch of this ordering is appended at the end of this report.)

Further research has shown that my issue is not with ISO 2709 but with the implicit record size limit of the Net::Z3950::ZOOM connection. The Perl module does not make use of any connection options, one of which is maximumRecordSize; since it is not specified at connection time, the default of 1 MB is used. This is almost always enough, but in the case of a VERY prolific serial, not always. For example, the record I was testing had 1673 items and came out to 1.2 MB of XML once the items were added in. While fixing the use of Net::Z3950::ZOOM would be ideal, it's probably easier to alter Koha in the way described in my previous comment. (A connection-option sketch is also appended below.)

UNIMARC users often use 009 for storing the biblionumber and should not face this problem (which does not mean it must not be addressed, of course).

Pushing to master.
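The GetMarcBiblio() change suggested above boils down to an ordering question: recalculate the 999 key before the items are embedded, and append the 952s rather than inserting them in tag order, so a truncated record still carries its biblionumber. A minimal sketch of that ordering with MARC::Record; the helper name and subfield values are illustrative, not the actual C4::Biblio code:

```perl
use strict;
use warnings;
use MARC::Record;
use MARC::Field;

# Hypothetical helper (not real Koha code): keep the 999 indexing key
# *before* the possibly huge block of embedded 952 item fields.
sub embed_items_after_key {
    my ( $record, $biblionumber, @item_fields ) = @_;

    # Recalculate the key first: drop any stale 999 and append a fresh one,
    # instead of letting a later sanitize step insert it in tag order
    # (which would land it at the very end, after all the 952s).
    my @old_keys = $record->field('999');
    $record->delete_fields(@old_keys) if @old_keys;
    $record->append_fields(
        MARC::Field->new( '999', ' ', ' ',
            c => $biblionumber,
            d => $biblionumber,
        )
    );

    # Now append the item fields; they end up after the key, so even if the
    # record is cut off at a size limit the biblionumber is still indexed.
    $record->append_fields(@item_fields) if @item_fields;

    return $record;
}
```

Whether this lives behind a flag in GetMarcBiblio() or stays in rebuild_zebra.pl is the design question raised above; the field ordering is the part that matters for the truncation symptom.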
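On the ZOOM side, the 1 MB ceiling mentioned above is the default maximumRecordSize, which can be overridden when the connection is created. A minimal sketch with the ZOOM Perl API shipped in Net::Z3950::ZOOM; the host, port, 2 MB figure, and PQF query are illustrative values rather than Koha's actual configuration, and whether the running Zebra honours the larger negotiated size would still need to be verified:

```perl
use strict;
use warnings;
use ZOOM;

# Illustrative values only: the host/port, the 2 MB ceiling and the query
# are placeholders, not taken from an actual koha-conf.xml.
my $conn = ZOOM::Connection->new(
    'localhost', 9998,
    preferredRecordSyntax => 'xml',
    maximumRecordSize     => 2 * 1024 * 1024,   # default is 1 MB if unset
);

# Fetch one record; with a larger negotiated size a 1.2 MB MARCXML blob
# (e.g. a serial with ~1700 embedded 952s) should come back intact.
my $rs  = $conn->search_pqf('@attr 1=12 1234');   # 1=12 is Local-number; the id is made up
my $rec = $rs->record(0);
print $rec->render(), "\n" if defined $rec;

$rs->destroy();
$conn->destroy();
```

Passing the option at construction time matters because maximumRecordSize is negotiated during the Z39.50 init exchange; setting it on an already-initialised connection may have no effect.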